Drill to Detail - Drill to Detail Ep.72 'Conversion Rate Optimization and Other CRAP' with Special Guest Bhav Patel
Episode Date: August 5, 2019. In this final Drill to Detail episode before we take a break for the summer, Mark Rittman is joined by Bhav Patel, founder of the London Conversion Rate Optimization and Product Analytics meetup, to talk about Conversion Rate Optimization, Experimentation and A/B Testing, Customer vs. Product Analytics, Attribution and Personalization ... and the story behind the CRAP meetups.
CRAP Talks: CRO, Analytics and Product London
CRAP Talks on Slack
Analytics, BigQuery, Looker and How I Became an Internet Meme for 48 Hours (CRAP Talks presentation)
McDonald’s is acquiring Dynamic Yield to create a more customized drive-thru (TechCrunch)
Transcript
So welcome to Drill to Detail, and I'm your host, Mark Rittman.
I'm joined today by Bhav Patel, who I've known from the London analytics scene for a number of years now,
and he's currently head of conversion at Teletext Holidays. So Bhav, welcome to the show,
and it's great to have you on here. Thank you, Mark. It's nice to be here.
I was a bit nervous about doing this, but I'm excited, actually.
Yeah, it'll be good. It'll be good. It's great to have you here. And so, Bhav, I've known you
for a while now. I knew you from, I think, when you worked at Ladbrokes at one point,
and I knew you when I worked at Qubit.
But just tell us a little bit about, I suppose,
your route into what you're doing now
and what you've been doing the last few years, really.
Yeah.
So, as you mentioned,
I'm currently the head of conversion for Teletext Holidays.
I usually get two responses when I say I work for Teletext Holidays: "Who are they?", which is common if you're under the age of 30, or "They're still around?" if you're over the age of 30. So they're a travel company. And as for my route into this role, I didn't traditionally start off in analytics and optimization. After graduation, I actually fell into digital marketing, and a few years agency side were then followed by my first in-house role, which I realized was probably the right path for me, working for a company called Photobox. There I was heading up their search and acquisition area, and that was an interesting role for me because it was the first time I got to see beyond the click. Traditionally, when you work agency side, you're very much focused on the clicks, the PPC, the ad copy, but then you lose sight of what happens to the customer once they make it onto the website. You don't have access to your clients' GA accounts or anything. Moving client side, I realized that the level of information that's available is just astounding. I just never saw myself going back to agency side after that.

So at Photobox I was doing acquisition, and that incorporated a lot of the traditional marketing, but there was also a high volume of experimentation that we were doing: landing page testing, user journey testing, price testing, new customer versus returning customer testing, followed by all the analysis that went with that. And actually, I fell in love with that. And by sheer chance, I got promoted up into the head of analytics role over the head of acquisition role, which I was actually turned down for. Probably the best decision my boss ever made.

Then I did that for a few years. I absolutely fell in love with A/B testing, experimentation and analytics. I then moved to Ladbrokes for a year, which was probably my favourite job ever, just because the gaming industry is so rich with data. But unfortunately I had to leave that, because they merged with Gala Coral and moved the department out to Gibraltar.
Then I moved to News UK for a short period, and I found that the company probably wasn't the right fit for me, or maybe I wasn't the right fit for them. So I got offered a role working in house at Moo, who do business cards. It sounds really dull when I say it like that, but actually they're a very quirky, design-first company. I just loved everything about it.
We use them, actually. In fact, we started using them after you held that meetup at the Moo offices; it got my interest, and we get all our cards from there now. They're very good.
Oh, I'd love to know the ROI on CRAP Talks and how much money they generated off the back of that CRAP event. I should look into that.
And then I did that for a while and I got offered a position a bit closer to home.
So my wife and I had our second child and the ability to work a bit closer to home meant that I could spend more time with the kids.
So I took the role.
It was a tough decision leaving Moo. It was such a fantastic culture, such a great company to work for. But I figured I'll never get this time back. So I made the leap and went somewhere a bit closer
to home. So I'm now working for Teletext Holidays. And that brings me to where I am now.
Yeah, interesting. It's a very interesting story, and I think it illustrates, I suppose, a different way in which you and I came into this industry. So you worked in marketing first of all, and you've talked about digital marketing, which made incredible use of analytics and data but used very different terminology and, in a way, different methods of analysis. You made a lot of use of things like stats, and you mentioned A/B testing as well. It fascinated me how data driven and how analytical that world is, and how it essentially used similar tools to what I was used to using, but in a completely different way, with different terminology and so on. So you moved from digital marketing into analytics. You say you fell in love with analytics; what was your motivation around that?

I think it's the richness of data that was available.
When you work in digital marketing, you have access to ultimately one data source, maybe two or three if you start branching out to SEO and display advertising. But ultimately for me, my primary data source was AdWords, which is now called Google Ads. And it was essentially the click data: the average position of the keywords you're bidding on, how much they cost you, and the revenue, potentially, if you're lucky. And if the AdWords account was connected to the Google Analytics account, you had access to the number of orders that came through and the revenue. So you're dealing in a very siloed industry, but actually a lot of what happens after the click will determine the impact on the click. You could have the best keyword and the best ad copy driving people to the website, but if the website experience is terrible, it's going to make the performance of your ad keywords look bad. So actually it was what happened afterwards that I completely loved: when a customer went onto the website, how they interacted with the pages they landed on, the funnels that they were going through. Being able to visualize that and see where the drop-offs happened, where the opportunities to improve were, it was that insight. And the ability to influence that, which was great as well.

So one of the challenges I found is that I wear two hats: my analytics hat and my experimentation and optimization hat. If for a moment I take the optimization one off and just look at the analytics one, you get to see what's happening, where people are dropping off and where they're going, but you don't have much control in being able to influence where they go next, or prevent them from going somewhere else. And actually it was that second hat that I was able to put on, which was part product manager. It wasn't quite a product manager role, because you're not touching an end-to-end experience, but you're making changes on the website to try and influence behaviors, and it was that ability to influence what happened on the website which I absolutely loved. So I guess it was a double-edged sword: the data that came with moving in-house, as well as the ability to influence how customers are interacting with the website.
Okay.
So there's a ton of stuff in there, and I'd like to go into some of these concepts as we go along.
But imagine somebody who isn't so familiar with in-house digital marketing and all that: tell us about your role at Teletext Holidays. You're head of conversion there. What does that mean? What are you responsible for? And what do you do when you get in on a Monday morning, for example?
Yeah, sure thing.
So I'll start off with the title itself,
head of conversion.
We call it conversion or optimization in the industry, and I'm not really a big fan of that term, because it implies that you're always optimizing towards the end goal, which is revenue or orders. Actually, I prefer the term experimentation, because what you're doing is experimenting on different aspects of the website. If I could change my job title to head of experiments, I probably would, but I might have to have a word with my boss about that first.
But essentially what I do is on a Monday morning, I come in.
First thing I do is I check the performance of the previous week in terms of traffic,
the overall conversion rate, so how many people called through to our call center to make a booking.
Were there any days where we were down, or up? Any issues I could flag or highlight? Actually, this isn't just a Monday morning thing, I do this most days.

Yeah, fair enough.

And then once I've got an idea of the time period that we're in, if things look okay, I then hop into our A/B testing tool and check the state of the experiments that are running. I know there's a bit of a joke in the CRO industry about peeking at your tests: you're not really supposed to peek at the results of your experiments, because you might introduce some bias or draw conclusions too early. I would say that the person who's running the experiments, provided they have an unbiased view on the world, it's okay for them to check. It's really the stakeholders, the people whose area and whose numbers it's potentially going to impact, who shouldn't be allowed to peek at the results, because they're more likely to jump to rash conclusions. So essentially, I check in on each one of my experiments and just make sure that they're running okay, essentially just make sure nothing's gone wrong.
Then it's followed by a catch-up with the rest of the team.
So we have a weekly conversion meeting on a Monday,
which involves going through the experiments that we have live,
any experiments that are coming up, any experiments that are in development and what stage they're at, and when we can roughly look to release those experiments into production. So I guess that's fundamentally the role. Obviously you have things like your weekly trading meetings, where you're just analyzing the state of the business, but if I just look at my role in isolation, that's essentially how it is. And then there's ad hoc analysis, trying to find other areas of the site that I can test.

Okay, okay. So, again, for somebody new to this: most people are familiar with the idea of things like Google Analytics.
So going in there and seeing how many page views you've had or seeing how many kind of, I suppose, unique visitors you've had.
But you've been talking now about experiments.
And we mentioned A-B testing and so on.
I mean, again, as a sort of layman's introduction: what are you referring to as experiments? And what are you trying to gain by that? What's the purpose of it, really?

Yeah, so it's a good question. If I was to put it in a nutshell: everyone's aware of clinical trials, to a certain extent. You have the pharmaceutical industry, who will have a set of patients with identical or similar features,
you know, be it age, gender, nationality, all those things. And they'll divide those people
into two groups, and usually this is done through randomization: by randomizing the audience that you have, you're less likely to introduce any biases. So you take your audience and you give one group an experimental drug
and you give the other one usually a placebo. And then you measure the impact of the experimental
drug against the placebo. And you see if there's been an improvement in conditions, in health,
whatever the criteria is that you're looking at. In a similar way, we do the same thing with
websites.
We get customers, visitors, users, whatever you want to call them.
They come to the website, and usually through some type of A-B testing tool,
we randomize the traffic.
Now, if a customer fits the criteria of the experiment that's going on,
it could be that the person is on a mobile device.
They're on a particular browser. They've come at a particular time of day, or they're looking at a particular set of products.
Once those criteria are met, the customers will essentially be split into two groups or three groups.
But let's say, for the sake of argument, two groups.
One group will be shown a control version of the website.
So let's say we're talking about a product page.
That's a control group.
They'll see the page as it is, as it was a week ago.
And then you've got the second group,
which is the experimental group or the variation.
And they will see a modification of that page.
That could be anything from an image.
It could be a different price.
It could be a different layout.
It could be a whole bunch of things.
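To make that control-versus-variation split concrete, here is a minimal sketch, in Python, of one common way an A/B testing tool might deterministically bucket eligible visitors into two groups. The visitor ID, experiment name and 50/50 split are invented for illustration, and real tools layer targeting, exclusions and reporting on top of something like this.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically return 'control' or 'variation' for this visitor."""
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF   # map the hash to a value in [0, 1]
    return "variation" if bucket < split else "control"

# Hypothetical experiment name; hashing keeps assignment stable across visits.
print(assign_variant("visitor-123", "sort-hotels-price-low-to-high"))
```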
And you then measure the impact of the change you've made against that control group to see if there's an uplift in the metric you're trying to measure.
So the metric could be a conversion.
It could be average order value.
It could be the ability to add on additional products into the cart, so accessories.
It could even just be making it through into the next page. And then once you
run that for a certain amount of time, and this is where the statistics comes in. You run it for a certain amount of time, you analyze the data, and you try to see whether the differences you're seeing are real. On the surface you won't get an exact number, you'll see slight variations, but the role of the statistics is to determine whether that variation is significant or whether it's random. So ultimately that's it; I hope that makes sense.
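As a minimal sketch of the kind of statistics being described, one simple option (among several) is a two-proportion z-test comparing the conversion rate of the variation against the control; the visitor and conversion counts below are invented purely to show the mechanics.

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the absolute uplift of B over A and a two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                 # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))   # standard error under H0
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                         # two-sided tail probability
    return p_b - p_a, p_value

uplift, p = two_proportion_z_test(conv_a=480, n_a=10_000, conv_b=540, n_b=10_000)
print(f"uplift: {uplift:+.2%}, p-value: {p:.3f}")            # "significant" at 95% only if p < 0.05
```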
It does in my head, yeah. So something that struck me, because at one point I was responsible for some products in this area, was the degree to which stats are important in this, and you mentioned stats there. So how much do you have to be a trained statistician to do this? How important is it to understand stats and probability and so on in your role?

I think anyone that claims to be working in the field of experimentation and optimization should have at least, you know, a high school or A-level level of statistical understanding. I think without that you're likely to draw incorrect conclusions; you might look at the data and see an outcome that doesn't truly exist. And this is where the statistics is important. Statistics isn't actually telling you if something is guaranteed. It's just trying to say to what degree of accuracy the difference you're seeing in the data exists.
So because I'm quite connected to the CRO industry, I see a lot of blog posts, I read a lot of stuff and I go to quite a few talks, and I've realized that there are two types of experimenters, or optimizers if you want to call them that, out there. There are the people who can come up with the test ideas; they're great at finding things to test. And then there are the people who can analyze that and draw statistical conclusions. And I rarely meet someone who is in the middle. I kind of strive to be in that middle; I would probably say I'm slightly more towards the statistics side, and probably closer to the middle than I will let myself believe, but I think it usually swings one way or the other. You've got the data scientists, who are on that right-hand side of the spectrum, and they're very analytical.
You know, they can look at frequentist methods.
They'll look at Bayesian statistics.
You know, they'll do all the fancy data science stuff.
And then you've got the guys on the other side
who are great at seeing the problems
that users are facing
and coming up with ideas to test on it.
So I think that middle ground is where everyone should be aiming to be. The data scientists, analysts and statisticians should be trying to move a bit more towards the left, towards the middle, and the guys on the left-hand side, the product managers, the marketing managers, the user researchers and the designers, should be moving a bit more to the right, towards that middle.
So one of the challenges I had was, we used to run A/B tests and we'd get a set of results, we'd get an uplift or whatever, and then we'd try and understand why something happened. It might be that we got less of an uplift than we expected. How easy is that, and what process do you use to try and work out why it happened? And in general, how do you make sense of the results, work out segments from them and so on? What are your thoughts on that?
So I think you can go about doing this in one of two ways. You can do some type of user research: say you've run a test and the hypothesis for that test was sound, but the results and the outcome didn't quite match what you were expecting to happen. I think the best thing to do is go and look at it qualitatively, go out and speak to your customers, do the whole user research side of things. Alternatively, you can do what I do, because we don't have access to a user research team, which is try to anecdotally understand what happened.

So I'll give you an example. We recently ran a test which by all accounts should have performed well. The data suggested that it should have performed well, the heat mapping software that we were using suggested that it should have worked well, but it did not perform well at all. I actually had to call it early because it was doing so badly. Normally you shouldn't call tests early, but I'm of the mind that you can call these types of tests; we're not an Amazon, we don't have unlimited traffic, so you make a best judgment call. The test itself was sorting our hotels by price from low to high, and even though the data suggested customers were doing that, when I ran the test and made the default sort option low to high, I found that it didn't perform well. I won't give any numbers away, but the reason it didn't perform well, having actually spoken to our physical product manager, the guy who goes out and makes deals with the hotels and all the holiday sellers, was that the hotels that were priced really low just didn't have good content. So by default we were showing a whole bunch of hotels which lacked decent imagery and decent content. And because the sort option is kind of tucked away on our search results page, well, I can't validate it, but that was the best conclusion I could come to as to why this test failed when all accounts showed that it should have performed well. Even though customers were interacting with the feature and sorting for themselves, showing them that sort order by default didn't work, because we just lacked the content to be able to produce a good search results page.

Okay. So we've been talking about the idea of running a single A/B test, where there are two options and it runs for a period of time and so on. But one thing I picked up on, and we'll get into the CRAP Talks you organize later on, is that the industry in general now typically talks about maybe running multivariate tests, or doing more than just trying one thing at a time. What does state of the art look like? What are the current good examples you're seeing of testing and experimentation being used to drive growth in businesses?
I don't know how many people do multivariate testing. I guess for most companies that I've worked with, and a lot of the people I speak to, we work for companies which don't have access to unlimited traffic, which means that we have to be, I don't want to say selective, because we can test a whole bunch of things, but if we start running multivariate tests, it takes a lot longer to validate, a lot longer to come to any type of statistical conclusion and confidence, just because you're testing multiple variations against each other: A against B, A against C, B against C. So it takes a lot longer to validate those types of multivariate tests. So actually, a lot of the people I know are currently just doing A/B testing. Multivariate testing, as far as I know, is
probably reserved for the big boys, for lack of a better word.

Right. So I knew you from when I worked at Qubit, and Qubit obviously were big on the idea of personalization. An argument there would be that A/B testing has, I suppose, a limited amount of runway it can go down, and the true answer is to move towards personalization. Putting Qubit aside and looking in general, how far can testing go? Is it something where the returns on it are limited, or is it a stage in the evolution towards things like personalization? What are your thoughts on that?
I'm actually going to quote someone who probably gave the best example of personalization I could give. Imagine you go to a restaurant, and the waiter takes you to your table and you sit down, and you've already got your knife, your plate and your glass selected for you. The only thing they might ask you to choose is the food, and once you've chosen the food they'll make a recommendation on the wine. But they haven't actually tried to personalize the basic experience for you: which fork do you want, which spoon do you want, what type of glass do you want? All of that's been pre-chosen. I think there's a myth that personalization is the ultimate goal for every company, and I don't think it's right for every company. I think it works if you're a company which has a high volume of return traffic, or you're a company which records data and creates content on behalf of customers. For example, I used to work at Photobox, and Photobox is a great example where you can make recommendations and really personalize the experience. Ladbrokes was another example: when I was working at Ladbrokes we built a proof-of-concept recommendation engine, because the frequency with which customers were returning was so high that we had so much data on one person that it made sense to try and predict what they might be interested in betting on next. Whereas if I look at when I was at Moo, our return rates were quite low. So even though there's an element of personalization in there, like at Photobox, the frequency with which customers come back to the website is just so low, and then on top of that there's the depth of products. Whilst we had multiple different types of products within the range, for example you can have a square business card, a thick business card, a glossy business card, a matte business card, it's still just a business card. So personalization there doesn't make sense. Maybe once you've run out of everything else to do, I imagine personalization would probably be the next thing; that's what I would have done. Personalization would probably be right at the end of my roadmap. There were so many more things that we were testing, from pricing to imagery to experience, and recommendations in terms of the finish on your card. So once you've already built your product, you might say, hey, have you tried the rounded corners or the square corners on that finish? I think that made a lot more sense. But someone's buying business cards and you're trying to personalize a homepage for them that shows them notebooks and postcards; it just didn't make sense. Much like travel, actually. We've kicked off some basic personalization, which is just your last viewed hotels and last viewed searches, but it's probably not something we'll focus on right away, down to the nth degree. So I think personalization comes down to a one-to-many, a one-to-few and then a one-to-one approach, and I think for most companies a one-to-few approach is okay, where you analyze a whole bunch of customers and show them roughly the same thing, as opposed to a true one-to-one.

So what did you think when you heard
that Dynamic Yield had been acquired by McDonald's? That was interesting, wasn't it, because Dynamic Yield were a big player in that personalization market, and yet they were bought by McDonald's, who presumably are making quite a big play in that area. Was that something you noticed at the time?

Yeah, actually, I was surprised when I saw that, but I haven't heard anything since. I don't know, maybe I've missed the memo or something.
But I imagine, and I don't know, whether McDonald's purchased Dynamic Yield for the purposes of integrating the McDonald's side of the business with personalization. I'm sure they'll use it to a certain extent, especially when McDonald's are doing their Monopoly promotion, their seasonality stuff, as opposed to their day-to-day BAU stuff; I imagine they'll probably use it for something like that. But it just makes sense to buy a company that does personalization, because I'm sure it's profitable. I mean, Google just bought, oh god, who did Google just buy? Looker, that's it, they just bought Looker. And then there was another big acquisition: Salesforce just acquired Tableau, that's right. And then I think Contentsquare just merged with another company in the usability space. So you see this stuff happening all over the place, and I guess it's just good business sense as opposed to real integration. I don't know, I'm probably the worst person to ask about why McDonald's purchased a personalization company. I can't see the sole purpose of that purchase being to personalize the McDonald's experience, unless they suddenly start changing their food and getting
really, really healthy.

So the next thing I wanted to ask you about: you mentioned that the thing that really interested you in this area was working out what happened after somebody arrived at the site, after the click, as you put it. To me that says trying to work out what somebody wanted to do, what they actually did after they landed on the site, how they went through it, how they interacted with stuff. How did you go about doing that? What are some of the challenges, or the ways you do that, and how does that work, really?
Yes, I think you're probably now moving more towards the topic of product analytics. I find that product analytics isn't as established a field as customer analytics is, or the finance side of the business. Take a really basic example, the finances of the business: you know exactly how much money you generated, how much of that was profit, how much of it was cost; you do your EBITDA calculations and you can project your one-year, two-year, three-year costs. It's fairly straightforward; there's usually very little grey area. Customer analytics is, I guess, sort of similar. You know which customers purchased and which ones didn't, you know what they purchased and when the last time they purchased was, and you can calculate something like customer lifetime value. These types of metrics exist; there are pre-built functions in things like R and Python to be able to do these types of calculations. So I feel like customer analytics and the finance side of things are mature areas of data analysis.

Now, product analytics for me is still relatively new, even though loads of people have been doing Google Analytics for quite some time; it's been around forever. But with the rise of the product analyst, I think the product analyst needs to be very different to a traditional customer analyst or a financial analyst. The reason being, the product analyst needs to understand the business that they're working in. They need to understand the customer, they need to understand the experience on the website, a whole bunch of things. They need to understand how we track, how we collect data, how events are fired. Whereas from a customer analytics perspective, you have your name, your gender, your date of birth, your address, your purchase history, and you can probably live within that ring-fenced amount of data. Whereas for the product analyst, depending on the company you work for, that could be wildly different. You could be tracking everything from the number of bets placed, if you're a gaming website, to the number of calls made to a call center to book a holiday, which is kind of where I am right now. The events and the experience you have on the site are captured very differently, and you have to try and quantify that experience. You get customer research tools out there which do heat maps and all sorts of things, but when you're looking at it from a pure quantitative perspective, and you're trying to understand why X number of people made it to a particular section of the site and only Y percent made it to another section, then you want to understand why the rates aren't the same. It could be pricing, it could be the imagery you show, it could be the experience; something's not working. It could just be that the data isn't being collected properly and you've potentially not tagged something correctly, so you're unable to analyze it. So I feel like product analytics is still very much in its infancy. I don't think there is a bible on how to do product analytics, because it's so unique from company to company, and you can have the skill sets, but it takes time to understand the business, how they collect data and what to look for.
how they collect data and what to look for yeah i mean so so when the light bulb went off in my
head actually was was uh i interviewed uh yali sassoon from uh snowplow um on on here a while
ago and um and and he made he made a comment which i think was it was illuminated it for me and he
said you know the thing about think about doing analytics on things like transactions and
revenue and customers and so on is you're generally averaging, you're kind of looking
at trends, you're looking at fairly, you know, it's a fairly straightforward calculation
you're doing.
Whereas when you're doing product analytics, you're trying to understand what was somebody
doing?
What were they intending to do?
And did they manage to do it?
You know, and you're looking at a different type of analysis.
You mentioned event-level analysis there, but a lot of it is about, you know, cohorting and understanding intentions and segmenting and so on. Is that something that you see as well, and are these techniques that you use or have seen used?

Yeah, and I think this comes down to how creative you are as an analyst. I wrote a blog post a while ago about how an analyst, or rather someone who does conversion optimization, can't just have a process; they need to have a mindset, in terms of the creativity that flows.
And this comes down to, you know, if you can't put yourselves in the shoes of a customer, you're never going to be able to analyze what they're doing.
And you may not get it 100% right, but if you're analyzing a large enough data set,
you can maybe draw some conclusions or at least point your user research team in the
right area to be able to say, hey, can you go look into this part of the website: customers arrived at it and they should have been doing this, but they're not.
So they weren't successfully able to do it.
Why do we think that is?
And sometimes you don't even need to go into a whole piece of user research. You can pick it out quite quickly just by adding a heat map on top of it, or you can have someone in your company, with a completely unbiased view, just go and play with that part of the funnel and ask them to feed back. So there are ways to do it, but I think it comes down to this: a product analyst needs to be creative. I don't think product analysts and customer analysts are cut from the same cloth.

Interesting, interesting. So a topic that I keep hearing when I go and see my customers now is that the number one challenge for them is around acquisition: looking at their acquisition funnel and trying to understand where the marketing money is being spent and which of the channels is most effective.
And a lot of this comes down to attribution, really.
And you mentioned earlier on about, you know, you get data from, say, Google Ads, you know, trying to link that maybe to kind of individual IDs and so on.
I mean, maybe just explain again, what is attribution and what is the problem space this is talking about, really, do you think?
So attribution is something we're trying to solve. When you come to a website, there's a strong possibility that you took a whole bunch of steps before you got there, or rather, before you made a purchase or booked a holiday or placed a bet, whatever it is your company does. There's probably a whole bunch of channels that you as a customer have touched. It could be that you've seen a Facebook ad, or you've seen a display banner somewhere, or maybe you did a generic PPC search, so you've just searched for something like "holiday to France", whatever it might be. And then you've done a few things, and finally you come back via a brand search term, say, or you go directly into your browser and type in the URL of the company you're looking for, and you make that eventual transaction. Now, on a classical last-click model, you would attribute all of the weight of that transaction to that last click, but actually that ignores a whole bunch of things that happened before it. So what attribution does is try to give credit to each one of those channels. And you can have different attribution models: first click, last click, evenly weighted, or a sort of decaying model, where your first click gets the least and your last click gets the most, with a kind of curve somewhere in between.
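Here is a minimal sketch of those attribution models, last click, first click, evenly weighted (linear) and a simple time-decay curve, applied to an invented four-channel journey. Real attribution tools are far more sophisticated; this only shows how the credit-splitting differs between models.

```python
def attribute(journey, model="last_click", decay=0.5):
    """Split one conversion's credit across the channels in a journey."""
    n = len(journey)
    if model == "last_click":
        weights = [0.0] * (n - 1) + [1.0]
    elif model == "first_click":
        weights = [1.0] + [0.0] * (n - 1)
    elif model == "linear":                      # evenly weighted
        weights = [1.0 / n] * n
    elif model == "time_decay":                  # later touches earn more credit
        raw = [decay ** (n - 1 - i) for i in range(n)]
        weights = [w / sum(raw) for w in raw]
    else:
        raise ValueError(f"unknown model: {model}")
    return dict(zip(journey, weights))

# Hypothetical customer journey ending in a brand search.
journey = ["display", "generic_ppc", "email", "brand_search"]
for model in ("last_click", "first_click", "linear", "time_decay"):
    print(model, attribute(journey, model))
```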
So attribution is just trying to solve that problem. But I'm a skeptic; I think a lot of companies are trying to do attribution when they don't need to be.
I think attribution is only really valid for companies which have a
considered purchase that's more than maybe a week, say, that's an arbitrary timeline.
But for example, I think attribution for a holiday company works really well. If you think about the
way you as a customer would book a holiday, you're not going to go to your first website
and book. You're going to shop around. You're going to do some price comparisons. You're going
to look at dates, destinations, temperature. You're going to do a
whole bunch of searches. You're going to talk to your friends. You're going to do a whole bunch
of things. Whereas if you're looking to buy, say, for example, I don't know, a book. I can't think
of anything better. You're probably not going to deliberate on the decision to buy a book too much. If the price is okay, you're probably just going to buy it. You don't need a thousand reminders and emails for that book; you probably either know if you're going to buy it or not. Or you do what I do on Amazon, where I add it to my wish list and pray that Amazon drops the price if it's a bit out of my price range, or you wait for a Black Friday sale, something along those lines. So I think attribution has become a bit of a dirty word, one that a lot of people use just because it's sexy now. There's a whole bunch of those sexy words floating around, like machine learning, AI, all of those things; they don't mean anything, people just throw them around to make themselves sound important. But to me, attribution has a time and a place, just like personalization. I don't think personalization is for everyone, and I don't think attribution is for everyone.
So I suppose one of the themes in the industry over the last couple of years has been around privacy, GDPR, and things like the changes to tracking that Apple are making and so on. In your observations, in your experience, how much is people's awareness of privacy, and rules like GDPR, affecting things? Does it mean you can't do your job? How does it change things, really?

So I think the impact of GDPR only really spikes when there's something in the news.
So I remember when GDPR came out, I was working at Moo.
For the first sort of like one month,
we had a huge spike in customers
who were requesting their data be deleted
or requesting access to their data.
But by month two, that number had, you know, exponentially decayed down to less than half.
And then by month three, it was half again.
I think by month four or five, we maybe got like 10 requests a month from customers.
So I don't think GDPR is a big issue day to day; obviously, if you have a breach, if you've been hacked or something like that, that's a very different story. And I think GDPR did the right thing. It prevented all these black-hat techniques for getting customers to sign up to your website, the double negatives, the "do you agree to not have..." and all that. I absolutely used to hate those; in the industry we call them dark patterns, and I'm sort of glad that GDPR helped weed out some of those trashy techniques. Because essentially those are techniques reserved for the growth hackers, the people who are trying to get some fast numbers in through the door by using unethical means. So I think GDPR is great for the population. I don't think it impacts most people; it just helps keep companies above water and honest. But if I think about my wife, who does not work in digital, if I mention the word GDPR to her, I don't think she'd have a clue what it's all about.

Yes, interesting.
So I suppose the last question on this topic: would you say that experimentation and that kind of thing is something that customers do when they're very mature, or is it something you would do from day one of a product company? Where does it fit in terms of the life cycle of a business, in your view?

I think you can do it right from the beginning. So I'm going
to answer this question um with the caveat that I've only worked at established companies, right? So my answer
on this is going to be slightly biased. But I think all companies can do experimentation and
testing right from the word go. I don't think experimentation should be used to block actions
being taken. So if you come to a point where every
decision you make needs to go through an experiment, I think that's probably wrong, especially in the early stages, where you want to get products out there, you want to get them live, you don't have enough traffic, so you just want to put it out there and be a little bit bold. But when you're dealing with large-scale companies, they definitely should be experimenting and trying different things. And I think the statistics will vary depending on which stage your company is at. So if you decide to do experimentation at an early stage, which you should, you probably shouldn't test everything, but certainly some of the core functionality of your product or your website. You'll have a much smaller sample size, which means you're going to be shooting a bit more in the dark than you would at a company which has loads and loads of traffic, where you can draw statistical significance, you have higher confidence levels, your power values are at something like 80%, and you know that your chances of seeing false positives and false negatives are going to be slightly lower.
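To put rough numbers on that trade-off, here is a minimal sketch of the standard sample-size arithmetic for a two-sided test at roughly 95% significance and 80% power. The baseline conversion rate and target uplift are illustrative assumptions, not figures from the conversation.

```python
from math import sqrt, ceil

def sample_size_per_variant(baseline, rel_uplift, z_alpha=1.96, z_beta=0.84):
    """Visitors needed per group to detect `rel_uplift` over `baseline`
    at ~95% significance (two-sided) and ~80% power."""
    p1 = baseline
    p2 = baseline * (1 + rel_uplift)
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# e.g. a 5% baseline conversion rate and a hoped-for 10% relative uplift
print(sample_size_per_variant(baseline=0.05, rel_uplift=0.10))   # roughly 30,000+ per variant
```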
So if you're at an early stage, you definitely should be testing. You should be testing your pricing, you should be testing your features, you should be testing the flow, but do it with the caveat that you don't wait for that 95% significance. If it doesn't move the needle a lot, then don't wait it out; just make the change and go ahead with it. Certainly with Teletext, we're still not the Amazons and the Netflixes of the world, so we do have finite traffic, but we can wait a few extra weeks to get a bit more, I don't want to use the word certainty, so "certainty" in quotation marks, in the change we're about to make. But I think generally everyone should be doing testing, and the degree of testing you do will depend on where you are.
Okay.
So, again, another reason I wanted to talk to you was that you run this fantastic set of meetups called CRAP Talks.
So tell us about that name and how it links into what you're doing now, really.
So just give us a bit of an overview of what that is and your kind of motivation behind it.
So CRAP Talks came about for two reasons. One, I was extremely bored, because I was transitioning from one job to another and there wasn't much work happening. This was when I was at Ladbrokes: we'd just been sold and we were moving out to Gibraltar, so I didn't really have much work to do. But I was going to a lot of, I don't want to say meetups, actually I was going to a lot of conferences, and I absolutely hated them. Even when I had nothing to do, taking a day or half a day out of the office is a huge constraint on someone's time, and it's a huge amount of their resources that they're giving up to go to your conference. And what I found was that most event organizers didn't respect me as the attendee. They had the utmost respect for the speakers; it was their time they valued. And they forgot that actually, without the audience, there is no event. So I had some time to kill. Also, the events I was going to were very much siloed. Now, if you think about conversion rate optimization, analytics and product development, which is also the acronym, CRAP stands for Conversion Rate, Analytics and Product, there's a nice Venn diagram somewhere where the three of those come together. You build something, you analyze, you test; or you analyze something, you test, you build; or you test something, you analyze, you build. Those areas are so interlinked that I found it really strange that there were only product-related events or analytics-related events, and at the time there were hardly any conversion-related events. I think CRAP was one of the first ones, certainly in London, that was very much conversion-focused. So I decided to put one in the diary and hope for the best.
The first event, it wasn't called CRAP initially; it was called the London Conversion Rate Optimization, Analytics and Product meetup, or something like that, I can't remember exactly what I called it, but it was a real mouthful. It was actually a friend of mine who said, "Bhav, why don't you drop the O in conversion rate optimization and just squish it all together and make the word CRAP?" And I was like, holy shit, that's, sorry, excuse my language, that's genius. So I did, and that's how CRAP was born. So it was actually CRAP 2; the first one was just the London Conversion Rate Optimization, Analytics and Product meetup.
And at the first event we ran, there were, I think, 12 people that came. Qubit loaned us their office space downstairs, which was nice and intimate, but we had so much fun. We talked; it was very conversational, it wasn't speaking at the audience. I mean, it was easier because there were only 10 or 12 people there. But we had so much fun we thought, let's do this again. So I did it again, and the next time around I think something like 20 people turned up. Honestly, the first one was just whoever I could get to speak, but after the first one, the second one, the third one, I started shaping it around the reasons why I initially started CRAP,
which was the unqualified speakers who were talking at these big events or the, you know, the mismatch of audience and speaker
or the very salesy techniques that these conferences were having.
The other thing was that all the conferences I was going to were very salesy. I had to walk past something like 10 or 15 stalls before I even made it into the auditorium to hear the talk, and then you had vendors giving away a whole bunch of merchandise. I think I took away more bags of merchandise than I took away ideas and things I could apply to my day-to-day work, and I realized that was fundamentally the problem. So I started CRAP, and we're about to do our 14th one; it's almost two and a half, almost three years later, and here we are. So it's a community now of almost 2,000 people in London, with a couple of chapters in Manchester, Istanbul, and as of late,
Berlin. Yeah, definitely. I mean, there is
the conference kind of military-industrial
complex, isn't there, in some respect, where
I know this from my Oracle days, there's a whole kind of
world of conferences that
is supported by sponsors, that has
all these stands, as you say, and they have the same
speakers going around, and then you fly from country to country and you do this
and often you might get a vendor sponsorship.
And it's actually quite a nice lifestyle for the speakers.
But, you know, and I'm not saying that these things are done
not considering the attendees,
but I've found the meetup scene that I've encountered,
obviously, now in this world and with Looker and so on, really interesting.
They're much more intimate.
They're much more, you know, smaller space.
The speakers aren't quite so kind of hallowed,
and it's typically the audience half the time standing up and speaking.
I find them really good.
And the fact they're in the evenings means people can go to them.
The fact they're in the evenings means people who go to them
actually have got a real job to do.
You know, often when you run events during the day, by definition,
it's people who haven't got anything to do during the day that come to those events.
And so it can be an interesting kind of audience sometimes as well.
But I've noticed, obviously, that the no-commercial side of it was important for you, and I always think that's quite good because it sets the scene. I typically say no commercial and no recruitment as well, or at least not overtly, because what you don't want is the place full of recruitment consultants. And it works well. I think I spoke at one of your events, and I enjoyed that.

I think it was one of the funniest talks we've ever had.

It was about my kettle, wasn't it? That's what I spoke about: I scraped the Daily Mail comments, and what was funny was that the Daily Mail were actually in the audience. When I left and went to the lifts, there were two journalists from the Daily Mail in the lift with me as well, which was interesting, and slightly awkward, because I had no idea they were there.

Yeah, it was interesting. But your point around the sales stuff is absolutely right.
Especially after work, you know, you've just finished a long day.
And, you know, going to conferences during the day is not, you know, it's usually not because you don't have anything to do.
Maybe for some people, if you're at a certain level in your career, you're probably just sitting around checking emails half the time, but for a lot of people, especially like myself, it's a day out of the office. It's not something I want to do, because I know I'll come back to a mountain of emails, or I'll be behind on something, but I go with the ambition that I'm going to take away something that makes me think to myself, wow, that was a great talk, why didn't I think of that, or I could try that, or maybe we could do something differently. But I found more often than not that that wasn't the case.
And actually, a day out of the office, it was stressful.
It was uncomfortable.
I don't really like networking.
You know, it's just uncomfortable.
So, with CRAP Talks being in the evening, you know, you've just spent a hard day at work.
The last thing you want is to be cornered by someone who's trying to sell you their product or recruit you. And actually, we do have sponsors: a recruiter, Pivotal London; User Biller, who do user research surveys; and a CRO agency called Creative CX. Those guys actually started off as attendees, and they loved the event so much they offered to sponsor. At the time I wasn't really doing any sponsorship, but it was starting to grow, and I was paying for drinks and everything out of my own pocket, so the help was appreciated. But they're really respectful; they actually stay within the boundaries of what I'm trying to do. They've not been allowed to give talks; they support the event, but they don't put up banners, they don't try to sell their services or anything. They understand the values and the mission, and actually those guys are part of the reason I've been able to grow into these different locations.

So tell us about that, then. You say you've now got different chapters in other countries. How's that working, and what's your involvement with those?
I'm not there, which is sad.
I'd love to be in Istanbul every time there's a CRAP event, or in Manchester, or at the Berlin one that we're going to do in September. I don't know if you know, but we have a CRAP Talks Slack community.

Yeah, I'm on it, actually.

Oh, amazing. Okay. You've got a job to do; you're very quiet on there, Mark. I think you're just a lurker, probably just sitting around reading. I'm joking.

Fair enough.

Okay. But yeah, we have the Slack community, which helps me stay in contact with the guys in Manchester, with the guys in Istanbul, and Scott, who's helping me run the one in Berlin.
We just talk in there about what's going on and whether they need any support. I have a whole bunch of rules, I don't want to say onboarding documents, and brand guidelines as well, to make sure that those guys live and breathe the CRAP values. I did the whole thing of treating it like a proper company, so we have a mission statement, a vision statement, some rules. So I manage it remotely. I think this is the beauty of the internet and the digital age: you can start something in one location and expand it out to different parts of the world and the country
So that's actually how I've just grown it.
Fantastic. Well, look, we're recording this in the middle of July, and this episode will go out after the next event, but when is the next one? Just so people are aware of the cadence of it.

Yeah, so the next one is at the end of July, the 31st of July. If you are listening to this podcast, which you most likely are, after the event, you can always just go onto the website, because we do record all of our events now. We actually started doing this after your talk; we had such good speakers that we thought, why aren't we capturing this? So we actually have a video section, if you are listening to this after the event is finished.
Good, excellent. So how do people find out about the CRAP meetups, and, I suppose, find your details on the internet and that sort of thing?

Am I allowed to pitch myself?

You can, yes.

I was trying to respect your rules here. We're not selling anything, and technically CRAP is free, so I'm not selling it. But you can just go to the website, craptalks.com, where you'll find all of the content. We have a YouTube channel, so if you just search CRAP Talks you can find us there. Or you can find me on Twitter with the handle dodonerd; don't ask about the alias, it's something I created years ago and I just haven't changed. So yeah, they're probably the three best channels.

Okay, fantastic. Well, look, it's been great speaking to you, and we appreciate you coming on the show. Interesting insights there into the industry, and I'll try and come along to your next event as well. It's been brilliant; nice to speak to you.

Thanks, Mark, it's been a lot of fun. If anyone has any questions, just feel free to put them my way. Thank you.