Heads In Beds Show - Data Driven vs Gut Feel Marketing: What Actually Works Better?
Episode Date: October 2, 2024

In this episode Conrad and Paul debate when to use 'data-driven' approaches to making marketing investments and when to ignore the data and 'trust your gut'. Enjoy!

⭐️ Links & Show Notes
Paul Manzey
Conrad O'Connell
Conrad's Book: Mastering Vacation Rental Marketing
Conrad's Course: Mastering Vacation Rental Marketing 101

🔗 Connect With BuildUp Bookings
Website
Facebook Page
Instagram
Twitter

🚀 About BuildUp Bookings
BuildUp Bookings is a team of creative problem solvers made to drive you more traffic, direct bookings and results for your accommodations brand. Reach out to us for help on search, social and email marketing for your vacation rental brand.
Transcript
Welcome to the Heads In Beds Show presented by BuildUp Bookings.
We teach you how to get more vacation properties, earn more revenue per property, master marketing,
and increase your occupancy.
Take your vacation rental marketing game to the next level by listening in.
I'm your co-host, Conrad.
And I'm your co-host, Paul.
All right, Paul, how's it going?
Good morning.
Well, we've spent, you know, what feels like the past hour just regaling stories of our collective football victories this weekend. So this is end of summer, we're transitioning here a
little bit coming into fall. And that's always a fun time for our business too. But this is, this is football time. So now everybody gets to deal with the first three and a half
minutes talking about stuff that might not apply to your day-to-day operations.
How are you doing, sir? Absolutely. I'm doing good. The funniest part about this bit is
that people are going to be listening to this three weeks later and we'll see what happens.
So as of this recording, you know, right after the first opening week of the NFL, again,
you're probably listening to this a few weeks later, but the Patriots won and the Vikings won,
which are of course our respective teams. If you're a new listener, you may not know that. And
Paul and I were joking before we hit record that we're going to go ahead and put five
bucks down, the Heads in Bed show official ticket on a Patriots-Vikings Super Bowl, if
that were to happen this year. And of course you're listening to this two weeks later and the
Patriots have probably lost now to the Seahawks and Jets, I'm assuming. Again, if you're listening to this much later, that's already happened at this point.
And you're probably like what is wrong with this guy? They're one and two like they barely
You know, they got lucky to beat them in week one
but um
that's the thing about NFL is that it gives you this false sense of hope and it gives you this sense of
You think it's gonna happen even though you know deep down when your team's not very good that it's not gonna happen
But um any given Sunday is what they say, right?
And in this case, it was like your team was more likely to win.
That was a more possible and likely scenario.
For sure. One would argue this.
The scenario that I found myself in yesterday was very abnormal.
I told my dad, oh, they're going to be terrible.
This is not going to be good.
Like we turn it on and then all of a sudden they start winning.
And I'm like, OK, like maybe I have to recalibrate myself.
So funny situation there.
It's funny.
I was also talking to my dad.
I was saying, well, you know, after week one, we're going to be in the half of the NFL fans that are
dreaming of the Super Bowl. Everybody who lost, they're already dreaming about the draft. So
that's just, you know, after week one, you can do those two things. You win, Super Bowl; you lose,
draft. So we get to be on the Super Bowl side for one more week, I guess.
I get seven more days.
It'll be fun.
Oh, man.
All right.
Well, one thing that you can do when you have a small sample size is make some bad decisions.
So today's topic is data-driven versus gut-feel marketing.
What actually works better?
So it's actually kind of funny that this happened yesterday, although we outlined this some
time ago.
This was the one that we had on our docket today.
I like this one a lot because I think that there's people that I've encountered in my
career who are actually too data-driven. And I tend to skew this way too. I tend to be the
person who's like, let's look at the numbers, let's review some of the analytics data, let's
see what's going on. But I do think that can really mislead you at times. So we're going to
go through some examples today of kind of when to ignore the data, when to collect the data,
or let's say when to embrace the data, kind of these two different schools of thought,
because I think like all things, it's not all one thing, it's not all the other thing, right? And
I've also encountered people in my career who refuse to look at the
numbers and only go off of like, Oh, well, I feel this way, or I
saw this one example. So they sort of extrapolate that out to a
longer time horizon. And that can get you in trouble too. So I
think, like all things, the right balance is, of course,
somewhere in the middle, but maybe you can tell me your
leanings on this. I think when we're putting the outline
together, you said you tend to be a little bit more data driven
as well. Maybe we're similar in that respect, but walk me
through your thought
process at a high level there.
I mean, I always joked with partners, customers, clients,
stuff like that, that I would rather have too much data than
not enough. And I think that's a good way to
think about it. Having the data is important. Now, are you going
to use the data all the time? No, I don't think you will. I
think it does. Inevitably, there's a little bit of analysis paralysis: if you're looking at too many
data points, at some point along the line you're not going to be able to determine which one is
actually driving results. If you're focusing on 50 data points in a given report, what's
really a leading indicator? What's really a lagging indicator? And I do, I think it's just understanding
which of those data points are important.
And again, thinking about this kind of how we're gonna look
at it is when is the data good?
When is the data not so good?
When do you need more?
When do you have a sufficient amount?
I think one of those things that Google does a really good
job of in their experiment side of things for Google ads
is when they talk about
comparisons, A/B tests and stuff like that, whether a result is statistically significant. That is something
they really make a good point of saying: okay, there are some changes here in the cost
per click or the conversion rate or whatever the engagement rate is, but it's not statistically
significant. You can always look at the numbers and say,
ooh, that's a really critical point,
but is it actually moving the needle?
Is it actually a key performance indicator?
Is it an indicator that you're actually going to want
to base other business decisions off of,
or is it kind of a fluffy number
that may or may not mean anything?
I mean, it is, it's a number that's telling you something,
but it's not necessarily moving the
needle for you there. So yeah, I think that's kind of how I've
looked at it there. And I do, like I said, I always want more.
But I will be the first to admit that there are times when I
look at too much data and say, I don't know what to do with this.
So that's, you know, we got to kind of find that right balance
there.
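To make "statistically significant" concrete: below is a minimal sketch of a two-proportion z-test, the standard check behind A/B comparisons like the Google Ads experiments Paul mentions. The traffic and conversion numbers are hypothetical.

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_a / n_a: conversions and visitors for variant A;
    conv_b / n_b: the same for variant B. Returns (z, p_value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no real difference).
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: variant B "looks" better (2.4% vs 2.0%)...
z, p = two_proportion_z_test(conv_a=40, n_a=2000, conv_b=48, n_b=2000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p ≈ 0.39 here: not statistically significant
```

With these made-up numbers, a 20% relative lift still isn't significant at a 2,000-visitor sample; that's exactly the "small sample size, bad decisions" trap from the top of the episode.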
100%. So I think that I'll go down some examples here. And
we'll kind of maybe trade off one by one here
as we get through the...
Sorry, first one, when to ignore the data.
I think data from some tools,
not only can it be a little overwhelming
to your point from a second ago,
not only can it be a little bit hard to understand
at times,
it can straight up try to mislead you.
So here's an example that I've got.
Meta or Facebook will, when you run ads,
they love to take credit
under view-only conversion tracking. In fact, there's not really a way to turn off
view-only conversion tracking.
Meaning, I'm running an ad on Facebook.
I'm running an ad to target all my past guests, for example,
and I'm trying to get them to come back and book
one of my lovely vacation rentals
at this fictional company, Conrad's Cabin Rentals.
So what Facebook and Meta will often report
is that someone saw that ad converted,
and then therefore in the Meta ads dashboard,
if you have the right conversion tracking hooked up, they'll say, Hey, someone
who saw one of our ads converted, therefore, I'm marking that as a purchase that was attributed
to my ads. Maybe, right? Here's the trouble that I have with that line of thinking. They
saw the ad in the sense that the ad was served to that person. But did they see the ad? Were they
really influenced by the ad? Or was it just kind of washing over
them amongst the many other things? And the answer, by the way, here is I don't know the answer to
that question. It really depends. It's completely situation dependent. Some people may see that
ad and go, Oh, my goodness, I need to go and book this vacation rental right now. They'll
go and do a Google search, which is often how people book, right? They might see an ad
on Facebook, do a Google search, and make that reservation. And well, then Google gets
credit in our analytics tools, right? Like, the data would tell you that Google converted;
Facebook is telling you that they got the conversion.
Who's right, who's wrong?
The answer is they're both right.
They're both wrong.
You know, it's kind of one of those problematic pieces of the equation there.
So that's a good example in my mind of, like, you've got to maybe either be
willing to ignore that data, or at least not be willing to take that data at face value,
particularly when ads can get stuffed in a right side rail on Facebook, on desktop.
And I mean, I see ads load up there all the time
on my just personal Facebook usage.
And I go, I couldn't tell you a single advertiser
that I've ever seen convert off of a right rail ad,
myself personally.
So I feel like the guest on the other side
is probably not converting on those,
but Facebook can serve an ad there
and then take credit for it and say,
hey, this person saw an ad and converted later on.
So the attribution mess is one that we probably
can't untangle in the next 30 minutes or so,
but that's a good example of one where a lot of tools
will try to take a lot of attribution credit.
We actually tested Pinterest ads for a client this summer.
And Pinterest would take a 30-day view conversion
and a 30-day click conversion.
So if they view that ad and then convert
at any point in the next 30 days,
Pinterest is saying that they influenced that conversion,
as if you can remember a pin you saw 29 days ago.
I don't think so.
So that's a bit egregious.
But the point is, some of this data is set up and almost designed in a way, I think, to mislead you a little bit,
or at least make you think there's more influence there than there is typically. So thoughts on data,
view tracking, and Meta?
Attribution, it's one of those data points where everybody
wants a piece of the puzzle. Everybody wants a piece of the pie, whatever that is. And the
bigger piece of the pie you can demonstrate, the more likely it is that you're going to use
that channel, because you're seeing those things. So I do, I think attribution is such
a tricky one, especially with Facebook. Now, I think that's something where we'll throw
in just a sidebar of sorts here for a second. That's where, when we talk about agency
work and bringing
everything under one roof, that is something I think that when you have two different agencies
doing two different things and trying to justify their own costs and doing stuff like that,
I think that's where ultimately the end user, the advertiser, fails or gets the short end
of the stick there because your Facebook agency is trying to get to this attribution over here.
People running Google are trying to show something over here, and indirectly you're kind of playing
them against each other just because, well, no, this is a 30-day window.
This is a fact.
I mean, have I gone through and increased the attribution windows excessively at times
because I understand it's a longer buying process?
Yeah, I absolutely do think that's the case.
So is that good data?
Is that something that is really beneficial to us?
And until you have some uniform way of collecting that attribution data, it is.
It's hard to say.
I mean, that's where we've talked often recently about the data-driven attribution.
And how does that really, you
know, where does that pick up little touch points along the way? Is your
Google Ads conversion getting a 30-day window or a 15-day window or a 90-day
window or what does that actually look like there? And you can, you can adjust
that. So that's something, I guess, anytime you're measuring the data
period, you have to determine, you know,
is that good data, bad data?
The old, you know, the CRM example being good data in,
good data out, bad data in, bad data out.
Well, if we're making all of our decisions with bad data,
are we making good decisions?
Are we making data-driven decisions with bad data?
So I think that's something that,
especially on the attribution side,
if that is bad data, where it is a one-day view, or it's a 30 day view window or something like that,
are we really making the best decision about which campaigns are going to drive the best
performance down the road?
Yeah, probably not.
Yeah.
No, that's the thing too, is like, you mentioned a good point there where it's like two different
agencies that are sort of battling, which, I mean, we're not in a lot of those relationships, you know.
No, no, we are in a few of them, though. And yeah,
it's always true. I mean, like, there can be respect and
mutual understanding, like, hey, this is the sandbox you're
playing in this the sandbox I'm playing in. But imagine the
client on the other side of that, I frequently think about
their problem, where it's like, alright, this company's running
the Google ads, this other company is doing my SEO, for
example, well, a lot of them are gonna, you know, take credit for
the other's work in some respect, and say,
that's fair, this is fair. So it does make it hard. I don't envy
people in that position. Because imagine a scenario where you got
10 direct bookings, just to make it round number and make it
simple. And the Google Ads person is saying I take credit
for eight of those and the Facebook ads or meta ads person
is saying I take credit for seven of those. And you're like,
wait a second, that adds up to more than 10 bookings. How is
that possible? You know, it's like I can see someone getting
confused. Ultimately, they're both taking partial credit for that conversion. And to be fair,
like, to be honest here, they both might deserve some credit: you might see the ad on Facebook,
yes, but convert on Google, and vice versa. So it is tricky. There's not a one-size-fits-all.
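A minimal sketch of that double-counting, with made-up booking IDs: each platform counts every booking it touched, so the per-platform claims of eight and seven overlap on the same ten real bookings.

```python
# Each platform reports every booking it "touched", so per-platform totals
# can sum to more than the actual number of bookings.
google_attributed = {"B01", "B02", "B03", "B04", "B05", "B06", "B07", "B08"}
facebook_attributed = {"B04", "B05", "B06", "B07", "B08", "B09", "B10"}

actual_bookings = google_attributed | facebook_attributed  # deduplicated union

print(len(google_attributed))                        # 8  <- Google's claim
print(len(facebook_attributed))                      # 7  <- Facebook's claim
print(len(actual_bookings))                          # 10 <- what really happened
print(len(google_attributed & facebook_attributed))  # 5 bookings both claim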
The one like real gut check moment here would be turning off your ads on one platform and seeing
what happens, right? That's the nuclear option where it's like, what if I turn off all of my
Facebook ads? Let me see what happens to my overall conversion rate. And stuff like
that can be tricky with seasonality and things like that,
right, to do like a fair two-month comparison or something
like that. And the same thing could be said for Google. And I
think it's always interesting. I saw this heuristic a while ago,
I forget who I saw it from. But it was if you have a service so
essential where if it turns off, you know, your clients or
customers are calling you on the phone trying to get it re-enabled,
like, if my internet cut off, I'd be calling Spectrum and being like, yo,
I need my internet to do my job, right? Which proves
that Spectrum could basically charge me whatever they wanted
to for internet, and I would pay because I needed that much.
Funny little note there. But yeah, if your Facebook ads
card goes down, you know, I've found clients are a little more laissez-faire
about it. When a client's Google Ads card goes down, and
they know the data, they're on it, you know, like, hey, we
got to get the new card in there today. Like, we don't want to
be missing opportunities from a guest booking perspective. So maybe that
answers the question right there on which methodology I think is a little bit more solid
with that tracking. All right, let's flip the page a little bit: when to collect more data, or
when to kind of, you know, gather more data. So this is one that we talked about at length
previously, Paul, talking about the length of certain things, people will get a number in
their head one time, hey, the average homeowner stays on for x number of years, maybe your company,
that's two years, or maybe it's five years, maybe it's seven years, or 10 years,
whatever that number happens to be. And then they're not going back and updating their
assumptions with enough regularity, and they kind of end up making some bad decisions. So I think
you've seen this up close and personal over the past few years, where some people have really bad
homeowner retention, some people have phenomenal homeowner retention, and of course, everything
in between. But you can't assume, well, because some companies keep a homeowner for 10 years,
I'm going to keep a homeowner for 10 years.
Maybe you do, maybe you don't, but you got to go back and measure it.
And most importantly, I think you've got to go back and collect that data frequently.
Maybe every year you've got to go update that and say, my most recent cohort, how do they
look?
Are they going to keep signing on or am I getting some people peeling off already?
So that's the example we had here.
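A quick sketch of the cohort check Conrad describes, measuring what share of each signup-year's homeowners are still on the program; the owner records and dates are hypothetical.

```python
from datetime import date

# Hypothetical homeowner records: (signed_on, left_program or None if active).
owners = [
    (date(2019, 5, 1), date(2023, 6, 1)),
    (date(2020, 3, 1), None),
    (date(2021, 7, 1), date(2022, 9, 1)),
    (date(2022, 1, 1), None),
    (date(2022, 8, 1), date(2024, 2, 1)),
    (date(2023, 4, 1), None),
]

def retention_by_cohort(owners, as_of=date(2024, 9, 1)):
    """Share of each signup-year cohort still on the program as of a date."""
    cohorts = {}
    for signed, left in owners:
        total, retained = cohorts.get(signed.year, (0, 0))
        still_here = left is None or left > as_of
        cohorts[signed.year] = (total + 1, retained + int(still_here))
    return {year: retained / total
            for year, (total, retained) in sorted(cohorts.items())}

print(retention_by_cohort(owners))
# {2019: 0.0, 2020: 1.0, 2021: 0.0, 2022: 0.5, 2023: 1.0}
```

Re-running this every year against fresh records is the "go back and measure it" step: the most recent cohorts tell you whether the number you have in your head still holds.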
Maybe you could expand on that a little bit more as it relates to assuming metrics, homeowner
retention being a good example of a metric that people often assume, at times
incorrectly. I think retention is that key one, and revenue too. I think
retention and revenue are two of those things that you just don't know what's
going to happen. And it's such a critical part of that relationship with
those homeowners. It doesn't need to be just
about the number. You've got it in the outline:
don't assume metrics you can't control.
That is something that I think any professional
property manager has made a wrong projection,
rental projection, ROI projection, anything like that,
over the past few years.
And certainly that's not uncommon.
I mean, it is in our topsy turvy world of bookings and revenue
and everything like that and pricing tools and everything that you can control and you
can't control. I think ultimately, you can't control demand for your market. You can't
control the number of people who are going to do searches. You can't control the number
of people who actually own homes and who are interested in listing a home in your area.
And that's something that, unfortunately, we've felt more often than not, is that, yeah,
I can find you 5,000 absentee homeowners in your area, but does that mean that they are
the right owners, that they are the right people to be marketing to and doing things
like that?
We get into a lot more of those assumed metrics
on the owner's side because you do, you have to sell a little bit of dream there. You're
selling an experience on the guest side of things where you can create that desire and
sell them a dream, but it's that experience dream. On the owner's side, you have to sell
a dream of this is what I'm going
to produce for you. Ultimately, your home is going to be cleaner, safer, better maintained,
and it's going to earn you more money than it does right now as a long-term rental, or than
it does as a short-term rental with you self-managing it, all these
things. You're selling against a lot of different things. So to be able to collect
as much data as possible, to be able to, again, you're not going to know every number, but to
know what data to collect to move the needle for that homeowner story, I think is one of those
things where you may have to dig more and more than your competitors, or you may have to do some
things that are a little different. We talked about last time,
you might have to go to the county records,
you may have to go and find out
what specifically you need to collect on the data side.
So I think data is very important
on the homeowner side of things.
And it does, it takes a lot of different forms
on the homeowner side of things.
But the more data you can present
is going to be very beneficial for that homeowner and
the entire experience you're going to have with them. Yeah, I think the main trouble that can
arise from making assumptions is that you may hear an industry standard, but any industry standard
usually comes with a bunch of averages. And of course, in any average data set, there's going to
be outliers, there's going to be people that are way below and way above those sort of metrics. So
I think, yeah, you
can hear from someone and say, hey, the average homeowner may
stay on your program for 10 years. Awesome. Your number
may be two, I mean, heck, it may be 12 or 15, right? Like you may
be above average. But knowing that is too important.
That's too important of a thing to mess up. The other thing I
was going to react to is that you talked about some of these
things are really subjective. Like, you know, if I go
through a year with property manager A managing my property,
and I go through a year with property manager B
managing my property,
revenue is such a black and white thing.
Well, this person delivered X, this person delivered Y.
I can compare that very easily.
Better reporting, you know, okay,
maybe I can measure that a little bit.
A better cleanliness, that's kind of hard to measure.
Maybe I can, like I can look at the number of complaints,
you know, that sort of thing.
But a lot of these things are kind of subjective.
You know, like, do I like working more with property manager A or B?
I think sometimes people overstate how important revenue is when they really
sometimes just want someone taking care of their asset, taking care of their
home, and yes, they want revenue.
Revenue is amongst the key factors in their, you know, in their
overall matrix of, you know, data points they need to collect, but it's not
the only thing they care about.
There's usually other things that play there.
And I think some people understate that or misstate that.
So I like that one quite a bit.
Um, and yeah, you know, like
the example I gave was legit. We have a client who has lost 30 units, in the last like 12, 15 months or
so, due to property sales. I mean, property prices in this
particular market where this client is based are the highest
they've ever been. I mean, like we're talking double, triple what
anybody expected the property to be worth. I think a lot of
owners looked at it and go, I like going here. I like this
property. I like the money I'm earning. You know what else will be nice selling it for
$750,000 and putting that money in my bank account. That also sounds pretty nice too. So I'm gonna go
ahead and do that. And sorry, Mr. Property Manager, I've loved working with you. And maybe you can
send me a Christmas card, but I'm not going to be owning a property in this market anymore.
It's run up triple what I expected it to be. I'm going to shake hands and move on. Right. So some
of these things are completely outside your control. For sure. All right.
Going back the other way, when to ignore the data or when the data can be very misleading.
Here's the example, because I've encountered this one quite a bit myself, a large but dead email list.
So low open rates, low click rates. And for those that are engaged, though, you might get some benefit from it.
So this is, unfortunately, a relatively common thing that we have
encountered many times over the years here at BuildUp Bookings. So a client comes to us, and I think,
I don't know if it's like a little bit of a male-ego,
trying-to-impress-us type thing, that
seems to happen more in that scenario in my experience.
But we're on a sales call, we're early on in the process
and we go, what's the size of your email list?
And people always wanna say the biggest possible number.
They always wanna say,
well, I got an email list of 10,000 people.
I got an email list of 20,000 people.
They wanna say the biggest possible number.
And I go, awesome, tell me a little bit
about how you've been communicating to them.
Let's say over the last year or something like that.
Well, you know, then, and then the, uh, the nuclear bomb gets dropped.
Um, yeah, we haven't been sending any emails out.
So I'm like, honestly, you could have every email address on planet Earth.
You'd have 6 billion email addresses.
If you don't actually send to them, then we're actually not going to get any
benefit out of it, number one.
And number two, it's going to kill the performance of that email list over time.
Cause people just forget about you. It's this uncomfortable,
unfortunate fact of marketing, which is that all the marketing we do is forgotten, you
know, 24, 48, you know, 72 hours later, after we do it, people forget about it because they
get inundated with 1000 more marketing messages. So that's all the bad, I guess. But it can
be very misleading to go back to the premise of, you know, this point here, because we
go look at the data and we go, we have clients that get five, seven, 8% open rates. And I go, that's objectively
horrible. We have clients that get 60% open rates, but actually wouldn't trade one for
the other. Because would you rather have 60% open rate on a small list, or 10% open rate
on a massive list? The answer is you'd rather actually have a 10% open rate on a massive list,
because it's actually going to provide more value to your business,
particularly if those folks convert. And that's sometimes what we see, you know, these large
email lists that have, you that have relatively low open rates,
but the people that do open actually go through the process,
they make a conversion, they make a booking on the website.
So email marketing is still a net plus benefit
from that perspective.
Now the obvious answer that someone listening right now
may go, well, dummy, just go clear out all the dead ones.
And that's kind of a logical thing to do.
I don't disagree with that.
The only trouble I've had with that recently
is that that cohort of that 10% open may change a lot over time. So if it's always the same 10% opening over and over
again, that's an easy call, right? We can just go ahead, hey, it's been two years, this person hasn't
engaged, let me go ahead and dump them off. But when that 10% is changing a lot, and it's like,
well, it's this 10%, then it kind of shifts around over to this 10%, the actual number of people that
didn't open a campaign over a 12 or 24 month period may actually be a lot lower than you think. It
could be only 20% of people who didn't open anything over a year. And even
that metric, by the way, has gotten a little bit trickier. Again, when can you trust your
data? Now so many email marketing platforms block open, you know, statistics and block
open data. So it could be that folks are actually opening that email. Maybe they're not clicking
on it because that's a little bit more reliable form of tracking, but they're opening on it
and they're not doing anything with it. And you assume, well, you know, like, they're inactive, go ahead and remove them when they
were getting every email and dreaming about going to your
destination. And then you take them off the list, and they
actually end up wanting to book, you know, a year later. So I'm
a little more gun shy about removing folks from lists than I
was, you know, in the past, because I've seen lots of
stories and examples of our own marketing, where it's not
always clear, you know, what that what that change looks like.
So anyways, maybe you have some similar stories here. But yeah,
when to ignore the data:
email marketing.
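A small sketch of the shifting-10% effect Conrad describes, with made-up subscriber IDs: each campaign opens at 10%, but the set of unique openers across the year is much larger than any single campaign suggests.

```python
# Made-up open logs: campaign -> set of subscriber IDs who opened it.
opens_by_campaign = {
    "jan": {1, 2, 3},
    "feb": {4, 5, 6},
    "mar": {2, 7, 8},
    "apr": {9, 10, 1},
}
list_size = 30

per_campaign = [len(openers) / list_size for openers in opens_by_campaign.values()]
unique_openers = set().union(*opens_by_campaign.values())

print(f"avg per-campaign open rate: {sum(per_campaign) / len(per_campaign):.0%}")  # 10%
print(f"opened at least once this year: {len(unique_openers) / list_size:.0%}")    # 33%
```

So a "10% open rate" list can still have a third of subscribers engaging over a year, which is why pruning everyone who missed the latest campaign can cut real future bookers.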
I think on the email marketing side, the most important data
point is going to be the bounce, like, if you're bouncing,
it's those negative signals. I think the positive
signals, yes, you're using those to understand how much of the
audience you're reaching and stuff like that. But I would
agree, I'm very slow to remove someone
from potential marketing communications
just because.
It is more of the long game.
On the inventory side, it was, okay,
put them in the cold email sequence
or the long-term nurture sequence.
I think that's something where you do need to think about what a more
long-term nurture sequence looks like, maybe it is an email every three weeks or
something like that.
But that will build up some of those other more important KPIs, the open rates, the click
through rates and those true engagement rates.
But I think the most important data to look at, though, can be those negative signals,
the unsubscribes and the bounces, because that is going to ultimately affect your deliverability,
which gets more into the technical side. But ultimately, you know, strategically,
can you email as many people if you're getting more negative responses and stuff
like that? So I think that's one of those where you can ignore some of those
performance metrics on the email side of things, but really focus in on the things that are going to impact your deliverability.
Yeah, I agree. 100% there.
All right, we've done email marketing. Let's flip it back the other way: when to get more data.
This is one I'm pretty adamant about, curious your perspective on it. So everyone loves to talk about pricing, you
know, like there's a billion pricing tools out there. We've
got pricing experts out there. And these people are critical,
by the way; I'm not demeaning their importance or their value
in the equation. But it feels like no one ever talks about
testing fees. Like, every once in a while, you hear a
little pocket of, like, oh, we tried this fee, we tried that
fee, that sort of thing. And like it or not, fees are here to
stay, they're an essential part of kind of the stack of pricing as
it comes to marketing and promoting,
ultimately selling these vacation rental properties
and things like that.
But no one ever says,
I've never had a client really come to us and say,
you know what I'm gonna do?
I'm gonna try a cleaning fee at $50,
and then we'll try a cleaning fee at $150
for the next 30 days.
Let's do an A-B test or let's put it
on this set of properties,
not this set of properties and see what happens.
But ultimately things like that can make a huge difference
as far as your profitability,
as far as what side of the ledger
some of those dollars kind of could flow into,
whether it's your side, whether it's the homeowner side,
whether it's just going to a marketplace,
if you're getting a booking on a listing site
and so on and so forth.
So it seems like no one ever tests fees.
Like they just kind of come up with,
oh, okay, the cleaning costs X, I'm just gonna charge X.
Or this particular thing costs Y, I'm gonna charge Y.
And no one ever does any testing for it.
So just, I don't know if I have a really good like expanded thought here other than like
it seems like an obvious thing to be testing, because fees impact every transaction, every
booking you make; that impacts your profitability significantly. I was talking
with Brooke about this not too long ago, and they changed their res fee when he was at Advantage years
and years ago, and he said it just went from $39 to $99, we saw almost no drop in conversion, and it literally
tripled our profit, basically, you
know, by that one decision.
So like that cemented in my mind that like, this is a really solid thing that people should
be testing around fees, because we test the heck out of pricing
as it relates to rent.
And it feels like we do very little, you know, price optimization when it comes to fees.
So I don't know if you have anything extra to add in here, but it's just more of a battle
cry than anything on testing those things.
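A rough sketch of the math behind that res fee story, using hypothetical booking volume: how far conversion could fall before a $39-to-$99 fee increase stops paying off on that line item.

```python
# Hypothetical baseline volume at the old fee.
bookings_per_month = 100
old_fee, new_fee = 39, 99

old_fee_revenue = bookings_per_month * old_fee      # $3,900/month in fees
# Bookings needed at the new fee just to match the old fee revenue:
breakeven_bookings = old_fee_revenue / new_fee      # ~39.4 bookings
max_conversion_drop = 1 - breakeven_bookings / bookings_per_month

print(f"fee revenue survives up to a {max_conversion_drop:.0%} drop in bookings")
# ~61%: with "almost no drop" in conversion, the fee line item grows ~2.5x,
# which is roughly how one number change can multiply profit on that line.
```

That asymmetry is the argument for testing: the downside has a lot of headroom before it outweighs the upside, but you only know where your market sits by measuring it.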
I'd be very curious about the results of some of those tests.
I think that's especially true when you do. And again, one of the bigger black spots on Airbnb right now is fees and having that full transparency into the fees.
I mean, that's something that the consumer base is wary of, why that's something we all have to
take into consideration and not put those junk fees on.
Thank you, Ticketmaster.
But it's where we're kind of bouncing around to.
It's probably not as much so on the homeowner side,
although we have seen people test
kind of those commission rates
and test different special offers,
but again, it's not quite the same way
where you're impacting,
let's say, 20 transactions a month on the booking
side, hopefully more, depending on what you have in your portfolio, but on the homeowner side you're probably only
going to do it once or twice. You're probably only going to sell to one or two homeowners,
or bring one or two homeowners, maybe three or four homeowners, on in a given month. So
I think that really is one of those things where you can kind of make those tests. But I think
any time you're testing rates, rent, you know, any special offer, really
any promo, that is something where you really have to have
enough data to be able to make sure that what we're talking about is,
going back to statistical significance,
actually significant. You can't just assume, just because,
you know, you offer, I'm going to guarantee you a booking in the
first 30 days or we're going to pay you $5,000.
That might be one of the reasons that caught their eye, but it's not
necessarily what brought them in and made them choose your management company ultimately. So I do,
I think anytime you are moving numbers around, anytime you're testing those variables,
you have to ask: is it actually that number, or is it different numbers? Is it the fees versus the actual
rental rate, or whatever that is there? So,
it's fun. Yeah.
Well, that explains why it's so hard to do these tests, to be fair, right?
Like to give kind of the other side for a second is that you're testing during very different
periods and that does make it more tricky.
Awesome.
Well, a few more things to get through here.
So this is a good one, I think, on the when to ignore the data side of things.
Marketing something new or unproven, right?
If you had a chance to kind of study some of these so-called viral Airbnb projects,
talk with Isaac before he sold Live Oak Lake, for example, and it was like, there wasn't a lot of data to, like, indicate that that was
going to be successful. So, like, you couldn't go into, like, Waco, Texas on
AirDNA and be like, oh yeah, like people are definitely going to pay, you know,
three, four, or $500 per night for, like, a small, you know, resort-style,
Scandinavian-inspired property. Like, there was no data to indicate that was
going to happen. And yet, you know, he kind of forged ahead and was very
successful with that project. Pacific Bend is another
one that comes to mind on Instagram, I've kind of
followed that account for a little while now, building
shipping containers in the middle of the forest, like, again, not
a lot of data to necessarily indicate that that's going to
work. So if you went into AirDNA or any, you know, insert
your tool of choice here, and you tried to justify these
decisions, you would have a really hard time justifying them.
And yet they've been very successful. So yeah, just more
of a comment or a note there on, like, when you're
introducing a new type of property, a new type of inventory into the market,
you can't really go and rely on past data very much, you kind of have to like, forge your own
predictions, you've got to kind of skate really far ahead of where the puck is going. With regards
to like, guest demand, you've got to rely on your own taste, you've got to rely on maybe a little
bit of data around things like demographics, or you know, maybe, oh, I know how to make videos,
therefore, my projects will be successful, because I can get a lot of people to watch my videos; these
are all like reasonable sort of assumptions
to have. But you certainly can't say I based it on the data
because there was no data to kind of back up, you know, some
of those theses that you have there. So, you know, not that
this is super common, I would argue, to be honest with you,
Paul, in the vacation rental management world, where it seems
like people want properties that kind of fit into a box. I mean,
they want great quality properties, of course, but you
know, they don't want something that's that abnormal or that
unique, like it all kind of should fit into like a similar
kind of brand design, you know, brand ethos and stuff like that. So I feel like this is more often done by that individual host, that individual owner, that wants to, like, do something a bit wild. We've seen these happen over the years. And yeah, I just think data is not your friend in these scenarios; you've just got to sort of jump in and not know exactly where you're going to land, unfortunately.
I do. I think that's the same concept as people who do condos, condos, condos in some of these beach markets, and then they bring on a full house. And it's
just kind of like, you can give the projection that you want to, or you can anticipate what's
going to happen, but you don't know. You just don't know. And I think that that's something
that on a larger scale, yeah, I think the Live Oak is a great example of that,
of figuring out what that ultimate audience was going to be
and justifying the build that way.
But I think that's where I've seen it most frequently
is you get these people who put in significant periods of time,
months and months and months of selling,
to get these incredible properties on,
and they've done everything based on the data and the models
that they have in place. But this is a different time. So yeah, I think you
do, you can certainly try to model things out and project what things are
going to do. But until you actually have one of those big, beautiful new properties in your
portfolio, you don't know what's going to happen.
So I do, I think you have to kind of go outside of,
be willing to take those risks
as you probably were in bringing it on initially.
And then ultimately you got to deliver on those expectations.
So I think that's a perfect example.
And something that kind of falls into that,
knowing your brand, knowing what you know,
and then figuring out what you don't know along the way.
It's not compromising on the brand there.
That's maybe that gray area, but you're going to learn some things
there and probably create and collect some of your own data at that point too.
So,
Yeah, well, actually, that pieces nicely into where I think people
could stand to use more data.
So once you've got a website stood up, you know, I think this is one of those
areas where people just tend to, again, there's so many metrics, it's
like, what in the heck do I even look at, you know? Time on site or time on page feels
like a decent one. If people spend a lot of time on a page, that's probably a good
sign. Generally speaking, I think that's true. I was talking to a client recently who had
a pretty high time on page from certain traffic sources, and a really low one from other traffic
sources. And I go, look, generally speaking, over the long term, you're going to get more
conversions from this one where people are spending two and a half, three and a half,
four minutes on this page, versus people
that are spending 25 seconds.
That's just the facts of the situation,
with all the data that I've seen over almost a decade
at this point.
But I would argue that sometimes it's
better to collect more useful data.
So for example, time on page, you can get misled by that.
Someone could spend a lot of time on the page
just because they're looking at the photos.
Like, I think I've shared this example before,
but we have clients that have actually gotten viral
traffic from Pinterest before,
because people just like the way the property looks.
That's like cool cabins.
And then they go and look at the cabin
because it's aesthetically pleasing,
but they're not actually looking to book it.
The cabin's in Tennessee and they're based in Florida
and they have no desire of traveling there, right?
So you gotta be careful with some of those things
because they can get misleading.
One thing that I find is more effective is looking at
how people are using the website,
so not what they're doing
just purely from a time perspective, but, for example, putting dates into a property detail
page (love that), or starting the checkout process. These are very healthy signs of engagement. These
are metrics that usually correlate a ton better to actually people making bookings in my experience
versus other data points like time on site. So these are two that come to mind from my perspective
as really much more solid ways to collect data. If people are starting the checkout process,
they put in dates, they start the checkout process.
I would take 10 of those every day
over someone that's spent 25 minutes on the site,
not knowing if they're actually gonna
go through that booking process or not.
So yeah, I don't know if you have any examples like that
on your side of things, but those are ones where I think
collecting that data or setting up that tracking
can be really impactful.
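As a sketch of what "setting up that tracking" can look like server-side: sending a dates-entered event and GA4's recommended begin_checkout event via the GA4 Measurement Protocol. Typically you'd fire these from the site with gtag or Tag Manager instead; the measurement ID, API secret, client ID, and event parameters here are all placeholders.

```python
import requests  # third-party: pip install requests

# Placeholders: substitute your own GA4 measurement ID and API secret.
MEASUREMENT_ID = "G-XXXXXXX"
API_SECRET = "your_api_secret"

def send_event(client_id: str, name: str, params: dict) -> int:
    """Send one event to GA4 via the Measurement Protocol."""
    resp = requests.post(
        "https://www.google-analytics.com/mp/collect",
        params={"measurement_id": MEASUREMENT_ID, "api_secret": API_SECRET},
        json={"client_id": client_id,
              "events": [{"name": name, "params": params}]},
        timeout=10,
    )
    return resp.status_code  # GA4 returns a 2xx (204) on receipt

# "dates_selected" is a hypothetical custom event name; "begin_checkout"
# is a GA4 recommended event, so it appears in standard funnel reports.
send_event("555.777", "dates_selected", {"property_id": "cabin-12", "nights": 3})
send_event("555.777", "begin_checkout", {"currency": "USD", "value": 1250.0})
```

Once those events flow in, "dates entered" and "checkout started" become columns you can compare against bookings, instead of leaning on time on page.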
That thing, setting up the tracking,
that's what's so important.
Knowing how many people come to your website,
obviously that's important.
Allows you to create that baseline of what's your conversion rate, you know,
how frequently are people getting there? Are they making it down the funnel doing all those
things? But I think that's all it is. I talk about clarity, I talk about understanding,
you know, that's certainly a way that we understand not just how long people
are on the site, but how engaged are they? How many clicks are happening? How many clicks
are happening in a very short period of time? You may have a 20-minute session,
but there's only like two and a half minutes of activity there. So that's the other thing
to understand is that clicks are great. Clicks of buttons, clicks of calls to action are
really what's actually driving the activity down that funnel there ultimately. So I do,
I think where Google Analytics does, and most analytics platforms do a good job
of presenting you with the high level information,
that 10,000 foot overview,
I think you can be much more effective
in collecting more data at that 500-foot overview,
and just making sure that you're not just taking
what the template reporting gives you,
you're really going down to the items or the data that makes your business money,
makes your website run more effectively, whatever that is.
But I think the website is the best example of there's a lot of data there.
Understanding what is actually moving the needle for those metrics in analytics or in some other system that you're using,
that's where you're actually moving the needle,
making changes based on those,
what that data is telling you
and hopefully improving the overall performance
of the website.
Awesome, I think I've got time for one more here
on my side of things.
So I'll throw it to homeowner marketing.
We'll go back to you pretty quickly
on this one, Paul.
So this is when to ignore the data.
And this is where I think, again,
going back to that idea of data can be misleading.
So the example that I have here is homeowner marketing.
Sometimes the thing that gets the best response rate may not actually
provide the best quality of leads.
Or I can do a quick guest marketing example, just to kind of prove a different
data point: the ad that gets the best click-through rate may not always provide
the best conversion rate. You would think that's the case, but sometimes not.
I'll give an example of that really, really fast.
So we have a client, very luxury, high-end properties.
We're talking $2,500, $3,500 a night minimum.
They really go from there, depending on the holidays
and things like that.
We have an ad unit that converts very well in Google
that says in the headline,
prices starting at $2,500 per night.
Now that kills the click-through rate, right?
The click-through rate's kind of crappy compared
to every other ad we run on this account,
but it vastly improves the conversion rate.
Right, so that's an example where we're actually sort of excluding certain people on purpose.
If you don't want to spend 2,500 bucks a night, basically, it's sort of implied: don't
click on this ad. And that seems to really hurt the number of people clicking, but those
that click understand maybe a little bit better of what they're getting into. And by no means
does this have a hundred percent conversion rate or even close to it, but it converts
better, almost like twice or three times as good, depending on the data set we
look at, than the more plain ad that just says, come check out our luxury vacation rentals. So
I'll kick it back to you on the homeowner example. Again, response rate or interest rate may not
always provide quality. And maybe that's something you can speak to before we wrap this one.
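Back-of-envelope math on that trade-off, with hypothetical rates: bookings per impression is CTR times conversion rate, so an ad that halves clicks but more than doubles conversion still comes out ahead.

```python
# Hypothetical rates for the two ad variants described above.
impressions = 10_000

plain = {"ctr": 0.040, "cvr": 0.010}   # "come check out our luxury rentals"
priced = {"ctr": 0.020, "cvr": 0.025}  # "prices starting at $2,500 per night"

for name, ad in [("plain", plain), ("priced", priced)]:
    clicks = impressions * ad["ctr"]
    bookings = clicks * ad["cvr"]
    print(f"{name}: {clicks:.0f} clicks -> {bookings:.1f} bookings")

# plain:  400 clicks -> 4.0 bookings
# priced: 200 clicks -> 5.0 bookings, from half the clicks
# (and, on a pay-per-click channel, roughly half the click cost)
```

Judged on CTR alone, the priced ad looks worse; judged on bookings per dollar, it wins, which is the whole point about not taking a single metric at face value.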
Yeah, I mean, on the homeowner side of things, it's, unfortunately, you kind of have to,
well, it's not unfortunate, but you have to throw the whole omnichannel approach
at these individual users, because it is, it's a different sales cycle that you're doing.
And we fell into the hole a couple of times of, we would turn specific channels
off because we didn't think they were performing. And then we saw pretty dramatic reductions
overall to just the performance of everything. So I think ultimately
that's kind of a position that you don't want to get yourself into, whether you're ignoring data or
whether you're collecting more: coming to a point where you turn a channel off, and that's
when you understand that you didn't have enough data built up. That's when you understand,
oh, this was making a positive impact on my
overall marketing strategy, I need to turn that back on. That's something that too often,
on the Facebook side of things, I have had that experience of Facebook isn't converting at all.
This is terrible. I mean, why would I go through a channel where I have a 0% conversion rate, that's
not engaging on the website, where nothing really good is happening? Well, it's because when I turn that channel off, all the other channels have that trickle-down effect of
nothing, nothing's working. So I think ultimately it comes down to how you're reading the
data. Get it in a way that you can make business
decisions off of, but make sure to understand that if the data is telling you,
oh, that was the wrong channel to turn off, either, you know, turn it back
on or find a way to more effectively use that channel. Because I think on the
homeowner side, we do see that a lot: oh, I'm not going to do postcards
anymore because they're expensive and I'm not getting any leads.
Well, you may not have gotten those leads through that channel initially, but
there's a lot of steps along the way, talking about that
attribution path, and really understanding all the touch
points along the way, and not taking one out just because of that
gut feeling of, I don't need it, I'm spending too much, I'm not
getting the leads back from it, or I'm not getting the
return that I anticipated. That maybe was a heavier driver than you thought.
So yeah, right on.
Awesome.
Well, I think that's all the time we have for today, Mr. Manzey.
Again, our irrational exuberance comes through
with respect to NFL games.
Our rational exuberance comes through with respect
to data-driven marketing.
So this episode is permission:
if you're the person who skips over the data,
saying, no, I know how I feel,
I know how my business is going,
slow down, get some more data, get some more information.
Let's kind of blend those two things together.
This episode is also your sort of warning if you're maybe a little bit more like me
and Paul, and you want to rely on the data, you know, a little too much in your
decision-making process. Well, let's think of the context. There's nuance. Nuance,
that's a good word for today's episode. There's nuance and there's data. We've got to blend
them together, you know, sort of like a perfect blend of all things, chocolate and peanut butter, if
you will, to get the right outcome. So if you made it all
the way this far deep in the episode, we super duper
appreciate you. One thing that's kind of data driven, but
makes us feel good is to leave us a review. So go to your
podcast, stop the choice, click five stars, we collect these
reviews, we make they mean a lot to us. Those more people listen
to show so we super duper appreciate that. Otherwise, we
thank you listener for getting all the way to the end, it will
be have an awesome day. And we'll catch you on the next
episode of heads in bed show. Thank you so much.