The Data Stack Show - 210: From Reporting Basics to AI Automation with Eric Dodds and John Wessel: Navigating the Complexities of Data Standardization, Observability, and Business Alignment
Episode Date: October 9, 2024

Highlights from this week's conversation include:
- Reporting and Analytics Discussion (1:09)
- Automation in Reporting (3:16)
- AI's Impact on Analytics (5:00)
- Data Quality Challenges (6:56)
- Reinventing Reporting (9:23)
- Automated Reporting Services (14:35)
- Growth Trajectory of Reporting Tools (16:01)
- Market Size Comparison (18:04)
- Static vs. Time Series Data (21:27)
- Differentiating Reporting and Analytics (26:26)
- Ad Hoc Analysis vs. Reporting (29:52)
- The Role of Data Scientists (34:03)
- Planning the Reset (38:36)
- Focus on Business Problems (41:30)
- Identifying Business Needs (44:01)
- Heuristics and Intuition (48:03)
- Importance of Monitoring (52:20)
- Observability in Analytics (55:53)
- Observability in Metrics (58:08)
- Consistency in Definitions (1:02:39)
- Impact of Changing Definitions and Final Takeaways (1:04:15)

The Data Stack Show is a weekly podcast powered by RudderStack, the CDP for developers. Each week we'll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data.

RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack, visit rudderstack.com.
Transcript
Hi, I'm Eric Dodds.
And I'm John Wessel.
Welcome to the Data Stack Show.
The Data Stack Show is a podcast where we talk about the technical, business, and human
challenges involved in data work.
Join our casual conversations with innovators and data professionals to learn about new
data technologies and how data teams are run at top companies. Welcome back to the Data Stack Show. We have a little
bit of a special episode for you today. We're recording this straight to the computer without
internet because we had severe weather in the Southeast and it's caused a pretty big mess for
a lot of people in the Southeast, including us here in
Greenville. Our hearts go out to everyone who's been affected by this. There are lots of people
without power, and there are lots of people, especially in Western North Carolina, who are
in danger. So definitely send your care and help out in any way that you can. So today, you get
John and I alone, and I got to pick the subject, which is reporting
and analytics. I love talking about this, and I can't wait to hear all your thoughts because
you've done this for a long time. But when I say reporting and analytics, John, what topic do you
want to most hit? Yeah, I'm excited about this topic. But first, our thoughts and prayers are with those affected, especially in the Western North Carolina area, which is just 45 minutes from us. So if you're listening in that area, our thoughts and prayers are with you for sure. So the reporting and analytics topic, it's a little bit out of vogue, I would say. If you look back 10 years ago, reporting analyst was a fairly popular job title, or data analyst.
Now, like you're saying, like data engineer or data scientist or analytics engineer, of course.
Yeah, with dbt Coalesce around the corner here.
Analytics engineer.
So I think reporting in some ways has taken a back seat from a high-level view, but practically, almost every business has some kind of reporting going on.
Sure, of course.
And I think as far as the topic of reporting and analytics, that whole space, how you run your business, in my mind, is typically off of reports or some kind of reporting-adjacent thing. And then you get into what I might call studies, more like engineering studies or data science type work. That might be how you make your business better. A lot of times I think of that as more of an ad hoc type thing. We should get into definitions for all these things.
Okay. Yeah. We'll definitely get into that. I'm going to ask you to define all those. Okay. Sorry, I interrupted you. What else?
Yeah. And then I think just understanding how businesses are thinking internally about these topics, where
we have a different definition of roles for a lot of companies, like we're talking analytics
engineer here, but more so in the reporting and analytics. You've got these roles
like data scientists, and then you've got these practical jobs like reporting. And there's just
this mix up of things. There's just this soup of things. Yeah. I'm excited.
Definitely getting into those. Okay. The things that I'm interested to pick your brain on,
one, how possible is it to automate some of this stuff across business models? I think
that's a really interesting topic. There are companies that have tried to do it. So I think
that would be fun. And then if we have time, I want to ask you about the culture around reporting
and analytics, because so much of this is not a technical, you know, the questions are not
technical. It's actually like, you know, who is using the reports? How are they using them?
How are decisions made, right? Communication between the teams, between the data team,
the business teams. Yeah. Okay. We'll try to get to all of it. Let's dig in.
All right. Sounds good. John, the experience I had last week that brought this reporting and
analytics discussion to light was we were talking
with one of our very early stage investors, and he was looking at a bunch of different analytics
from a bunch of portfolio companies. Reports, actually, which we should talk about reports
and analytics. These were reports, okay? And it's startup companies, you know, and so things can tend to be messy and that's fine.
But there wasn't really a lot of consistency
across the companies in terms of KPIs
that probably should have been, you know,
pretty standard across these different companies.
And he just made a comment, which is funny but makes a great point, that, you know, everyone's going crazy over AI and we still can't do basic analytics. Which is true. You know, just very few companies even do basic analytics really well.
And I think the ones that do advanced analytics are probably pretty rare. Why is that, do you think?
I think that's a good question. With the AI thing, part of the craziness behind AI is that it's like, oh, well, you can do this. Because reporting and analytics is hard, right? The promise of AI is the promise of easy, right? It's like, oh, you do that, but AI can make it easier or, you know, maybe better. So I think that's how AI ties into it. But as far as reporting and analytics, you're mentioning portfolio companies here. So I'm assuming this is like, they're all similar companies, maybe they're all SaaS companies, similar sizes.
There's a lot of commonality, right?
Yeah, a lot of commonality, yeah.
I mean, different stages,
but at least a baseline threshold
to where there should be some level
of B2B SaaS consistent KPI reporting.
Yeah. So I think it is industry dependent, and I agree that the majority of industries do not have great reporting. But there are some industries, like some financial industries, where there's, I think, a really high bar, and they're very organized and have standards.
Yep.
And I would say probably tending toward financial or more heavily regulated industries, which I don't have a lot of personal experience with. There are standards: if you're trading stocks, there's a lot of data and a lot of organization around that data, because it's really important, right?
Yep.
Whereas if you are maybe a software startup or you are a manufacturer, there are fewer standards around your customer data or your order data, as far as how you should model it, how you should think about it. And that's just, I think, the evolution of certain industries. It doesn't need to be standardized, therefore it's not.
Yep. Yeah, it's interesting. I think go-to-market models vary so much as well, even among B2B SaaS companies.
E-com, I do think, has some advantages as to a higher level of cleanliness of data, because if a lot of your data is clickstream-like data, in a sense a computer generated it, right? So it's better. Whereas if you're in an industry like manufacturing and all of your sales orders were entered into a system by a salesperson, you may have input data problems, data that's just wrong. It's just practical: I need to clean this data to get it right and do a bunch of feedback back into the system to fix things that are wrong. Whereas in e-commerce and SaaS, at least if it originated from either clickstream or something that got validated, maybe some kind of validated address with a Shopify checkout or Stripe checkout, at least you can start a little bit cleaner versus some of the ones where it's a really bad starting point.
Yeah. A lot of places. Yeah. That's super interesting.
Okay. I feel like this is supposed to be a conversation. I'm just asking you all these questions.
It happens.
So we talked about,
and also part of the reason behind these questions
is because you as a consultant,
you work on a lot of basic analytics stuff for your clients.
And there are lots of consultants out there
whose entire job is helping companies do basic analytics.
It's a huge industry.
Yeah, definitely.
Even attached to specific tools, right?
The ecosystem around support for just getting reporting up in Tableau.
I mean, I don't know how big that industry is, but I would guess that it's huge.
Yeah, I mean, let's think Salesforce, right? Like speaking of Tableau, there are like,
obviously lots of agencies that implement Salesforce, but I think there's a number
of them too, that it's like Salesforce reporting, right? Like they're actually
like building reports in Salesforce or like pick your other like ERP or CRM and it's like a lot of what they're doing is literally
just building reports in the specific software.
This is the question I was going to ask you. How much of that work do you think is reinventing, doing essentially the same thing but just slightly tweaked?
I think a lot of it. So years ago, I actually worked in a very interesting environment in
terms of reporting. So we had a pretty early interface to where customers could create their
own reports. This was 10 plus years ago. Customers could create their own reports in our system.
And I, at the time, was on the database side, the database administration side. And the things people came up with were wild, the amount of complexity. Because we gave them a very configurable GUI but basically didn't put limits on it, they could have infinitely long where clauses, select tons of columns, basically join different entities together. And they came up with some wildly complex things, which I'm then seeing on the back end, like, who wrote this query?
It's like, well, somebody made it in the user interface.
And it's just destroying our underlying
compute memory resources.
So I know some software
still lets you do some of that,
but there's a lot more guardrails on that.
Now that people have learned, we can't just give any abilities; we have to put some guardrails and restrictions around this with self-service stuff. And then of course Tableau and things like that introduced caching layers, so, all right, there's this caching layer in between the database, or you move your database to Snowflake or Databricks and you have a layer in between. But we actually had basically a reporting workload that anybody could create that would query production databases, and it was wild and bad.
Yeah.
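[Editor's note: a minimal Python sketch of the kind of guardrails John is describing, validating a GUI-generated report spec before it ever reaches a production database. The spec shape, limits, and function names are hypothetical illustrations, not the actual system from the story.]

```python
# Hypothetical guardrails for a self-service report builder.
MAX_COLUMNS = 20   # cap "select tons of columns"
MAX_JOINS = 3      # cap "join different entities together"
MAX_FILTERS = 10   # cap "infinitely long where clauses"
ROW_LIMIT = 10_000

def validate_report_spec(spec: dict) -> list[str]:
    """Return a list of violations; empty means the spec is safe to run."""
    problems = []
    if len(spec.get("columns", [])) > MAX_COLUMNS:
        problems.append(f"too many columns (max {MAX_COLUMNS})")
    if len(spec.get("joins", [])) > MAX_JOINS:
        problems.append(f"too many joined entities (max {MAX_JOINS})")
    if len(spec.get("filters", [])) > MAX_FILTERS:
        problems.append(f"where clause too complex (max {MAX_FILTERS} predicates)")
    return problems

def run_report(spec: dict) -> None:
    problems = validate_report_spec(spec)
    if problems:
        raise ValueError("report rejected: " + "; ".join(problems))
    spec["limit"] = min(spec.get("limit", ROW_LIMIT), ROW_LIMIT)  # always cap rows
    # ...hand the capped spec to a caching layer or warehouse from here,
    # never straight to the production OLTP database...
```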
Part of the question is thinking about the way that a lot of consultants, and even I in the past, approached helping with analytics.
And a lot of it was standardization.
Yeah.
Right?
Where it can be really helpful.
You go into a company or you go to work at a business
and you say, okay, there are formulas
for the way to measure e-commerce or B2B sales, et cetera.
Yeah, customer lifetime value.
Exactly.
And so a lot of times it's just implementing a framework
that has guardrails, limited scope, discipline, et cetera.
And so I've always thought it's interesting, right?
Could you essentially report on every e-commerce company
in the same way?
I understand that's obviously,
there are so many implications
and even wildly different business models within e-commerce.
But is the reporting that different from e-commerce company to e-commerce company?
Right.
It's not in general, but it is specifically. Like, there's tools out there. I think Daasity is one that tries to tackle this. There's others for that, for e-commerce specifically. But e-commerce how? There's so many different flavors of e-commerce. There's e-commerce like, I own a coffee shop and we sell merch and coffee online, but we have multiple physical locations and we have wholesale channels and we have partnerships with these consolidator programs. It's like, okay. The long tail.
Yeah. The long tail. Yeah.
And so I think if you're like, hey, I'm truly a Shopify store and we sell some things on Amazon too, and that's all we do...
Yep.
There's stuff out there for that, for sure. But as you add in multiple channels, and especially specialty channels, there's no quick-button integrations into industry-specific stuff at this point.
Yep. For sure. And I think, back to the standards thing,
I don't know, people just like to make their own. Like, I don't even think that's probably best,
but if you like give somebody a framework, like here's customer lifetime value, people like,
yeah, we're going to tweak that for this reason. And a lot of the reasons aren't super valid,
but people do it anyways. I don't know. I don't know how to explain it.
Yeah. I think that's probably more what I was thinking of were the tweaks to things like
customer lifetime value that aren't really necessary. But the long tail of differences in the underlying components of the business,
even if on the surface, let's say the funnel looks the same,
but even where the data comes from,
the different relationships you have with other companies, suppliers,
different types of customers, et cetera, are really hard.
And that kind of makes me wonder if the idea of automated analytics is possible.
There is a company, I think the company is still around.
I think they're called June.
Oh, okay.
And I think you basically hook up a RudderStack or a Segment or, you know, I'm sure there are other sources as well, right? But essentially some sort of standardized schema, you know, event schema. And they will
just generate all of these reports for you automatically, right? So, yeah, retention report,
like all these other things as well from your own custom events and all that stuff that you've instrumented.
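[Editor's note: to make this concrete, here is a minimal pandas sketch of one such report, weekly retention, computed from a standardized event stream. The file and column names (user_id, timestamp) are assumptions for illustration, not June's, RudderStack's, or Segment's actual schema.]

```python
import pandas as pd

# Assumed standardized event stream: one row per event with a user id and a
# timestamp (roughly what a RudderStack/Segment-style feed provides).
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# Cohort users by the week they were first seen, then count how many show
# up again in each later week -- a basic retention report.
events["week"] = events["timestamp"].dt.to_period("W")
cohort = events.groupby("user_id")["week"].min().rename("cohort")
events = events.join(cohort, on="user_id")
events["weeks_since"] = (events["week"] - events["cohort"]).apply(lambda d: d.n)

retention = (
    events.groupby(["cohort", "weeks_since"])["user_id"]
    .nunique()
    .unstack(fill_value=0)
)
print(retention)  # rows: signup cohort; columns: weeks since first event
```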
Do you think it's possible for a service like that to be successful
because of all the stuff that you just mentioned,
like those meaningful differences?
I think for a while. I think most of the things like that, like companies, have the right season for them. I even did a tweaked version of this myself previously. When we first started doing analytics at a previous company, we started with kind of an all-in-one: hook up your Google and Shopify and various sources, ingest everything, and I think they maybe even attempted to model some of the data for you, and then bring your own visualization tool.
Yep.
So that's a version of that. And then there's another version that's even more, which is full GUI. There's one called Glue, I think is what it was called.
Yeah.
Where you log in, hook up all your stuff, and literally come back a couple of days later and all the stuff's populated into some graphs for you. So yeah, all that's possible, and it ends up being industry specific; that's where you get the value out of it, I think. A generic one, I don't see that working, like just generically, hey, this will work for all use cases. But if you can really get granular, and you get somebody that really gets a certain domain and what's valuable and exactly how to calculate the common metrics, and then really how to present them in a compelling way to get people to actually act on them, which is the hard part.
You bring up the elephant in the room when it comes to reporting and analytics.
Then, yeah, I think there's value in that.
But I think, assuming you're on a growth trajectory, I think eventually, I don't want to say you grow out of it, but you grow to, oh, we want to tweak that, and we want to tweak that. And then from a product standpoint, the right answer for the company would be to say no, right? For them to have a good product, they need to say no to all the customization.
So to me, I guess an ideal thing would almost be to have the standards, have the models that could be written back into Snowflake or Databricks, wherever you store your data, and then also allow you to leverage those models other places as you continue to grow.
Yep. That might be a model that works, which is called dbt.
Yeah, you have all the baseline models in dbt, and a lot of the providers are kind of going this way, where the ETL providers say, hey, we'll extract the data for you. Oh, and by the way, we have all these baseline dbt models for you. So they're like, okay, cool. Everything's modeled.
Yep. And then when you get there, you kind of have a level of opinionation around, hey, this HubSpot or Salesforce data is modeled. And a lot of times the ETL provider will create the model in dbt for you. So then you have an idea of, oh, this is how the model should work. Oh look, this is how they calculated customer lifetime value or whatever, because the calculation is in dbt. So I think that's probably the most practical way that this continues to evolve, because you have community-managed dbt standards for these common tools. And you maybe end up with different variations: HubSpot or Salesforce as an e-com company, HubSpot or Salesforce as a SaaS company.
Yep. That seems practical.
Yeah, and then you take that and you kind of fork it, and you maybe make some modifications from it, but you try to keep that base intact as much as you can.
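[Editor's note: a toy illustration of the "keep the base intact, fork the calculation" idea. In practice these would live in dbt SQL models; this pandas sketch just contrasts a baseline customer lifetime value metric with a company-specific fork. All table and column names are hypothetical.]

```python
import pandas as pd

# Hypothetical orders table; real versions of this live in dbt SQL models.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

def baseline_clv(orders: pd.DataFrame) -> pd.DataFrame:
    """Community-style baseline: lifetime revenue per customer."""
    return (
        orders.groupby("customer_id")["order_total"].sum()
        .rename("clv")
        .reset_index()
    )

def forked_clv(orders: pd.DataFrame) -> pd.DataFrame:
    """A company-specific fork of the same base: exclude the wholesale
    channel and net out refunds -- the kind of tweak teams make, validly
    or not, while trying to keep the baseline intact."""
    retail = orders[orders["channel"] != "wholesale"]
    net = retail["order_total"] - retail["refund_total"].fillna(0)
    return (
        net.groupby(retail["customer_id"]).sum()
        .rename("clv")
        .reset_index()
    )
```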
Well, what's interesting to think about, and a lot of my questions around standardization
were very leading questions because
it's just an interesting concept, right?
Like, can you actually automate that stuff?
But if you just look at,
it'd be interesting to pull the numbers on this,
but even just thinking about the size of the market for BI tools versus the packaged analytics tools, right? So even in the product analytics space,
which there are a number of product analytics tools, right? I mean, Amplitude went public.
Sure.
But still, even if you compare the size of the SaaS-based,
you point some SDK at this tool
and it can generate a lot of reports for you,
the size of that industry is so much smaller,
even on face value, than just BI and customized reporting.
It is weird, right? Because, I get it in a sense: the reason BI is so much bigger is because there's more to show, and practically, more people are going to interact with the BI tools. An executive may log in and look at a dashboard or download some data; somebody from marketing, somebody from sales, somebody from ops are all interacting with BI.
Yeah.
And when you get more niche and you have just one team interacting with it, then the market's smaller.
Yeah, yeah.
That's kind of my definition.
Totally.
Yeah.
And there, I mean, a lot of the product analytics stuff is all event-based, right? But when you think about a lot of core reporting, it's combining, it's aggregates, right? That include both structured and...
Yes, and time series data, right. And speaking of time series, there are a ton of teams that don't really deal with time series data.
Yeah.
If you were to do a chart of how many people just use non-time-series data, rows-and-columns table data, versus time series, I have no idea what the numbers would be, but I wouldn't be surprised if it's 80-20, or even bigger than that, as far as only 20% even dealing with time series data at any significant scale.
Right. Yeah. It is interesting to think about that. How many reports leverage a lot of time series data?
I would have the same estimate as you, where it's probably not as much as you think.
I mean, in the world of SaaS, we just tend to think that way because we're tracking user
behavior or in e-commerce, you're tracking clickstream, et cetera.
But even then, the other interesting part of that is, let's say, you know, a company has
some core reporting that doesn't include time series data. A lot of times those companies are
looking at time series data, but it's going to be in Google Analytics, for example, right? Where
they look at that completely separately from their core reporting, and they don't actually
include that in, you know, sort of core reporting that would happen in some sort of BI tool or go
into some dashboard, right? I think a lot of times because it's going from static structured data and
building reports off of that to dealing with time series data and actually getting that data,
you know, into a data store, running aggregations over it. That is a non-trivial step, right?
A non-trivial step, it is. And what I'm thinking, reporting, I mean, accounting, right, is a big component of this. And especially in accounting, it's snapshots. You do not change accounting data after it has been finalized.
Yes. Well, actually, sometimes you have to, and then it's generally not good.
I was at the Y the other morning, speaking of that, and this older gentleman had, what I believe, I almost went up and asked him, but he had an Enron shirt.
Oh, no way.
And it was, you know, aged enough that he probably...
Oh, totally.
I was like, this guy worked at Enron, you know, and he's wearing his old Enron shirt to go work out at the Y. Anyways, that's what happens. I was like, how about that shirt's worth some money?
I bet it is.
But yes, you can change accounting after the fact, but it doesn't end well.
Right. Yeah. And I think most people in data analytics that don't have any event stream experience, they think in static. And it actually drastically simplifies a lot. We're like, hey, we have streaming data. It's like, let's snapshot it, I don't know, daily, monthly, weekly. It's just easier, right? And sometimes that is the right answer. Sometimes you don't actually need the clickstream data. You don't need the history. You're just like, ah, let's just snapshot it weekly or whatever. And same thing with history, right? There's a lot of static data that changes a little over time, that would be great to have the history of, but people are used to just not having it.
Yes. Yeah.
And that's the other component, where it's not clickstream, but it's that same series of time,
that same like series of time,
but in a different way,
because you're looking at something in a system.
If I were to pull up the screen on Monday versus Tuesday,
the data in the system is different.
And the system won't tell you what it used to be.
People are just used to, oh, it used to be that, I don't know, and you just shrug your shoulders. But a lot of the analytics tooling now has this: Snowflake calls it time travel, Databricks has it, and some of the new standards have this same type of thing, where it's like, oh, I can just look back and see what it was Tuesday and query it. So that's that other time-series-esque thing, but it's with the static data, which is kind of cool.
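[Editor's note: a minimal sketch of the snapshot pattern being described: append a dated copy of mutable data so Monday's view survives Tuesday's edits. File and column names are hypothetical; warehouse features like Snowflake's time travel solve this natively.]

```python
import datetime
import os
import pandas as pd

HISTORY_PATH = "inventory_history.parquet"  # hypothetical history store

def snapshot(current: pd.DataFrame) -> None:
    """Append today's state of a mutable table so past states stay queryable."""
    today = current.copy()
    today["snapshot_date"] = datetime.date.today().isoformat()
    if os.path.exists(HISTORY_PATH):
        history = pd.concat([pd.read_parquet(HISTORY_PATH), today], ignore_index=True)
    else:
        history = today
    history.to_parquet(HISTORY_PATH, index=False)

# "What did this look like last Tuesday?" becomes a filter over the history,
# a poor man's version of Snowflake-style time travel:
#   history[history["snapshot_date"] == "2024-10-01"]
```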
I think the other thing with static data and time series data is that a lot of times, for core things that you want to report on, even for something like that, you may not be emitting or observing or logging an actual event payload. But there's a timestamp on the order.
There's a timestamp on the user account. When was that row added to the database, right? So you can
pull that in. And so you have these proxies for events with the timestamps that are on these different objects, right? And a lot of times to report
on how your business is doing, that's all you need. You don't have to have a specific event
payload. Now, when you do capture actual time series data, especially behavioral data,
you can do really cool things around trending. And I mean,
it's an input to all sorts of things, right? Machine learning models. I mean, you can do some
really cool stuff. But for core reporting, a lot of times you have the proxy with the timestamp.
Yeah, right. And a lot of systems were designed this way, where, I'll just call them milestones, or you can call it workflow, but milestones of, all right, so we're talking logistics: all right, picked up.
Yes.
Shipped, delivered, you know, billed and invoiced. The system is just built that way, where it's just static. There's literally a column that says shipped at, and the date. And for a lot of businesses, depending on your business model, compressing the time from order creation to invoicing, compressing that time is huge for cash flow, for example.
So a lot of analytics in supply chain, manufacturing, e-commerce, those industries, it's like, okay, cool. How can we compress this time?
Yes. Yeah.
So that's a super valuable thing for them. Especially if their system has certain built-in milestones and then you want to get more granular, this is where clickstream data could be really valuable, where we can actually have more goalposts. And when we drill in, we can be like, oh, there's this big gap between order created and shipped. What went on between here and there? There could be a million reasons why it didn't ship, but if you don't have those things tagged or organized in any way, then it's impossible to root-cause what happened, why it didn't ship on time.
Yep.
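[Editor's note: a small pandas sketch of the milestone analysis described here, measuring the gaps between workflow timestamps to see where orders stall. The milestone column names are hypothetical.]

```python
import pandas as pd

# Hypothetical milestone columns straight off an ERP/logistics system.
orders = pd.read_csv(
    "orders.csv",
    parse_dates=["created_at", "picked_at", "shipped_at", "delivered_at", "invoiced_at"],
)

# Days spent between consecutive milestones -- the gaps you would try to
# compress for cash flow (e.g., order creation to invoicing).
stages = ["created_at", "picked_at", "shipped_at", "delivered_at", "invoiced_at"]
for start, end in zip(stages, stages[1:]):
    orders[f"{start}->{end}"] = (orders[end] - orders[start]).dt.days

# Median days per stage shows where orders stall, e.g., created -> shipped.
gap_cols = [f"{s}->{e}" for s, e in zip(stages, stages[1:])]
print(orders[gap_cols].median())
```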
Okay, let's get into definitions. We probably should have started here, but you mentioned this a little bit when we were chatting about this topic.
Reporting is just a term that's used, I think, for a lot of different things, right? So
how robust is our reporting, right? And that could mean a number of different things.
So you have reporting, analytics, dashboards. Break down the terms for me. How have you used
those in the past?
Is there a meaningful difference
between reporting and analytics?
Yeah, and I don't know, I'm sure there are standards. So I will just speak to what I've heard practically.
Sure.
As far as the differences, there probably are people who feel very strongly; there's a Medium post about the technical definition.
Which is probably helpful.
Yeah, with references to books solely written on reporting or analytics. So aside from that, these are just the practical definitions here. So let's talk reports and analytics, throw in dashboards, and I'm going to throw in stuff that I would call a study or a project, okay, you know, a journey. So on the reporting side,
i would say what i think of as a report is something that is generated statically.
You could have event stream data in a report, but I don't typically think of it that way.
Something you're typically running with a date time range, right?
And then some other inputs.
Like, I want to run it for this date. I want to run it for this person, this client, this group of people, this department, whatever. And then typically you're getting some kind of data set, and then maybe some charts and graphs as part of a dashboard.
Yep.
That's what I'm typically thinking of there. And I would think of this as overlapping circles between reporting and analytics, in that, sure, you could classify a lot of that as analytics too, I think would be fair.
For the analytics side, I'm going to focus in on what I'm going to call more of like a project or a study or something.
And I think this is super valuable. And there's these two camps: you grew up a data analyst working for a department like marketing or ops, versus you grew up in IT and have done more centralized business intelligence, let's say. I think there's two different perspectives here. If you grew up as a data analyst working for ops or something, I think that person typically does more of what I'm going to call analytic studies, where they have a very specific objective. Like, here's one that came up recently: we have this business process and it's taking forever. We need to analyze why this thing is taking so long; we need to understand what's going on here. So then the analyst goes in, pulls some really messy data, gets it in Excel, and sees, oh, these people are tracking what they did every day, and there's a tab per day, and it went back two years. Oh, okay. All right.
Yeah, well, we've both opened those spreadsheets, and I think our listeners have too.
And there's that feeling in your gut, you know, when you see that. It's just a very specific feeling.
Yeah, it is. Very specific feeling.
So then it's like, okay, we're going to find some way, we're going to write a script, do something to get this all tabulated onto one page, and then do some ad hoc analysis. So we see, and this is a real story from recently, we see that Wednesday was spelled correctly in some places and incorrectly in others. So then you try to combine it all together, and you've got two Wednesday columns and then a bunch of other stuff, because anytime you have 300 sheets in Excel, something got tweaked along the way. For example, Wednesday may be spelled wrong in the whole rest of them, because one of those got copied to the next day and something happened. It's super common.
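[Editor's note: a minimal sketch of the cleanup script described in this story: stack one-tab-per-day Excel sheets and normalize misspelled weekday labels. The file name and 'day' column are assumptions for illustration.]

```python
import pandas as pd
from difflib import get_close_matches

DAYS = ["monday", "tuesday", "wednesday", "thursday", "friday", "saturday", "sunday"]

def normalize_day(value: str) -> str:
    """Map 'Wensday', 'Wednessday', etc. onto the nearest real weekday."""
    match = get_close_matches(str(value).strip().lower(), DAYS, n=1, cutoff=0.6)
    return match[0] if match else "unknown"

# One tab per day going back two years: read every sheet and stack them.
sheets = pd.read_excel("tracking.xlsx", sheet_name=None)  # dict of DataFrames
frames = []
for name, frame in sheets.items():
    frame["sheet"] = name                           # remember which tab it came from
    frame["day"] = frame["day"].map(normalize_day)  # assumed 'day' column
    frames.append(frame)

combined = pd.concat(frames, ignore_index=True)  # one page, ready for ad hoc work
```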
So this is down the road on this kind of project where... and if you have that IT background, you start like, oh, this has to be repeatable. I've got to write this script so it can handle every use case. If they spell Wednesday five different ways wrong, I'm going to handle those. And if you're more of a data analyst, you're just like, I just need to get to the end result. I'm not thinking about repeatable; I'm just thinking, I've got to get to that answer fast, right? Like, where's the 80-20 on this project where we can make a difference?
Yep.
So I think IT people on this thing get caught up in all the edge cases that just don't make sense. And then the analysts...
Yeah, you want to build a scalable...
Yeah, you want to be good and scalable. Pay the debt that's been created and create a scalable system.
Yeah.
But this study, this analytic study, what I'm going to call it, is a nice place to be, where you kind of thread the needle: move quickly, do some of it programmatically, but then pause. Okay, can I drastically decrease my time to result by doing some manual steps? If the answer is yes, take that path, get to results, try to make a business decision, and then bring it back. If this now becomes a regular thing, bring it back and push it across the next 20% to make it fully automated, and not be scared of that. Because if you're from an IT background, you don't ever like producing anything that's not fully automated, because if you do that enough times, then you're going to drown. You will never recover.
Sure, yeah. With how much work you have to do.
Right, exactly.
It's preventing future pain.
Right.
Right.
Whereas an analyst,
like when you work for the business,
like, I don't know,
let's just hire another analyst.
Like that's kind of the businesses,
a lot of businesses like mentality.
Yeah, yeah, yeah.
So, I don't know.
That's like my one thing where like,
I wouldn't call that reporting, right?
Like that kind of like study ad hoc type analysis. Yeah. Yeah, yeah. Yeah, I think't call that reporting, right? Like that kind of like study ad hoc type analysis.
Yeah.
Yeah, I think that's a good framework.
To summarize that, I really like the distinction of reporting
being automated reports that are generated
for some specific business function,
or some function within the organization.
Not that those things don't change or people don't make requests,
but it's sort of the core data sets that people are using on a day-to-day,
week-to-week, month-to-month basis in order to make decisions, right? Right. But companies generally collect, or have, a much wider set of data than what's in those reports, right?
Which would be sort of the universe of analytics.
And then exploring, like, the relationship between those and those other things.
The study, that's a good term.
It sounds fancy.
Like ad hoc analytics or ad hoc reporting, you know.
Right.
But I think study is nice, because here's a little bit of a philosophical tangent.
Yeah.
So you've got analytics, we've been talking about reporting and analytics, and you've got data science, right? So the data scientist, we're real big on the data part, but I think we miss on the scientist part. What do scientists do? They study things, right?
Sure.
And I think a lot of...
They try.
And they try. They experiment over and over again.
Right.
And most of it is, you know, not most of it, but I mean, they experiment with a bunch of stuff. And you throw a lot of it away.
Throw so much of it away.
Yeah.
And I actually think that's right for what a data scientist should be.
But when you apply that in a business, that's tough. And this is a different topic, but it's a fun one. I think that's a tough sell, because think about what I just described on that ad hoc thing. With an analyst or business intelligence, IT side, business side,
we can come to a result. We can come to a practical, like, okay, here's the 80-20 on this. Let's work on improving this. Good job, guys. IT, hey, can you automate this? Take this to the next 20%. We want to see it every week. Great, easy. The data science version of that, still a study: we want to make a predictive model or something, or do pricing. You added a lot more complexity.
You need even oftentimes higher coordination with the business and alignment with the business.
And then it really is data science because a lot of this stuff is, at least for that
particular business, not something they've done before or have a framework for.
And then you do end up with a lot of throwaway work. And some of it is, okay, there's bad coordination, bad communication, bad requirements, but I think some of it's necessary. And I would like to see people be willing to accept, as part of the process, intentionally iterating quickly through things, iterating all the way through: get to a point and then scrap some of it, maybe occasionally scrap all of it, and keep going. Versus what I often see, which is we just keep going, and instead of scrapping anything we just kind of pile on top and make it worse, until we get to this point where we didn't want to scrap anything, which is even worse than it would have been.
Yeah. That's really a good way to frame so much of the exploratory work that needs to be done. Because again, you have your core reports, right? How much money did we make?
How much inventory do we have? You know, all the core things, right? But so much beyond
that is doing exploration to determine whether you should actually report on something, right?
Right.
Or to uncover some specific insight that you need in order to make a decision. But it is,
it's hard to throw stuff away, though. I mean, have you seen that? Where, you know, you spend a lot of time doing something and, let's say, you get an answer, but you spent a huge amount of time getting it. Like, I don't know. Throwing stuff away can be hard. Is that something you've seen?
It's super hard. I actually see this especially with new tech, new technology that people are implementing, and I had started doing this subconsciously until I actively thought about it. So let's say, you know, five years ago, we set up Snowflake at my previous company.
Yep.
Previously we used Redshift, and we would have used the same methodology. Subconsciously, we would go through and set it up, and we weren't super familiar with the tech, and basically we'd reset it up two or three times, not necessarily intentionally at first. But what I often see happen is people don't want to have to redo anything. Same thing for dbt, that would be a great example. If you've never used it before, and somebody sets it up, and you just keep evolving from that initial setup and you don't hit an early restart button, which is not that bad if you do it a couple of months in, it gets way more out of control and messy than if you'd said, okay, we're three months in, I actually think we want to restructure a lot of this, and okay, we can rebuild this in a couple of weeks. So that's what I started doing when we're implementing new tech: we'd have a start to it and then redo it two or sometimes three times.
Wow. But you just planned that in.
Planned that in, yeah. It was always part of the plan. So we'd start. And if you have three environments, it's perfect. It's like, all right, we're going to build out dev, and we're just going to keep building dev, and we'll use dev as production. Okay, we're in a spot. All right, we're going to build out QA.
All right. So now you can kind of do that.
Yeah, yeah. So typically for us, it wasn't a big company, it was just dev and prod. But if you do your environments that way... the temptation is, oh, we've got to do environment setup, let's set up all three of them, we're ready, let's go. And you don't have to do that. You can just set up the one, iterate on it, and, okay, we're ready for this one. Or maybe you just have dev, and then three months later, okay, let's do prod, let's move stuff over, now we have our environments. So with things like that, to your point, it feels less bad. You're like, no, of course we need two different environments. We didn't waste anything. But practically, you are kind of starting over.
Yeah.
Yeah, that's super helpful.
Because planning the reset in is a really helpful concept.
Because in so many cases, I would argue a majority of cases,
you just keep going down the same path, and then you reach a point where the cost of resetting actually does outweigh the benefit.
Too high, yeah.
Then you have long-term pain that you're just going to live with, right? Because you can't go back and pay that debt, you know?
Right.
Okay. I have a question
that is not related to the technical aspects of this, but you mentioned earlier getting people
to care, you know? Right. I mean, this is the classic lament about dashboard graveyards,
right? I mean, all this work is done by analysts
to set up these dashboards and this beautiful reporting,
and so much of it goes unused, right?
How do you begin to think about changing
the culture of a company, right?
I mean, and I won't say data-driven. We've talked on the show in the past about whether you can be data-driven.
I think we had the Cynical Data Guy comment on that.
But it is a good point in that,
and I'll approach it from a very specific standpoint.
So when you have discussions within a company
or you need to make decisions,
if there is data that can help that process,
that's really powerful, right? Because so many times things move slowly because there are disagreements about what decision you should
make, right? And if those decisions are sort of, you know, surrounded by a huge amount of subjectivity,
like having some definitive data that everyone can look at and say, oh, okay, like this changes
the way that we think about this, right?
Or this changes what I believe to be the true state.
It can be incredibly helpful for that, right?
It's not a, you know, it's not a solve for every situation.
Like, right.
But we've both been in meetings where it's like, oh, you know what?
Now that we're really looking at this report or looking at this data set,
maybe we do need to reprioritize this project or whatever it is.
How do you get to the point?
Because you've been a leader.
You've built this stuff.
You've worked as the analyst.
You've had multiple leadership roles. How do you think about that?
Yeah, I think one thing, especially if you're a data practitioner: just realize it is not about the data. It's not about the data. The data and analytics need to be such a background thing, in the sense that if people are talking about the data and analytics, you're doing it wrong. People need to be talking about, hey, we need to improve cash flow. Great. Getting the right data, getting everything in the right place for that can really help, because we can get visibility into where we've got cash flow problems and we can improve it. But that conversation should be about cash flow, not about reports and data analytics broadly. And I think if you can get people really focused on,
hey, if we solve this business problem, or we do a cost analysis and realize where we can improve on costs, or we do some kind of revenue analysis, or we're doing A/B testing on our website: all those business metrics, that's the focus, and then the data is there to support it. And I would say the biggest cultural thing is if you can have leaders in those places, and when I say places, I mean sales, marketing, wherever, that can, A, be really good at identifying what the problem is. And then they can help quantify it: hey, we want to decrease cost or improve revenue or improve cash flow, and here's what I think some drivers for that could be. And then the data team can help measure it. That's great. If you're really excellent on the data team and you have some domain knowledge, say maybe you're a marketer on the data team and you can be all in on that marketing domain, then you can be part of that driver conversation, right? Of, what do I think the drivers are for this? That's great too. But you have to have the right business conversation. If you don't have people in the functional business roles defining what the problem is or what we want to improve, it's really tough as a data team to make a difference.
Yeah, that's really interesting. Have you in the past had to, we've talked about this on the show before, actually go, as someone on the data team, into the business and figure those things out, as opposed to waiting for the business to tell you what those things are?
The majority of the time. And I have been in situations where I worked with phenomenal business leaders that had great, you know, marketing or sales or whatever. They knew what they wanted. They knew exactly how they wanted the team to perform, how they wanted to measure their business. And man, those are awesome relationships. When you don't have that, and you have enough experience, you start overlaying: oh, okay, I worked with this phenomenal sales leader, I know what good looks like in sales, and then you can really kind of help coach maybe a newer sales leader or somebody in sales, you know, for sales analytics. Same for some of these other functional things. And that's where I think, practically, just rock star data people are able to have kind of a playbook: you've worked with enough really good functional leaders to roughly know what good looks like. And then you can help introduce, oh, what if you measure it this way or that way? Because people don't typically say no if you produce something good, you know what I mean? It's very rare, if a leader isn't quite sure what they want, maybe they're new or maybe they just haven't really done things in a data-driven way, that you give it to them, like, hey, here's this, and it's good, and it's, oh, that's not what I wanted. I just haven't run across that a lot.
Yeah, that's interesting. And sad, actually.
It kind of makes you concerned about, you know, the business leaders. Is that just because a lot of them are sort of tracking their own stuff in their own way, and it's sort of siloed from the rest of the business?
I do think they're tracking their own stuff in their own way. I wouldn't say it's often siloed. I'd say it's often they're tracking it on their own, non-programmatically.
Through an amalgamation of their impressions with customers.
Or they talk to their team and they feel like things are going well.
Everybody's making decisions on something, right?
Yeah.
But if they are doing it in a spreadsheet or something, they can usually communicate pretty clearly what they want. If it's just, hey, I talked to the team, and, I like to pick on sales, sales is great, but, oh, it sounds like I talked to these people on the team, and he's got a deal that's about to close, and that'll be fine, and then we've got a couple deals here, I think we're going to make it this month. Yeah, you can run a sales team that way.
Yeah, you can. And that can work for a while.
But then say you brought in a new salesperson, and maybe they're not the right fit for the role, and you're just kind of managing based on, well, you usually have pretty good salespeople. Then it becomes a problem. And then you end up with a couple of those, or you end up with some kind of weird alignment issues of industry and market, like all of a sudden, oh, we're kind of in a down market now. Those changes are what really throw people off.
Right. Yeah. Is that, I mean, one way to describe that is tribal knowledge, right?
Right.
I think that is an interesting way to describe what you're talking about, because it's not like those, it's not like the people who don't have a clear ask for the data team aren't good at their job necessarily, right?
No, not necessarily.
Because if they're, you know, if they're delivering great, well, how do they do that, right? And there may just be a lot of tribal knowledge that they have about their function, about the team, about, you know, they have a long history of facing these types of situations.
And so their intuition may just be really good.
Like, oh, yeah, I've just looked at these numbers and I pull, you know, it's like, okay, they log into six different tools. They look at the numbers and they just have an extremely good
sense of, because they've been through so many cycles and seen the numbers change and they get
the behavior. Right. And so nine times out of 10, they're probably like, yeah, I mean, I can,
I have a really good sense of that. Yeah. And I think they develop heuristics too of like, oh, like, yeah, like they couldn't even
articulate the exact thing, but they have an idea of like, okay, if I talk to this many
people, or my team talks to this many, we'll close like X number of deals and get to our goal.
And they may not even know that number, but they kind of heuristically understand it.
Yeah, yeah, totally.
And I think the other component here, too, is if you're in an industry that's just really stable and predictable, heuristics and tribal knowledge can work great for a really long time.
Yes.
And then something like COVID hits, and it's like, oh, we don't know what works and what doesn't work anymore, or everything is different, or some kind of technological disruption. For us, even prior to COVID, we got hit with a bunch of things. We got hit with some increased tariffs, because we were doing importing, and that threw off a bunch of supply chain, like, do we need to re-source stuff? We got hit with COVID, we got hit with some other challenges. So I think that's the challenge, right? If you ended up, say a lot of people maybe the last 20 or even 30 years ended up in an industry that's stable and fairly predictable as far as the sales cycle, you don't really need the analytics per se. Until maybe you do, when everything's different, right? To dig in, because this isn't working and we need to figure out something else.
What's the balance there? And I'll give you an example. I used to do a ton of analytics and reporting studies around the marketing funnel. And one interesting thing that I noticed over time was,
you know, you can start to recognize patterns, right? And even if you don't have models that
describe these patterns specifically, it's the ability to connect different things. So I'll just,
one example is, we did a lot of stuff with SEO and sort of like building a couple of directories
on the website up from an SEO standpoint.
And we had some reporting that sort of connected traffic
down through to revenue eventually, right?
But that's a pretty long timeline.
And so you're looking for signal before revenue actually occurs, right?
Because if you generate traffic and then it's months and months, right?
But you need to be able to get a sense of where the business is going to land on a time
period that's a lot shorter than the total time.
And so you look at it enough and you start to notice, okay, if these certain directories behave in this certain way... I've looked at enough of these things all the way through the funnel, just manually, to get a really good sense of that. It wasn't worth building really robust reporting or even trying to model it, right? At that point.
Right.
But that was incredibly useful knowledge. And it was, there were only like maybe two people who
could speak to that with a really high level of confidence because they had just looked at it.
Right.
Repeatedly in detail and manually followed those threads over and over again for a period of time,
right?
Right.
So back to my question, how do you balance that? Because that's extremely valuable. In fact,
I would say that's like, okay, you have someone really good if they can do that pattern matching
and you can trust their intuition around that. It's like, wow, that's someone who's super talented.
In the near term, a lot of times it's not worth it to go
through the effort of actually building some sort of model around that. But it's highly risky
because that knowledge lives with someone who might leave, or COVID hits and changes things or whatever,
right? So how do you balance that? Like, how do you think about when do you start to automate some of those things that are more tribal knowledge or heuristics
that people have developed because they're just really good at their job? And part of their job
is looking at things in way more detail, way more consistently than other people.
Yeah, I mean, it's definitely difficult. Well, I think one of the things that I've found valuable, and looking back on it, I did some of it and wish I'd done more of it, is those basic things that we know. All right, e-commerce company: daily sales. Or SaaS: daily conversions, you know, daily signups or something.
Yep.
There's a lot of boring things where you look at a graph and it's pretty much flat. I wish I had gone and done more of that boring stuff and put monitors on them, right? Because it works most of the time, but things break. So at least in the SaaS and e-commerce world, I think in a lot of data analytics, all the focus is on, let's mine this out, let's figure out personalization, which customers should we contact. I think there's a lot of space in just monitoring some really boring stuff: let's just monitor and make sure we get our average daily signups every single day, and we didn't break the signup button. There's practical things like that, or checkout with e-commerce, right?
It's such a good point.
Right. And depending, that kind of crosses some boundaries sometimes between product analytics and maybe systems monitoring stuff, so there's some fuzzy boundaries. But in my mind, it's all data. So I think that would be one thing as far as a focus: focus on monitoring the boring, critical stuff and making sure it works every time, in some way. I think that's valuable. And then, secondarily, you're getting into the drivers behind the boring stuff. Like daily signups: all right, great, we need to hover in this range, some kind of alert goes off if you're out of range. And then inside that, okay, where are the drivers? Like you're saying, SEO traffic, paid traffic, et cetera. And then that's the next level down: all right, let's monitor those, roughly in the range they need to be.
Yeah, I think that's a great point, because if you think about the people who have that really good intuition, who have those heuristics, almost always those are the people who say, I think something's broken, and they catch it.
Yeah. Because they're essentially acting as the monitor, right? They're acting as the monitor.
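[Editor's note: a minimal sketch of the kind of boring-metric monitor being advocated: alert when today's value for a daily metric falls outside its recent range. The threshold logic here is one simple choice among many, not a prescription.]

```python
import pandas as pd

def check_daily_metric(series: pd.Series, window: int = 28, k: float = 3.0) -> bool:
    """Alert if today's value falls outside a rolling mean +/- k*std band.

    `series` is a date-indexed daily metric (signups, checkouts, orders).
    Returns True when today looks broken -- e.g., a release that quietly
    killed the signup button.
    """
    history = series.iloc[:-1].tail(window)  # recent history, excluding today
    today = series.iloc[-1]
    mean, std = history.mean(), history.std()
    out_of_range = abs(today - mean) > k * std
    if out_of_range:
        print(f"ALERT: {series.name} = {today}, expected ~{mean:.0f} +/- {k * std:.0f}")
    return out_of_range
```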
Well, and I've been part of some pretty bad situations, especially if it's a business with high turnover. You know, say the business got acquired and then they had big turnover, or they just had turnover for some other reason, and you end up with a lot of new leaders at once. Your heuristic method won't work.
Right. Yep.
And then things break. And it's especially bad if it's, like, e-commerce and you break checkout, or...
Yeah.
Or you don't bill customers, right? Because somebody was new in accounting. All those types of things. If you standardize and get that data in reporting, you can keep lots of consistency if you end up having turnover in key areas.
Yeah. Because at least you'll know, oh, cool, we got the signups that we normally get, or checkout is roughly where it should be. And beyond that, you'll figure out the common things that happen, right?
I love that framework. We talked about the different definitions, you know, reporting, analytics, dashboards, studies, but the concept of observability on some of that stuff, especially the boring, critical stuff, is a really helpful one. Looking back, it's like, man, it really would have been great to treat some of that as an observability problem.
I know. I did it a few times, was really happy every time I did it, and still didn't do it enough.
Yeah. Because the reason I like the concept of observability for those is that the context has a very different weight than "this is an automated chart on a dashboard," even though there's a huge amount of crossover there. When you say observability, generally you're implementing it on stuff where you need some sort of alerting system.
Yeah.
Like time to respond, or whatever.
Yeah.
And I just love the different weight that has relative to some of those core things that are boring.
But also, I mean, it is shocking how many times things can go on for weeks, and then someone notices, and then it becomes a really big problem. And then it's a fire drill, and it's like, man, no one caught this. And it happens at every company I've ever worked for.
And the other bizarre thing, the reason this is forefront in my mind, part of the reason I even have thoughts around this, is that I did a lot of DevOps work too. Observability for DevOps is just what you do. Right, it's crucial. So when we were doing AWS and Azure type work, we instrumented all of this stuff: server monitoring, disk space, CPU, all the basics. And you have alerts, and you have call schedules, like, this person's on call. It's very elaborate, the amount of data you can get. And then you go look at BI tools. Name a BI tool that has a true, first-party-supported alerting mechanism that's really robust. Most of them can kind of do it, but I don't know of any BI tool that has treated this as a first-party problem, like, "hey, this is something we're really good at."
Yeah, I've yet to see one. There's probably one that exists, but I've yet to see one. Some of the major packaged tools, the SaaS product analytics tools, do have some anomaly detection type things, which are helpful. But again, you can't necessarily capture all of the boring stuff that you need to capture.
But it's fascinating, right? So we used New Relic and Datadog and a lot of these DevOps tools for monitoring, and it's elaborate what you can do to customize exactly how you capture the metrics. It's all event stream, all stored, the whole history of it. And then it's elaborate how you can split things and alert certain teams, and have it call three people in order if they don't answer, using some of these services. Wouldn't you want that same level of rigor applied to, "hey, we expected a thousand signups today and we got two"? Wouldn't you want that same level of urgency?
Yeah. I've just never seen it in this space.
It is really interesting.
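As a rough illustration of what that DevOps-style urgency could look like for a data metric, here's a sketch of an escalation chain: try each person on a rotation in order until someone acknowledges. The webhook URLs and the idea of treating an HTTP 200 as an acknowledgment are purely hypothetical stand-ins, not any real paging product's API.

```python
import json
import urllib.request

# Hypothetical on-call rotation -- the "call three people in order" idea,
# pointed at a broken business metric instead of a full disk.
ON_CALL = [
    {"name": "analyst-on-call", "webhook": "https://example.com/hooks/analyst"},
    {"name": "data-eng-on-call", "webhook": "https://example.com/hooks/data-eng"},
    {"name": "team-lead", "webhook": "https://example.com/hooks/lead"},
]

def notify(person: dict, message: str) -> bool:
    """POST the alert to a person's (made-up) webhook; True means acknowledged."""
    payload = json.dumps({"text": message}).encode()
    req = urllib.request.Request(
        person["webhook"], data=payload,
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == 200  # a 200 counts as an ack in this sketch
    except OSError:
        return False  # unreachable or no response: move down the chain

def escalate(message: str) -> None:
    """Walk the on-call list in order until someone acknowledges the alert."""
    for person in ON_CALL:
        if notify(person, message):
            print(f"acknowledged by {person['name']}")
            return
    print("nobody acknowledged; escalate manually")

escalate("expected ~1000 signups today, got 2")
```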
One of the other things around observability that I love is that it heightens awareness of the impact of unforced errors. Someone I used to work with talked about this back when I was really deep into marketing, building the funnel and all that sort of stuff, and it's a really fascinating, helpful framework. He said, okay, part of being really good at this is understanding all of the inputs: how are you going to hit your number, exceed your number, and have a creative element to come up with ideas that can help you achieve that quarter-over-quarter growth? It's really hard, right? He's like, so those things are really key. The other thing is just avoiding unforced errors at all costs. He's like, actually, if you can avoid unforced errors, it's so helpful, because they can regress your progress so quickly if you make a really bad one.
Yeah.
And a lot of times it is the stuff that breaks, you know? We experienced that in a significant way: someone thought there was a problem on a part of the website that was a really key thing, and I was like, oh yeah, it kind of seems like a, you know, whatever, right? And then it later turned out to be a huge problem. And the recovery from that is so painful, right? It decreases the amount of compounding. So yeah, I love observability applied to that as well.
Yeah, and it really applies so well to pretty much every business unit. Sales, marketing, accounting for sure. Yep. And the other advantage of it is that most people can articulate it to you. If you're trying to nail down definitions of customer lifetime value, or even sometimes margin, you can get down some real rabbit holes, like, how do we capture landed cost? And then you're months down the road and nobody knows what landed cost is. But for observability, the question is, "hey, what would be really bad if this thing didn't work?" Accounting will know that for sure. Sales too. You can get to the end result quickly: hey, this needs to be right, or this needs to work. And then the data team can have some real meaningful work to do to build out the observability.
Yeah, I love it. All right.
I don't even know how long we've been recording, by the way.
I don't know. I think over an hour.
Wow. Yeah, so we should probably land it here. Any last parting words of wisdom on reporting and analytics?
I wish I had something. I mean, I guess if you're doing reporting and analytics, I would say, one, don't be afraid to do something more than once, especially if it's new. Yeah, that would be one. And then my second one would be, once you give yourself that, okay, we can do it more than once, I think people are willing to take more risks in a positive way, like, hey, let's try a new tool or do something a different way, because you know that when it gets to production, it doesn't have to be perfect the first time.
Yeah. Okay, I'm going to try and eke out a word of wisdom. Let me think here.
I'm ready for it.
So mine is, I haven't had a formal role on a data team, but I used to work with someone who was in an operations role. Back when I was doing a bunch of marketing stuff earlier in my career, a lot of times marketing reporting was kind of chaotic. The business is changing and trying to grow and all that sort of stuff, right? So you tend to go back and reevaluate definitions: is the way that I'm measuring this thing really reflective of the business, or whatever? And that can create too much complexity, but it can also be helpful. You're just trying to figure out ways to describe what's happening.
But I worked with this ops person who was just absolutely incredible, and they fought me every time I wanted to change something, because we had to agree on it. It was really, I don't want to say frustrating, but they really forced me to articulate why we needed to change a definition, and it had to be a really good reason. And one of their big things, which you don't really think about a ton in the moment, but they later told me, was that one of the reasons they dug in so hard was because even if your definition for something isn't exactly accurate, if you're consistent with it over a period of time, it's way more meaningful for understanding trends, because all of the variables are consistent over that period, right?
Because, they're like, it's way easier for us to measure the wrong definition consistently and then rerun the analysis on the historical data with the definition changed than to go back and piece together a story that changed multiple times, right? I just hadn't thought about that. So yeah, if you want to obfuscate your success, change the definitions all the time. If you want to obfuscate your failure, change the definitions all the time.
That's a way more succinct way to say it. But it's true. And I've never worked with anybody who did it intentionally, but there were definitely some times where we made so many changes that we'd look back, like, oh, look at this trend, and then be like, oh no, we changed that, remember? Oh yeah, that's right. Totally. And it just ruins it.
Well, and that kind of goes back to the tribal knowledge thing, right? Where do you centralize those changes? Different people make them, there's often a lack of centralized documentation about these definitions, and then there are all of the places in the entire supply chain of analytics where the surgery is done to make the changes. So anyway, that was a great lesson.
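One practical pattern that follows from that lesson, sketched below with made-up data: keep the raw events immutable and make every metric definition a named, versioned transformation, so the whole history can be recomputed under whichever single definition you settle on. The event shape and both definitions here are hypothetical.

```python
from datetime import date

# Hypothetical raw events -- the unchanging source of truth.
orders = [
    {"day": date(2024, 1, 1), "revenue": 120.0, "is_trial": False},
    {"day": date(2024, 1, 1), "revenue": 0.0,   "is_trial": True},
    {"day": date(2024, 1, 2), "revenue": 250.0, "is_trial": False},
]

# Every definition lives in one central place, with a version.
# Here, v1 counted trials as customers; v2 excludes them.
DEFINITIONS = {
    "active_customer_v1": lambda order: True,
    "active_customer_v2": lambda order: not order["is_trial"],
}

def daily_customers(events: list[dict], definition: str) -> dict:
    """Recompute the entire history under ONE definition, so the trend is
    comparable end to end, instead of stitching together eras of v1 and v2."""
    qualifies = DEFINITIONS[definition]
    counts: dict = {}
    for order in events:
        if qualifies(order):
            counts[order["day"]] = counts.get(order["day"], 0) + 1
    return counts

# Same raw data, two internally consistent views of the trend.
print(daily_customers(orders, "active_customer_v1"))  # counts trials
print(daily_customers(orders, "active_customer_v2"))  # excludes trials
```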
So I'm a fan of that lesson. Yeah, I like that one. All right, well, thanks for joining us. We hope you enjoyed the chat with just John and me. Reach out to us if there's a subject you want us to tackle.
Yeah, we'd love to hear from you. We always love hearing from our listeners. Tons of great shows this fall, and we'll catch you on the next one.
See ya.
The Data Stack Show is brought to you by RudderStack, the warehouse-native customer data platform. RudderStack is purpose-built to help data teams turn customer data into competitive advantage. Learn more at rudderstack.com.