The Data Stack Show - 265: From Anxiety to Advantage: Navigating Data’s AI Revolution with Barry McCardel of Hex

Episode Date: October 8, 2025

This week on The Data Stack Show, Barry McCardel, CEO and Co-Founder of Hex, joins Eric Dodds and John Wessel to discuss the transformative impact of AI on data teams and the broader data industry. The conversation explores how AI is reshaping workflows, team structures, and the very definition of data roles, while also addressing the anxieties and opportunities that come with rapid change. Barry shares Hex's journey from having a dedicated AI team to fully integrating AI across the product, and offers insights on industry consolidation, the infinite demand for data insights, and the importance of embracing change. Key takeaways include the need for data professionals to adapt and focus on problem-solving, the growing value of context and curation, the competitive advantage for organizations that leverage AI-driven tools and foster a culture of innovation, and so much more.

Highlights from this week's conversation include:

Welcoming Back Barry to The Show (1:07)
Discussing Change, Uncertainty, And Anxiety In Data Roles (3:13)
Exploring Excitement And Opportunity In AI Adoption (6:20)
Redefining Data Roles And The Infinite Demand For Insight (9:37)
The Impact Of AI On Data Workflows And Team Mindset (12:52)
Evolving Team Structures And The End Of The Magic Team (16:49)
Integrating AI Into Product Development At Hex (20:44)
Comparing Industry Approaches To AI Features (24:56)
How AI Changes Daily Workflows For Data Teams (28:52)
The Virtuous Cycle Of Context, Curation, And Self-Serve (32:39)
Managing Context And Evaluating AI Agent Performance (36:52)
The Expanding Role Of Data Professionals In The AI Era (40:45)
Industry Consolidation And The Modern Data Stack (44:32)
Lessons From Acquisitions And Platform Shifts (48:18)
Key Takeaways And Looking Ahead To Future Episodes (52:16)

The Data Stack Show is a weekly podcast powered by RudderStack, customer data infrastructure that enables you to deliver real-time customer event data everywhere it's needed to power smarter decisions and better customer experiences. Each week, we'll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data.

RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack visit rudderstack.com.

Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

Transcript
Starting point is 00:00:00 Hi, I'm Eric Dodds. And I'm John Wessel. Welcome to The Data Stack Show. The Data Stack Show is a podcast where we talk about the technical, business, and human challenges involved in data work. Join our casual conversations with innovators and data professionals to learn about new data technologies and how data teams are run at top companies. Before we dig into today's episode, we want to give a huge thanks to our presenting sponsor, RudderStack. They give us the equipment and time to do this show week in, week out, and provide you the valuable
Starting point is 00:00:38 content. RudderStack provides customer data infrastructure and is used by the world's most innovative companies to collect, transform, and deliver their event data wherever it's needed, all in real time. You can learn more at RudderStack.com. Welcome back to The Data Stack Show. We have a three-peat guest on, Barry McCardel, one of our favorite people to talk to. So, Barry, I think it's every year our Spidey sense kind of says, wait, we haven't had Barry on in a while. So welcome back. It is great to be back. I think there's something really fun about doing this roughly annually because it's a good chance for me to even look back.
Starting point is 00:01:18 What were we talking about a year ago? What's changed last year? Yeah. It's such a crazy time just in the world and technology that. a lot of change and so it would be fun to talk about what's going on. Yeah, for sure. Well, for those who didn't listen to your last episode, which by the way, if you haven't, we talked about why building AI products is hard. Barry went super deep into a lot of the how-to stuff in the last episode. So definitely pick that up. We'll put it in the show notes. But Barry, for those not familiar
Starting point is 00:01:47 with you, just give us the quick flyover of your background and Hex. Yeah. So I'm Barry McArdle, CEO and co-founder of Hex, I've been a data person, data nerd sort of my whole career, did a lot of different things. And then before and then started Hex with a couple of co-founders I'd worked with at Palantir. And Hex now is used by over 1,500 customers globally to do their most interesting and sort of in-depth data work. And I think especially what we'll get into this, but especially in the last year, I would say, AI and things, thinking about how AI can vastly improve those workflows has become our primary focus. And, you know, from a company perspective now, I think we really do think of ourselves as sort of the leading AI tool for doing data analytics and data science work.
Starting point is 00:02:41 Awesome. So Barry, I'm excited to talk more about that. One of the topics we talked about before the show, which I think will be really relevant for a lot of our listeners, is all of the change coming in the data space around AI. So specifically people that concern they're getting behind, concern their jobs going to be replaced, or maybe just concern their team size is going to be reduced due to some efficiencies from AI. So we're going to dig into that on the show. What else do you want to talk about? Well, I'd love to start with that because I think there's sort of this macro picture of what's changing with technology, what's possible today in some really exciting ways. As you mentioned, though, I think it calls into question a lot of sort of our
Starting point is 00:03:25 previous assumptions on what people's jobs look like, what tools are they using, how the lines in the stack are drawn. We're here on the data stack show, right? It's like, even like, what is the data stack anymore? Right. What does it mean? So, you know, I think that's really rich to dig into. And one of the fun and slightly nerve-wracking things about having conversations like this in public is you're basically making a bunch of predictions and statements that It'll be easy to go back and check, but I love that. I think it's fun. Yeah. Yeah, totally. Well, it would be fun to unpack that and make some predictions or to muse about this a little bit, and maybe I can come on another year. We can see we can talk about all the places in which we're right around. Yeah. Sure. Awesome. Yeah, and we'll go back and fact check our previous episodes, too. Maybe we'll have like a roundup. That'd be great. Cool. Cool. All right, well, let's hop in.
Starting point is 00:04:10 Barry, last time we went pretty deep on the technical stuff in terms of building AI products. Magic has been out for a couple years now. You had sort of maybe gone through the first major iteration of learning on, you know, how do you do e-vowls and how do you, you know, how do you tweak all of these knobs? We talked about the judge, you know, which was super fun. And you've gone through this entire, you've gone through another year of learning. There's some team structure stuff, but we were chatting before the show, and one of the things that I always love about talking with you is you go back to the user or back to the person. And you mentioned that.
Starting point is 00:04:56 There's so many technical things to talk about with AI. There's industry things that we have to dig into acquisitions. But really quickly, as we were chatting about what we wanted to talk about, you said, I just want to talk. Let's go for the person who's a data person, right? and this is a difficult landscape to navigate. I think that makes you a great CEO. I actually think that makes HECS an amazing product as well. But let's start there.
Starting point is 00:05:18 John, you mentioned there's a bunch of anxiety in your LinkedIn feed. Are you seeing the same thing, Barry? Yeah, and not just in my LinkedIn feed. In this job, I get to talk to a lot of customers and users and prospective customers and users. And it's like, and we also employ data people here. I mean, we have a data team, right? And, like, I think everyone presents, this is just natural, right?
Starting point is 00:05:42 Like, everyone presents, very few people show up to, like, a call and be like, I'm anxious. Yeah. Talking about the presentations of it, it's important, though, right? Like, yeah. And I think, like, you know, one of the dimensions, psychographic dimension, you kind of break different people down on is, like, interest in change and propensity to embrace change. Wherever you are on that spectrum, when things are changing, which they definitely are, and we'll talk about some of those changes.
Starting point is 00:06:07 it just it creates uncertainty like fast change means you don't quite know where things are winding up maybe you have a hypothesis strongly held weekly held but like you don't know and change leads to uncertainty uncertainty leads to anxiety it's like the whatever the Yoda version you know it leads to the dark side you know it's like it leads to this anxiety and insecurity and who among us can say we've never felt that like I'm the CEO of a tech company right now there's a lot of change and a lot of uncertainty, and you can have these anxious moments. And I think that's really true for people in a lot of spaces right now. I forget about data for a moment. There's a lot of professions right now where you can find blog posts and podcasts and all sorts of things saying
Starting point is 00:06:49 that's going to be completely replaced by AI. No more accounts, no more paralegals, no more radiologists, you know, pick the like high status job. We're not just talking about the sort of like truck drivers anymore, which I think it's easy for a lot of folks, maybe the type of people who might be listening to the Data Stack Show to then, like, more abstract. Well, now it's a little closer to home, right? Yep. And I think this is what I see when I talk to a lot of data teams and leaders and people. It's like, there are companies out there advertising AI data scientists.
Starting point is 00:07:19 There's companies out there advertising. I've given this talk internally a lot. I've said, you know, imagine people are out there advertising an AI CEO. Yeah. One, I'm sure I would do a better job. my job, then I'm doing it. But like, I would feel some type of way about that if I was driving through the city seeing billboards for AI CEOs. And then what I say to the team is, but I don't have to, that's not abstract to you because we have software engineers, we have sales folks, we have
Starting point is 00:07:46 SDRs, we have data people. I'm like, you, these do, for all of your roles, there are billboards that say they're going to replace your job. How do you feel about that? And they may sound a little provocative, but the reason I say that internally is because I'm like, we have to understand And that's also where the people that we're trying to serve are right now. And on one hand, it's really exciting because you can look at AI and say, this is going to democratize things. If you go to the header of our site, we say, make everyone a data person. And I thought a lot about that tagline.
Starting point is 00:08:13 It was like, you know, because I think you could interpret it maybe even in the wrong way. We look at it in a very aspirational and ambitious and exciting way, which is you have this opportunity to take every, you know, put data in everyone's hands. We see that as what the data team should be endeavoring to do, but you could also view that easily as a thread and say, well, if everyone's a data person, I considered myself a data person, where does that leave me? Right. So I think that is sort of the macro backdrop, the tableau for lack of a better term, that everything is noted, that everything is set on that everything is set on and like it colors all the conversations we can have. Even outside of what we're doing at Hex, I just think it's like the interesting thing
Starting point is 00:08:53 when you look at what's happening in our little corner of technology and what does it mean? And I'd love to dig into that. So I'd like to dig into what do we think that? I mean, so at a high level job displacement or significant change in the job that you perform, that causes a lot of anxiety.
Starting point is 00:09:14 I want to dig into other drivers, but maybe let's look at that by inverting the question and saying, Barry, as you talk to customers, as you talk to your own team, is there anyone who's super excited? And what are the reasons that they're super excited, right? They're sort of the opposite of someone who may have this low-level anxiety about all of this.
Starting point is 00:09:37 You can be both, I think. True. I'm both some days, right? So when we talk to our own team, we have a data team, we have an incredible data team with people I admired pre-hex like Caitlin Borman and Katie Bauer who now work here, the data team.
Starting point is 00:09:53 And I think part of what's fun about being on a data team here right now is like, we really embrace this and we're like, well, we're going to be almost the lab to figure out what the data team thing. I've told them, you guys have job security. Don't worry. Like, you know, this is an R&D effort. Yeah. But, you know, we're excited. But even then,
Starting point is 00:10:09 it's easy to sort of be like, well, where certain workflows we see changing? And it's like, well, okay, where's our place in this? We talked to customers. I was just on a customer call before this. They're saying, well, we're so excited to adopt these AI data tools, we're trying to be on the vanguard of it, and we're trying to disrupt ourselves. And I thought that was a really cool statement and almost like a brave and the correct thing to say, that just to get yourself in that mindset of like, we're not sure
Starting point is 00:10:31 where this goes, but we know that we either embrace these tools and we invent that ourselves or we're going to be irrelevant either way. I was interesting talking about some of the things that they and their team are really focused on of like, how can these AI data tools make our teams better and more impactful and get out of that defensive crouch? I think that's exactly right. I think that, and if in case it's not clear, it's like the classic Mark Twain, I think it is like most Mark Twain quotes is probably apocryphal, but like, you know, rumors of my death have been greatly overstated or whatever. Yes. I do think in a lot of these things, it's not going to work that way. There are different things and different loads of distraction to operate
Starting point is 00:11:06 at. And I think it's actually the teams on most, the people are most bullish on it, willing to dive in and rethink that and embrace that change than not. It's a hard thing to be in. And I think that I don't necessarily blame some folks on teams that aren't that way, because I think a lot of it comes down to the team leadership and the company that are providing that psychological safety, for lack of a better term, for people to go and do that. I hear that all the time, and I do think at some level, it's like, the very simplistic, incorrect, but simplistic formulation of this is like that we kind of saw happening two years ago, and now is obviously very vivid. It's like LLMs can write SQL and Python and build charts. My job was writing SQL and Python,
Starting point is 00:11:46 And building charts. Totally. Like, you know, like the logic is kind of simple. It's like, well, therefore I want a job. And it's like, well, if that's all you think of the job being, then yeah, maybe you shouldn't actually. But it turns out that really great data people do a lot of other things. I think actually AI in some ways is a way to become what we kind of always wanted to be. If you go back three years and you ask data teams, like, what's the worst part of your day?
Starting point is 00:12:13 They're like writing boilerplate sequel. Like, you know, debunking pipelines. Yeah. These people, these stakeholders just, they won't goddamn self-serve. Yeah. They won't learn the tool. And it's like, here now finally we have a technology that like, you know, makes all that better.
Starting point is 00:12:27 And it's like, well, hey mine. Right. It's like, well, I'm like, no, the future is here, guys. Like, you know, we see this internally. Like self-serve is real. A lot of hacks right now. We use our own tools for us. It's awesome.
Starting point is 00:12:36 It's completely changed our relationship with our own product and with our data team. And it's like, the dream is here. And I think like a lot of data people, have this love-hate relationship with all the questions they got from the business. It's like they were annoying. But like, yeah. Yeah. Maybe there's two or three of these, but I can think of, I can think of two.
Starting point is 00:12:56 One, like that, the AI, like, enthusiast response, which you kind of said this is a throwaway, Barry, but it's really important that, like, you told your people like, hey, like, you're here. Like, I'm committed to you. Like, you're going to have a job here. Therefore, go disrupt yourself, right? Like, that's a big deal. think if you want innovation and adoption of AI. But the second response, and that's not everybody's culture for various reasons, right? But the second response that I'm seeing, kind of that more like fear response is like,
Starting point is 00:13:27 I'm an expert, I've invested all this time and energy and, you know, the sunk cost fallacy thing. Like I've invested like years of my life being really good at Python, really good at SQL, like in whatever else in that ecosystem. them. And on one hand, like, yeah, like there is a complaint of like, oh, I'm tired of debugging this or continually rewriting, you know, the sequel. But the other thing, I think, for technical people, there's a lot of them really enjoyed it at the same time. They actually really enjoyed the, like, complex troubleshooting, the, you know, whatever other components of that. And they don't necessarily enjoy like doing requirements interacting with people type of thing. So I think
Starting point is 00:14:08 there is that flavor of data person. And, you know, I've been around long enough, like, DBAs were, you know, a big thing when I started, database administrators. And especially like the like stereotypical like XKCD DBA for like, like personality. Yeah. Of like, you know, leave me alone. Like I'm going to be optimizing queries in the basement. Some of the some of those people, you know, and people change. But some of those people as they like got into like data ends roles or like other data roles, like they don't want to be the person like interacting. It's annoying to have to deal with it in users.
Starting point is 00:14:41 And I think it's going to be a struggle. Not that there's not a place, but it's going to be a struggle. Someone who's got that fixed mindset of like some cost mindset or whatever you started of like I spend this time learning Python or SQL or whatever. Like I would just kind of question is that really what you spent all that time
Starting point is 00:14:57 learning? Like yeah, yes, no doubt you learned like the Pandas syntax. Like I did that too 10 years ago. Yeah. But is that really what you were getting good at or were you getting good at problem solving? And I'm a lot of listeners may know him. He's been sort of figuring in the data world for a while.
Starting point is 00:15:13 Great guy and a good friend. I did a, we did an event with him last year, just sort of like a fireside chat thing. I could have asked like, well, what is it about getting into data? Like, why did we all get into data? What does it mean? Like how, you know, and he put it like, so I just like solving problems.
Starting point is 00:15:26 I think it's like puzzle. And if you kind of look at it that way, you're like SQL or writing the panda syntax is like a way to do that. And there's something fun about coding. Like I know a lot of people listening agree. Like, it's like, you know, it's a little recursive problem solving thing. And it's like, well,
Starting point is 00:15:40 Yeah. Some of that, and AI can do better at specific parts of that now, but like, I just think there's a lot left. And actually, like,
Starting point is 00:15:50 if you look at it that way, like one thing I say a lot, just, I don't know, in different places is like, there's just like an infinite demand for insight. And I don't think people
Starting point is 00:15:58 had the number of data people they have on their teams because that's like, someone did like a supply demand thing. Well, on average, we get like 100 impactful data questions a week. Right.
Starting point is 00:16:07 And each data person can answer 20, therefore, okay, we need five data people. It's like, no one does it that way. So you have the number of data people you have because like the CFO said you could hire that number. It's like actually like the vast majority of things that could be influenced by data aren't because, yep,
Starting point is 00:16:24 there's just an infinite demand. And I think of it that way for like software and software engineering, right? Like every engineer at Hex uses AI tools for that now. Therefore we've like slashed our engineering team and shrunk it. No, of course not. We're hiring engineers as fast as we can and we're paying as we ever have. it's like well that's weird it's like no it's not weird because we're getting more out of them now they're like more impactful than ever yeah yeah our engineers have embraced this and are there
Starting point is 00:16:47 parts of the engineering job that like aren't part of it anymore or whatever like have been kind of abstracted away yeah you're mentioning dBAs I started my career in software and it just before the cloud transition like like 2013 yep and um when I was at Palantir when we started we were all on-prem all like you know physical servers I still remember the IP at the names of some of the boxes. Yeah, right. And you know, your SSHing into these things, whatever. And then we had our first cloud deployments around that time,
Starting point is 00:17:18 like early mid-2010s. And there were engineers at the company, and it's not just there, you know, all everywhere that were like really good at managing these like physical infrastructure and boxes. And then all of a sudden that's not the thing. And like, did we lay them all off? Like, I don't think so.
Starting point is 00:17:36 I have vivid, very vivid memories of those people then, becoming the best cloud infrastructure people had. And like you just had to, there's like a slightly different set of skills and translation, but it's, and you move up a level of abstraction. But if you're really good at the thing and you have the right mindset, I don't think there's anything stopping you from doing that.
Starting point is 00:17:55 And one prediction for the way data teams evolve over time is maybe they just feel, people actually just become more, it becomes a slightly more general, almost like ops and finance type. I'm not literally finance, like they're good at accounting or whatever, but sort of like,
Starting point is 00:18:14 how is the company running? How do we measure, right? How do we improve? Like a much more strategic thing. You think about what a lot of data people aspire to do. It's like, I want to solve problems that help move the business forward. Yeah, right.
Starting point is 00:18:24 Yeah. Like just the reality three years ago is 80% of that would wind up being like a lot of data plumbing work beyond a dashboard that no one's going to look at. And now there's just like a whole new world of possibilities around how you can actually like get a lot of leverage out of your skills. I think that's like super rad.
Starting point is 00:18:40 Yep. Yeah. Okay. I want to counter argument is in the right word. Let me tell a story that sort of the flip side of that I think maybe gets a little closer to the actual job impact, right? So a while ago, I switched roles and I was tasked with sort of reorging this team. And after digging in, I realized, you know, we sort of need a fresh start here. And I, I realized that the team that was there was really not optimal they weren't using AI in almost any way
Starting point is 00:19:15 at all, right? And so there were these opportunities to do some dramatic, you know, to have dramatic increases, you know, in efficiency just by implementing workflows, you know, and leveraging some AI tools. And so we ended up sort of doing a hard reset on the team and instead of going out,
Starting point is 00:19:35 and rehiring a much people, I found these two, actually three incredible people who already worked at the company but in other departments. And we just made internal hires and we redid all of these processes. And so the equation was, okay, we could go out and just backfill all these roles or we could save a gigantic amount of money and leverage AI to make a smaller number of people who already have a huge amount of domain knowledge extremely efficient, right? Now, fast forward,
Starting point is 00:20:11 whatever it is, a year and a half or something later, the size of the team is going to grow now, right? But that was, let's say, like a microeconomic impact, right? It was a big enough change, like, at least my perception on it, is to where you had a
Starting point is 00:20:26 company bottleneck and you moved it. Like the way it was working, like someone of a company bottleneck, move the company model neck kind of downstream from there. Yep. And whenever you move that bottleneck, like, that's my definition of success and something like that. Yes.
Starting point is 00:20:44 Yeah. Anyways, the question for you, Barry, is there are, and I think that in some cases, in some companies, people will just look at the peer cost equation and just sort of make the decision, right, without necessarily thinking about things longer term. But that's just a personal story for me, but I've heard of that happening in other places as well,
Starting point is 00:21:08 which I think is part of the anxiety, right? Because it's like, oh, well, at the end of the day, if it can make the bottom line look better, that's going to have an impact. You know, companies making short-term decisions to cut costs might not be the right optimal thing. Like, that's not new in the world of AI or specific to data. Like, that's just like a thing that.
Starting point is 00:21:29 Yeah, you know, is a push and pull in a company between like, you know, cost, upside, whatever. Yeah. I think maybe the thing that we need, I say we is just like putting just like my data person had on a part of the sort of data community such as it is like, let's have a new and crisper definition of like, what's the upside value of our role? If you're a team at a company that is viewed is just like, yeah, they just do SQL queries, then yeah.
Starting point is 00:21:55 Here's the thing I see, actually, we think a lot about here. there are teams that are the bottleneck. There are teams that the rest of the org looks at them and they're actually like really low NPS, if you know that term, like net promoter scores. The company looks at them and they're like, I don't get what I need out of them.
Starting point is 00:22:13 Yep. I ask them a data question. It takes three weeks to get something back. And then it's wrong. And I ask for an edit to it and it takes a long time. They can't do self-serve. And these are the teams that people are trying to work around right now. Yeah.
Starting point is 00:22:26 Where you actually see this thing of like business users are like, well, I just want to bring in, I want to hook up chat Chb-T to Snowflake or whatever and just like be able to do things. I think that is what makes a lot of, it's almost like a spiral of insecurity where like the data teams that are getting worked around or then like, it's like worth examining like, well, why would you day team doing that? At some level, there's like cultural stuff, team stuff. I can't change that.
Starting point is 00:22:48 Sure. I would say at hex, a thing I think about like what do we have leverage on. It's like, how do we be the tool and the solution for these teams to meet the moment? I think a lot of speed. Like, I think speed matters. And it's like, well, if we can build you an AI data tool that helps you get an answer back to these stakeholders in two hours instead of two weeks,
Starting point is 00:23:08 boom, we've just like increased your impact and we've increased your MPIs. If we can give you AI data tool that you can expose to those stakeholders where they can ask natural language questions in a way that's integrated and observable and governed, boom, you have now provided, we've given you the release valve that you can like do
Starting point is 00:23:27 like satiate that thing. Yep. We can give you the tools to do governance and observability and endorsements and all those semantic modeling and all those things. So you know that this is all trusted, boom, we have relieved that like anxiety. And so do you think there's like a how do you break these tradeoffs and these tensions? Yep. So, you know, maybe it's a selection bias thing because like you could argue that a lot of the teams that have adopted hex in sort of the first wave, you know, I guess for a while we were probably just like an early, we're probably past that now, but we were like a mostly an early adopter tool. are the teams that are most progressive.
Starting point is 00:24:01 But even now, I talk to companies and teams, I think people feel where the wind is blowing. And I think they're looking for the answer. And going back to the anxiety point, the talk I give internally is like building up this anxiety. You know, like this is what our feeling. And then it's like, our job is to resolve that through building a product that lets these people do their best work
Starting point is 00:24:22 and have the highest impact they can have in the company. Yep. And if change is going to happen, it can happen on their turn. in a way that they can grapple with and in a good way. And, you know, this is kind of getting heck-specific. But that is sort of the job I see we have. And I just zooming out a little bit, like, that's the affect I want for data people generally. Even, you know, people outside of Hex's own usage or success, like, that's what I want for all of us is, like, have this new definition of how we can be really impactful and great partners for the business and help solve those problems and move things forward.
Starting point is 00:24:57 in a world where we don't have exclusive monopoly on like writing SQL queries. Yep, right. Yep. We're going to take a quick break from the episode to talk about our sponsor, Rutterstack. Now, I could say a bunch of nice things as if I found a fancy new tool. But John has been implementing Rutterstack for over half a decade. John, you work with customer event data every day and you know how hard it can be to make sure that data is clean and then to stream it everywhere it needs to go. Yeah, Eric, as you know, customer data can get messy.
Starting point is 00:25:27 if you've ever seen a tag manager, you know how messy it can get. So Rutterstack has really been one of my team's secret weapons. We can collect and standardize data from anywhere, web, mobile, even server side, and then send it to our downstream tools. Now, rumor has it that you have implemented the longest running production instance of Rutterstack at six years in going. Yes, I can confirm that. And one of the reasons we picked Rutterstack was that it does not store the data and we can live stream data to our downstream tools. One of the things about the implementation that has been so common over all the years and with so many rudder stack customers is that it wasn't a wholesale replacement of your stack.
Starting point is 00:26:08 It fit right into your existing tool set. Yeah. And even with technical tools, Eric, things like Kafka or PubSub, but you don't have to have all that complicated customer data infrastructure. Well, if you need to stream clean customer data to your entire stack, including your data infrastructure tools, head over to rudderstack.com to learn more. Okay, let's change gears a little bit and actually talk about team structure a little bit, because we've been talking about teams, right?
Starting point is 00:26:37 We've been talking about the impact of teams. And you wrote a really interesting blog post recently about getting rid of the AI product team, which is really fascinating. So anyone listening, really interesting read. but Barry, this is a significant departure from what we talked about last time. So last time there was a dedicated AI team whose task was to both do a bunch of R&D type stuff
Starting point is 00:27:09 on how do we do this, what are the costs, what are the evals, we kind of talked about all of those pieces. And then also to actually deliver features. So, HexMagic, you know, is a set of AI features that can do all sorts of fascinating things. And that was a dedicated team who worked on that stuff, right? So the other product teams were separate. What changed and what does the team look like now? Yeah, so, right, and going back, it's interesting, you know, this is my third time being on.
Starting point is 00:27:42 So it's almost like these episodes of, you know, every year of how we think about this. Yeah, epiops. So two years ago, you know, early 2020. we launched our magic tools. So actually coming back from like holiday break at the end of 2022, Chad GPT had just launched. We had done a bunch of AI experiments before that.
Starting point is 00:28:00 Actually like three years ago, I guess, Hackweek, I like forced someone to use like an early version of the GPT3 API to like write SQL on it. And everyone was like, oh, this is cute. Like it was like a little thing. And then we did it like experiments. And then, you know, I think the chat GPT came out. Everyone sort of focused on it.
Starting point is 00:28:16 And, you know, it was very obvious to me. it was like, this is huge. And we're not exactly sure where this goes. We had to invest in it. So we built a first version of we launched these features. We called it Hex Magic. I didn't want to call like Hex AI. I thought that was, I don't know, magic was cooler.
Starting point is 00:28:30 It's a little less derivative. So we launched these things. Very wise decision looking back, by the way. Let's talk about this, right? Because Magic is a really cool name. And we had some sick swag we made. And, you know, we expanded the Magic team. We hired more people for it.
Starting point is 00:28:42 We had a PM. We had engineers. We had designers just focused on magic. And then I think in the early days of a new technology. You know, I actually think it was what, maybe I would have done a different with perfect hindsight, but I think it was broadly correct because it's this new technology that's hard to use and it's changing really quickly and like, but, and so it was like that for the better part of two years, about a year and a half. Coming into this year, it just felt
Starting point is 00:29:08 off and there were a bunch of problems with this. I mean, one, one, and maybe the biggest one was just like most people, the company, weren't doing AI stuff. They were doing other things. Because that was the AI team's problem, or not problem, the thing. Yeah. It meant that just the raw volume of stuff we were able to build around AI was constrained by like hiring onto that team and scaling that team, resource contention, honestly, with other teams. And I think like, it's just hard and you have a team that's organized that way is like, you know, people are prioritizing their own stuff. They've got their own goals. They've got their own sense of what their team is about. And most people were just not that. And then they would build the feature. So like we built, as an
Starting point is 00:29:45 example, last year, we built these awesome new Viz features. and sort of a whole BIA experience inside Hex called Explore. It's like you had a team building Explore and then another team that was like, well, how do we like remote control this? And like that feature is kind of changing under their feet. And like we're just obviously kind of doing like right hand, left hand kind of thing. And then I would say like finally just
Starting point is 00:30:08 it just felt structurally wrong when I looked back and like kind of studied history. And I thought a lot about like, do you guys remember a sketch? Oh, yeah. Yeah. Totally. Some of our listeners may not. So, Sketch was a really popular product design tool.
Starting point is 00:30:23 It was beautiful, full Mac. You know, it was like a really thick Mac app, like a really great Mac app. Yep. That ran locally. And, you know, it was like Figma before Figma in the sense. That was one designer. Product designers used. And like all the workflows around it, you know, were really beautiful and you were using it.
Starting point is 00:30:38 But then, like, all the sharing collaboration stuff really sucked. It was like you're exporting stuff to PDFs and like commentary. There were a whole cottage industry of other tools trying to make it collaborative. And I think a lot about that because they, wound up building a cloud team. Sketch had a cloud team. And I think probably in their minds 10 years ago, they're like, yeah, we're doing cloud.
Starting point is 00:30:55 We have a cloud team. Yeah. Figma never had a cloud team. Figma just was cloud. Like their cloud team, you know, the whole part has built assuming cloud. Yep. And I thought a lot about that.
Starting point is 00:31:06 And I was like, I think if you're going to thrive and embrace a platform change and these things, you have to, like, do that. So we got rid of magic. There's no more magic team. And in fact, the features, internally at Hex, we're like stripping magic out of everywhere. It's not magic does this.
Starting point is 00:31:22 It's Hex does this. Yeah. It's not like one little, these are bolted on AI features. It's like, no, the whole, we have to think of the whole product that way. And I was nervous writing that blog post, actually, because I think here in September, October, 2025, maybe this just now feels a little obvious. Like, I think it's, I was nervous that I would write it and a lot of people would be like, duh.
Starting point is 00:31:43 But two things. One, I've actually been surprised to hear from a lot of people, including a lot of founders that still have these dedicated. at AI teams. Oh, yeah. They're effectively dealing with the same objections that I heard a lot internally and that we grew up. And then two, it was super non-obvious at the beginning of the year.
Starting point is 00:31:59 And like, things are moving fast. And so we got rid of the magic team. There's an AI platform team that does stuff in the same way like a cloud platform team might do things, right? But the rest of the teams are building on those distractions that platform building and they're all, most of their roadmap now are AI features. And it's like, but there's also a lot of non-AI stuff there. it's up to them to just prioritize those things
Starting point is 00:32:20 within their limits of the users that they're trying to target. And if we're going to survive and thrive in the next era, it was very clear to me that deeply embrace becoming an AI company and AI first company to avoid that sort of innovator's dilemma trap. And so that's reflected in what we're building
Starting point is 00:32:35 and shipping and just how we think of the company and also just how we think of the whole space of all day. Can you give one or maybe more examples but of a feature or a product idea, that was born out of that change, where a team that wasn't previously working on AI, sort of maybe reimagine something that they were doing
Starting point is 00:32:57 because they now were given the, I can't think of the right word, not the license, but basically the charge to say, solve the problem in the best way, if AI is the best way to solve it, then great, it's an AI feature. I think our core flagship workflow has been our notebook products or surface, which is really like, you know, we can talk more about a product pitch at some point, but just think of it as like,
Starting point is 00:33:22 for people who haven't used to actually seen hacks, just think of it as like this amazing notebook to be able to do analytics and data science work in, and it's great. And, like, we had a team that was ostensibly focused on improvements to that. And then we had another team that was focused on AI stuff. And previously, previously. Yeah, yeah, previously.
Starting point is 00:33:39 We had that, yeah. And, like, it's just kind of obvious now that, like, you're not making an optimal prioritization. decision there because you have one team that's like, okay, we're going to improve this in a vanilla way, and then another team that's doing in a different way. And like, even as an example, like, we wanted to improve the way you can do reviews of changes in the notebook if someone else has been done. It's like, well, yeah, I mean, you could have to do that in the vanilla way. There was also then the team working on the AI features that were like, well, as the agent
Starting point is 00:34:06 edits things, you want to be able to review those changes. And like, we're not a huge company, right? Like, maybe it almost seems laughable that you would not have those things connect more. But no, no, people are their own standups and their own planning meetings and their own things. And it's like, it's just like very easy to wind up doing two different things. And so by having it just be one team, now it's called the editor team. And it's not the AI editor team. It's the editor team that owns the notebook surface. And they're prioritizing everything within that.
Starting point is 00:34:34 And if there's, the tension still exists between while there's this vanilla, you know, non-AI feature that we know would make users lives better, do we focus on that? Or we'd focus on this whiz-bang AI thing. It's very easy to get sucked into just doing AI stuff because it's so exciting. The same time, it's also, like, got incredible upside. I mean, even as you think about how do you make it easy to, I don't know, like, clean up, like, here's another example. We launched a feature called Sections earlier this year that was like from our vanilla notebook team. Yeah. It just makes it so you can organize cells within your notebook more easily.
Starting point is 00:35:08 We built that with no regard for how it would be used by AI. It turns out now that the predominant way people use sections is by asking our notebook agent to clean. things up into sections for them. And it may sound like a success story, but like we should have just launched sections with that. Or there's actually ways in which we even thought we could have built sections even better if we had done it with an eye toward AI from the beginning. Like when you add a new section, maybe it could just auto suggest cells to add to that.
Starting point is 00:35:32 It's like we just weren't thinking about it that way at all. Sure. So I use a long list of things like this where you have to just deeply assume AI in the same way that the companies that really succeeded in the cloud transition, like deeply assumed cloud. If anything, actually, it's kind of funny to think about it. It's like, Phigma actually struggles and like Notion and these other sort of cloud-first products
Starting point is 00:35:52 struggle to work offline. Like, their challenge almost cuts the other way. Right? They're so cloud-first. It's like, I don't know to do this, I'm just connected from the internet. And just to draw that comparison a little bit out a little more because I think it's interesting. I think my goal is almost to have heck struggle
Starting point is 00:36:08 if you don't have the AI features turn on. Now, it sounds like a weird thing to say. Like I'm trying to make my product. I'm not. But we just think the best way to accomplish a lot of these tasks is by partnering with an agent. And I actually have started to push our team a little bit on like, I don't know that we want to sign new customers up that aren't flipping the AI features on. Maybe that switch shouldn't even exist. It's just like it is. And in the short term, that would mean walking away from millions of dollars in revenue easily of customers who aren't ready to do that right
Starting point is 00:36:38 way. But something tells me that we will be happier and wind up building a better product if we can just assume that everyone is getting these AI features and there's no like non-AI mode. Well, we've talked about this before, John. I actually have a pretty strong conviction that and I think it sounds like hex is on this path where for a lot of really amazing product experiences, the AI just won't be explicit, right? Now, okay, for certain activities where the form factor makes a lot of sense, right, I'm doing an exploratory data analysis or, you know, those sorts of things, right, where you may have like an, you know, you may ask an agent to do some open-ended task, of course, right? And especially in a like field like data. But then there are also so many
Starting point is 00:37:28 things like you said where is it an explicit AI feature like if your sections are just organized or the user's just presented with that. No, it's actually just a great experience, right? It's just a really nice product experience. It's not that you have to click some separate magic button. Yeah, right? That's like, that's how hex works. Yeah, totally. You just get it. And I think there's a level of ambience. 100%. Ambience is a great term. Yep. In great products. Right. Right. Right. It just works that way, you know. And you saw a lot of companies. You can see this, these facets present themselves in different ways.
Starting point is 00:38:02 So I'll pick on Notion because they're a great company and we're good friends with them and they use us and we use them and so I'm not picking on them actually quite an admirer. But when they launched their
Starting point is 00:38:12 Notion AI features, it was Notion AI. It was a separate. They charged for us separately. A lot. A lot. And I, even as someone who's very AI built,
Starting point is 00:38:21 I was like, oh, there's a lot of money to pay for this. I agree. I agree. I won't name names when I asked someone very senior over there.
Starting point is 00:38:28 I was catching up with this, catching, to talk to him. I said, Why are you adding, why are you charging extra for this? Because I was actually getting this question from my board. Like, shouldn't we charge extra for AI?
Starting point is 00:38:37 I was like, no, that seems like such a short-term thing. I actually wrote a public blog post, why we aren't charging extra for AI. I was like, there's no Hex magic add-on. It's just how the product should work. And I actually wrote that post predating this team regard. I almost like, I think I had the right ideas before I had the courage to like structure the company around it. But the notion I add-on, they were like, well, we want it to be margin protective. And like, not everyone wants them right away.
Starting point is 00:39:01 and all this stuff. And now they've gotten rid of that add-on. They're just shipping it with the seat. I would guess that they'll probably have something like everyone and like we probably have at some point where we charge you on consumption if you're using like a ton of it because these things.
Starting point is 00:39:14 Sure, sure. But like it's just how the product works. They just announced if it was Notion 3.0 a couple of weeks ago. The agents, yeah. Agent stuff. That's the headline of Notion 3.0. It's not the notion AI 3.0. It's just how the product works.
Starting point is 00:39:26 This is what's so offensive to me about Slack. Slack has AI add-on features for so long. Yeah. It's like, what a legendary fumble. I mean, if anyone from Slack is listening, I'm really sorry to be beating up in your hair. I assume it's not your fault because you're like a division of Salesforce and I'm sure that's hard. But like, imagine shipping the world's most popular workplace chat app and not shipping an amazing AI chat experience. I mean, it's so unbelievable. It's like out of a API. Yeah, imagine like restricting your API now to make it harder for other people to do. Yeah. To you. Yeah. it actually sounds like we're brainstorming a type of Silicon Valley thing and again, sorry to beat up on Slack and I was like a very early adopter
Starting point is 00:40:07 I fought super hard to like get Slack AI you know through it, yeah, whatever but yeah like the chat app not just so ironic. It's amazing. It's really amazing and it's like you know Slack could have been gleaned. You know, it's like anyway I don't mean like beat up on it too much. I just think it's an example of like where you can kind of see products
Starting point is 00:40:24 where like the seams are showing where it's like there's Slack and then there's these Slack AI features and they're monetized separately and they're thought of separately. And like, I actually, you know, maybe this can be a great redemption arc for them. Give Slack AI away for free and just make it the way Slack works now. I don't know if we're actually, it's funny. I don't know if the company's big enough now, I lose track. I don't know if we're actually paying for it or not, but we have the Slack AI stuff on now. And like, I can click a button to have it summarize a thread. It's sick. It's great.
Starting point is 00:40:52 It's like, this is obviously just how this should work. So maybe they're on now. Totally. So I want to dig back in on the hex stuff. And I want to talk. specifically workflows. If I was using Hex two years ago, let's say, and I didn't have any AI features on, and now I'm using the current latest version of Hex all in on AI, what does my practical, like, daily life look like? How is it different? Amazing. You're in better shape, you're better looking.
Starting point is 00:41:21 Nice. You're wealthier. You're wealthier. Yeah. You're 4.1Ks. any GLP-1s with your subscription now not even meaning to tee up aside from all of those things right or even like you could even
Starting point is 00:41:34 contrast it like maybe there's a bigger contrast here of like and this is like my underlying theory behind this question is I think to two three years ago this was most people and then now there's still a lot of people that their workflow is like hey I kind of write SQL in Python and I kind of copy and paste it into chat GPT it like makes
Starting point is 00:41:50 it better or maybe it writes it for me than I copy and paste it in some other tool and then like I kind of get what I want and hit save saves me a little bit a time. I think there's still a number of people that work that way. That's not how, yeah, most people. So that's not how HECS work. I see our focus right now is building the magic of these AI agents into a product that has all these other amazing things you need to be able to do data analysis really well. So if you open the HECS notebook, today the notebook UI, you can get started with a prompt, and it is using the latest Claude model. Actually, by the time this is published, it'll be
Starting point is 00:42:23 public. Cloud 4.5 Sonnet, which we've been beta testing with Anthropic for the last little bit. They'll be out by then. You did not hear it here first. To go and do these really agentic tasks that now we, you know, what, so go search through my data, search through tables and semantic models, what are the most used, what are endorsed, what's been published by the data team. Okay, I see the user is asking about sales data. I can go search for a model for that. Let me run an exploratory query to look at the structure of it. Okay, cool. I see what's going on with this data. I see it only runs through August. but they're asking about September.
Starting point is 00:42:55 Let me go back to the user and ask. And so you have these highly agentic flows that can reason about things, that can build you cells, write SQL, Python cells, chart cells. It knows how Hex works and that it can wire these things together and edit them and look at the outputs.
Starting point is 00:43:10 And so in the first instance, it's just obviously way better than copying and pasting things back and forth between chatchets. Yeah, yeah. We have the same power of that like bleeding edge model built into the product directly. But second, it has context.
Starting point is 00:43:22 And I think this is like a really big deal. It knows about the other analyses you've been doing. For our sort of, we call it Threads, which is our self-serve version of this, which is a very conversational UI. You can get it through our UI. You can get it through Slack as maybe a self-serve user. You know, it's restricted to only use
Starting point is 00:43:40 endorsed semantic models. And it has access to all the projects that your data team has already published. So if there's a dashboard that, like, Eric has already published, that answers your question, it can say, well, hey, there's this existing thing. And obviously that's different than shuttling back and forth between chat GPTRA uploading CSVs and just a really perfect way in which it knows about you and your team and the data governance and context that you provided. And we think that there is a huge opportunity here around having what we think of as this virtuous cycle, which is you have edit, you know, these sort of, we think of this editor persona like the data scientist, data analyst who's working on the new, the novel, and the gnarly.
Starting point is 00:44:20 like I'm looking at new questions that haven't don't have a canonized answer this is a frontier thing this isn't a deeper dive this is something that you know some of the business shouldn't just be like free you know free soloing you know I want to go and like really drive our thinking on this take the output of that and canonize it as context yeah you call it like a canonized curated context I think I have a fifth see I can't remember what it is yeah it's context that compounds where maybe it's a data app on publishing a dashboard our lives and our knowledge base. Maybe it's a semantic model
Starting point is 00:44:52 that I can create from that notebook. So we now have the ability to literally like at tag a notebook and be like building me a semantic model from this. Cool. We can make that easy, right? And then the rest of the org can self-serve based on those endorsed assets,
Starting point is 00:45:06 models and apps, ask the own questions, and where they hit the wall, where there's not data for that or there's the agent can't answer it with confidence or whatever. They need help. They can tap in the data team
Starting point is 00:45:17 and the cycle sort of like returns a new. And we think of this as this virtuous cycle that's sort of compounds. And we're thinking at all three sides of this, the sort of new novel, gnarly notebook workflows, the curated compounding context, you can tell I like alliteration, right? Workflows and the sanctioned self-serve workflows. We're thinking all three of these, how can like what is really cool about AI agents make these way better? And that is, again, that's the primary focus. And by the way, talking about the team org stuff, those are our three product teams right now. We have the editor team, the curator team, and the explorer team. Those are the three people we care about.
Starting point is 00:45:55 And then there's the AI platform team. And there are other platform teams that sit underneath. Yep. How we have the company, the main part of the engineering organization organized, and how we think about what we're building and how we think about what we're bringing to market. And going back to the question of like, you know, how is this better or different than maybe like just copying, pasting things we can call it? It's like the integration, I think really matters and having a set of tools where we can come to a data team. And going back to the first topic, he said, this is the solution to this point of tension you're failing. This is how you can have AI accelerate the data workflows in the Oregon way that maintains trust and candidly
Starting point is 00:46:31 maintains your relevance and pertinence to that. I think that's, we really believe in this. We think that's where the puck is going. And I'm sure other people will try to build that too. Our focus is just trying to build the best thing. Well, I mean, also under the hood, the other thing is people intuitively understand as they use these tools that adding more context is helpful, right? But if you're just doing this on your own and pasting stuff, managing context windows and tokens and all that sort
Starting point is 00:46:56 of stuff, it's really annoying. And so the fact that Hex just does that for you, you know, like, the platform team is handling all of that and is intelligent about... Yeah, one interesting thing that, you know, this connects back to
Starting point is 00:47:12 even some of what we were talking about last year. We've been thinking a lot about it. We've learned a lot about AI engineering. And I was going back and looking at last year's episode, and I was almost grimacing a little bit because I was like, man, some of the stuff that we thought was really bleeding edge then is already outdated.
Starting point is 00:47:25 Things are moving so fast, right? Yeah. The LLM-as-a-judge thing. Like, we were pretty cutting edge on that. Now that's a common thing, right? But coming back to that, like, that's a technique we leverage internally as we're building our own product.
Starting point is 00:47:37 And there's a workflow that we do internally, which is, like, either through internal usage or external users, hey, the agent gave me an answer that wasn't quite what I expected. Okay, we go and look into it. We've built a lot of our own tooling to go lift the hood and say,
Starting point is 00:47:52 okay, well, here's the context it was given, here's the user prompt, here's the tools it called, here's the turns it took reasoning about this, and here's why it got a good or bad answer, right? And we track these over time, and we have our own evals
Starting point is 00:48:05 and our own LLM-as-a-judge things. That's, like, a thing we do building our product. I'm actually really interested in, like, how do you help data teams do effectively the same job for their product with their users? Like, hey, Eric got a bad answer. You know, he just reported that the answer he got to this data question wasn't quite right. Okay, well, what happened? What context did it pull?
Starting point is 00:48:26 Oh, it pulled that model, but that didn't have a measure for this thing. Okay, let me go, let's go add that. Can we even suggest things to add? Or can we help give you observability and analytics on how these things are working? That's interesting to me. It's such an interesting problem because it's like, okay, so we freed up all these data resources that used to do X. Like, here's this new problem of essentially, like, you know, evals and understanding these AI agents. Like, who could we deploy on that problem? You know, what's funny is I think people are trying to, you see these, like, attempts to invent new job titles for things. Right. Internally, like, we just are like, yeah, this is analytics engineering now. Like, analytics engineering has kind of always been context engineering.
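To make that trace-and-judge workflow a bit more concrete, here is a minimal sketch of what it could look like, assuming you log each agent run as a trace and supply your own LLM call. The field names, the rubric, and the call_llm callable are illustrative assumptions for the sketch, not Hex's actual tooling.

```python
# Minimal sketch: log each agent run as a trace, ask a separate "judge" model to
# score it, and track the average score over time so regressions show up.
import json
from dataclasses import dataclass
from typing import Callable


@dataclass
class AgentTrace:
    user_prompt: str        # the question the user asked
    context: list[str]      # semantic models / published projects the agent was given
    tool_calls: list[dict]  # tools it called, with arguments and results
    turns: list[str]        # the reasoning turns it took
    final_answer: str       # what it ultimately told the user


JUDGE_RUBRIC = (
    "You are grading a data agent's answer. Given the question, the context the agent "
    'had, and its final answer, reply with JSON like {"score": 1-5, "reason": "..."}. '
    "Penalize answers not supported by the provided context."
)


def judge_trace(trace: AgentTrace, call_llm: Callable[[str], str]) -> dict:
    """Ask a separate judge model to score one logged trace."""
    prompt = (
        f"{JUDGE_RUBRIC}\n\n"
        f"Question: {trace.user_prompt}\n"
        f"Context: {json.dumps(trace.context)}\n"
        f"Tool calls: {json.dumps(trace.tool_calls)}\n"
        f"Answer: {trace.final_answer}\n"
    )
    return json.loads(call_llm(prompt))  # assumes the judge replies with valid JSON


def run_eval(traces: list[AgentTrace], call_llm: Callable[[str], str]) -> float:
    """Average judge score over a batch of traces; this is the number tracked over time."""
    scores = [judge_trace(t, call_llm)["score"] for t in traces]
    return sum(scores) / len(scores)
```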
Starting point is 00:49:07 Yep. Mm-hmm. If you think of it that way, it's like, this is the next step. And what's funny is we have Claire Carroll, who invented the term analytics engineer, she worked at dbt, and she makes this joke all the time. She's like, analytics engineering is about context? Like, always was. It's like the astronaut meme. Yeah, yeah, totally. Always has been, or whatever.
Starting point is 00:49:25 And I just think of it as, like, great, how do we help people who maybe already literally have that job title, analytics engineer, take on this new set of jobs to be done or this next sort of extension of their role? Yeah. And how do we think about data scientists and data analysts thinking about their place
Starting point is 00:49:41 in that cycle as well. That's interesting to us. And we're working on some stuff there. We've got some stuff that we're just starting to dream about. But when I think about the jobs to be done for data teams, I think there's the new, novel, gnarly stuff, where I think you continue to want people who are very intimate with the data and intimate with statistical techniques, and even if an AI agent is helping them do it,
Starting point is 00:50:03 you want them driving it and you want them partnering with it. I think that self-serve is going to just be much more conversational and ubiquitous. And I'm really excited about that. I think the breadth is going to be really broad, really super expanded; whatever success anyone had in self-serve before will be dwarfed by how easy it will be for people to go and ask and answer data questions with natural language.
Starting point is 00:50:25 And then I think about underneath all of that, you have this curation and context layer that you still are going to want people close to and thinking about and managing. And in some ways, that's the same as it always was. That's the same analytics engineering, man. Yeah. There's a whole new set of things.
Starting point is 00:50:42 There's a whole new set of things. And that's what we're excited about. One other thing on this subject, I think this is exactly where we started the conversation, around the anxiety piece. Like, in this future, and a lot of things could change, but in this future, you, A, have a whole new, like, area for data people to work in, like on evals and with, you know,
Starting point is 00:51:03 agentic workflows and stuff. And then, B, you also, assuming this, and I think this is a safe assumption, the second round of this self-service stuff is way more effective than the first round. Because there's a massive gap between what people wanted from self-serve and what actually happened, right? So if you get a way more effective... That's an understatement. But if you get a way more effective deployment the second time, then the data, like, the demand goes way up too. So you actually have two axes where the demand is going to go way up here, and then one where it goes
Starting point is 00:51:35 down. So, like, I think for data people, that's really good news. I think it's tremendous. Again, there's an infinite demand for insight. And if you, like, fast forward the tape a little bit, and you're like, okay, I'll be selfish and sort of commercial for a moment and be like, I'll pitch Hex, right? You've fully adopted all the things we have to build. You have people using Threads to answer self-serve questions. And they're just, they're asking and answering a ton of them, right? They're super high usage and people are just plugging that in. And maybe they're doing it from Slack. Maybe they're doing it in the Hex UI. Maybe you're doing it via other tools. It's like, you're going to have a lot of people that are raising their hands, like, hey, can you help me look at this in more depth? Or, hey, I'm not sure this is the way to look at this. Or, hey, I could really use some additional data on this other thing. I've calculated a measure that I'd like to be able to use more across all the ways we're measuring this. Like, holy shit, there's just going to be a ton to do. And I don't know if you guys have heard, I don't think we talked about it last time, but have you heard of Jevons Paradox?
Starting point is 00:52:32 That came up recently. I think it was... Yeah. We talked about this. Yeah. Yeah. I wrote a blog post about it before it was sexy. I actually in turn ripped it off of my friend Miles, who told me about it. But the Jevons paradox, the short version of it, is that economists, I think William Jevons in the 1860s, noted that as steam engines, like coal-burning steam engines, were getting more efficient, net aggregate coal consumption was going up. To say that, you almost have to, like, think for a second: why is that? Yeah. Right. But, like, you go back and you're like, well,
Starting point is 00:53:06 these steam engines are getting more efficient, but people are still burning more coal. And the answer is, like, we were just finding a lot more things to do with steam power. Turns out being able to generate a watt is, like, a good and useful thing. Yeah, yeah, yeah. Like, we are now talking live over a video chat on the internet with computers. Like, yeah, no one could really imagine that. Turns out there's a lot of cool things you can do with watts. Yeah. I think of this with other stuff, going back to the topic we were talking about earlier, software engineering.
Starting point is 00:53:34 It's like, as it becomes cheaper to develop software, do you wind up with fewer software builders, or do you just wind up with a lot more software, or better software? As it becomes cheaper to do data work. Hopefully better, but definitely more. We're definitely in the quantity over quality phase in some areas right now. But, like, I think it's interesting to think about these. In economics, as an example, it's called a rebound effect. And so I think about that, and I think about data work.
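A toy version of that rebound effect, with every number invented purely for illustration: if AI makes each answer far cheaper but demand for answers expands even faster, total data work still goes up.

```python
# Toy illustration of the rebound (Jevons-style) effect; all numbers are made up.
hours_per_answer_before = 16     # roughly two days of analyst time per question
questions_per_month_before = 10

hours_per_answer_after = 1       # AI-assisted: 16x more efficient per answer
questions_per_month_after = 300  # demand expands once answers are cheap

total_hours_before = hours_per_answer_before * questions_per_month_before  # 160
total_hours_after = hours_per_answer_after * questions_per_month_after     # 300

# Efficiency rose 16x, yet total data work went from 160 to 300 hours a month.
print(total_hours_before, total_hours_after)
```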
Starting point is 00:54:01 And I think if I was on a data team, as I talk to people on data teams, my advice is lean in and embrace it. And there's going to be a whole next set of things. And actually, I think in some ways, the demand for data roles, there's a world I can picture very easily in which that goes up. You have organizations actually wanting to hire more data people because they're just so much more valuable. It's not like they take two weeks to give me some answer. It's like you can actually feel the impact of the teams in a higher-NPS way. I agree with that, and I think even if
Starting point is 00:54:33 it takes a lot of companies longer to figure that out, they will begin to notice that the companies that adopt that mindset of insight abundance are going to move way faster, right? I mean, if you think about, like you said, infinite demand for insights,
Starting point is 00:54:52 as you start to meet that demand, ultimately the goal for that in a company is to materialize into some sort of competitive advantage or success for the company, right? Right. So I think ultimately the force will be in that direction. But there is some lag time, especially for the people in the industry.
Starting point is 00:55:09 Totally. Yeah. Maybe, and you're going to see variance, and we will hear stories about teams, companies that lay off their data teams or shrink them because they're not doing this. That'll happen. Just as we're hearing stories about people who are laying off software engineers or whatever, right, we'll also hear stories about a lot of other people who are figuring this out, and there'll be creative destruction, and they'll figure it out going forward. Like, one interesting corollary for me,
Starting point is 00:55:32 and maybe this gets into some other topics we can unpack, it's like, now that so many more people are going to have access to the data, data quality and data engineering are going to become, I think it's going to be super easy to justify a lot of teams and spend on that. Yeah, totally. It used to be this kind of invisible, under-the-hood thing, but now the CEO is asking a data tool a question and getting a wrong answer
Starting point is 00:55:56 because the underlying data was wrong. Why the hell don't we have more people doing this? Right. A little bit of budget for data tools. The governance is a priority. That's the exact persona too, right? That's the exact person that, like, from companies that I've worked with, you know, accounting and finance teams, like FinOps stuff.
Starting point is 00:56:15 Like, they often have their own workflows and they're often most resistant, you know, to some of these changes. And I would say, if I picked one team that never really participated much, at least across my experience, in the, like, self-serve thing, it was that team. If I had to pick one team that didn't, it was that team. And if they get wrapped up more into this next wave, because legitimately it's better, then I think you're totally right. Yeah, I'm long.
Starting point is 00:56:38 This is outside of Hex's ken, but it's upstream from us, so I care about it. It's like, I am long ETL tools. I think you're just going to have actually, like, a cascading demand for that. Yeah, yeah, totally. Well, okay, speaking of ETL tools, I think we're at the buzzer. Is that right, Brooks? Are we close to the end?
Starting point is 00:56:56 Five minutes? Okay, let's flip this here. Yeah, a little five-minute teaser. Okay, five-minute teaser, but Barry, I have to ask you, can you come back on in not another year, but, like, maybe a week? Because I want to do a quick hot take on industry stuff, but then have you come back on sooner so we can dig deep because... I'd love to do that. It could be interesting to...
Starting point is 00:57:16 I assume we're going to cut this conversation. It could also be interesting. Like, I don't know. I'd have to think about who else, but it could be fun to have maybe a couple guests just come on and just talk about... Totally. Talk about industry stuff. Okay, so as a preview for... that. So we're just going to, we're just going to
Starting point is 00:57:30 will this into existence by having you give us a preview hot take. A number of major things are shifting in the industry. I mean, Hex made an acquisition, but then if you look at Fivetran, you know, a number of acquisitions at multiple different points, right? So, I mean, what's really interesting there is
Starting point is 00:57:47 you have sort of ingestion, like an ingestion pipeline acquisition, a pipeline acquisition for getting data out. They bought Census, modeling with SQLMesh, and then now dbt. Rumor. Rumor. Rumor has it.
Starting point is 00:58:03 Rumor has it. Maybe it won't be a rumor by the time this airs, but as of this recording, it's Monday, September 29th. It's a rumor. Database acquisitions, right? So,
Starting point is 00:58:17 Crunchy Data. Crunchy Data. Yep. And there's more coming too that I know about that aren't public yet. All right. So give us the quick take.
Starting point is 00:58:27 Well, there's AI stories around some of these. I think there's also just, like, a secular thing that's probably the biggest coefficient in it, which is there was a Cambrian explosion of data tools around the modern data stack, 2020, 2021, a lot of funding rounds, us and a lot of others. And, like, I think it's just a natural contraction. You have winners and losers sorted out. Yep. You have players consolidating and wanting to be able to offer more to customers
Starting point is 00:58:57 in one place. And customers, I think, want that too, right? Yeah, I mean, I think, like, Fivetran, and I'm a huge fan of Fivetran and George, and I'm a huge fan of, I know people at all the companies they've acquired and I'm excited for all of them, because I think there's a great opportunity for Fivetran to go to a customer and be like, hey, we're the all-in-one place to come and think about data movement, where it's going and how it's being used. That's awesome.
Starting point is 00:59:23 From ingest, with the stuff that they've done for a while and the HVR acquisition a while ago, to where it's going with Census, to how it's modeled in between. Yep. It wouldn't surprise me to see them do stuff around quality and governance. 100%. I also think that there's an interesting thing to look at, which is that there's always a bigger fish.
Starting point is 00:59:44 The underlying data platforms like Snowflake and Databricks are also trying to do more. Yeah. You mentioned database tools, but you can pick any corner of the data stack, and they also are trying to build their own things to try to compete, quote unquote, with us. We're also great partners with them. Totally.
Starting point is 00:59:58 There's a lot of co-opetition. But if you are an independent player, I think you also need to be thinking about how you are doing more and how you are something a customer can buy not as a one-off point solution, but as a broader suite. And it's true for Fivetran, right? Like Snowflake and dbt, or Snowflake and, excuse me, Databricks, not to mention Google and others,
Starting point is 01:00:18 all have built, like, good extraction tools themselves. So in theory, if you want to replicate Postgres to Snowflake, it's like, what's the cheapest way to do that? I don't think, as of this recording, it's buying Fivetran as a separate thing. Now, right. I think Fivetran, George, and the Fivetran team are incredibly brilliant people. And they kind of look at that and they're like, great, well, that's not how we win. We're going to win by being the best thing for customers to buy, something that's going to manage all of those things across the lifecycle of data. I think if you're trying to compete
Starting point is 01:00:48 by just, like, we're a great way to replicate Postgres to Snowflake, it's just like, that's going to get commoditized over time. Sure. Yeah. And we think about this a little bit ourselves. We acquired Hashboard earlier this year. Most of that was the talent. I mean, I know Carlos, and I'm always surprised, Carlos as a CEO, how many people in the data world he has known and charmed. He's an amazing guy, and he had built an amazing team, and we felt a lot of kinship with them. They'd also built a really nice self-serve BI tool that we looked at, and we said, hey, you know, our ultimate vision at Hex is much broader than just being notebooks or data science, and over the last couple years we've gotten much more assertive about that. Bringing on people who can
Starting point is 01:01:24 bring a lot of expertise to that is great. And I think you will see it continue. I think there's going to be more and more of these as the world changes. And there's AI angles to some of them. Like, as I mentioned earlier, I think there's a secular trend where ETL and all the aspects of data quality are going to be more valuable, because it's being consumed by AI tools and by more people, and it's just going to get more pull on it. But I think most of this is probably just, like, the natural Darwinian sort of whittling and aggregation that happens as industry waves occur. We'll see the same thing with a lot of AI stuff outside of data.
Starting point is 01:02:02 We're already starting to see it. There's already a lot of companies that got funding a couple years ago for a first gen of AI tools, and they're not obviously breaking out, and they're looking for homes. Yep. We've seen some of those ourselves. And so it's just natural cycles. Yep.
Starting point is 01:02:17 I love it. All right. Well, we'll message and get the right people on the show in the next week or two to do a deep dive on the industry. Maybe we can have a panel. But Barry, it is always a pleasure to have you on the show. The time flies by
Starting point is 01:02:29 and we learn so much. Yeah, this is great. I always love chatting with you guys. It's an exciting time. I'm happy to come back on whenever. Awesome. We'll do it. The Data Stack Show is brought to you
Starting point is 01:02:39 by RudderStack. Learn more at rudderstack.com. Thank you.
