The Data Stack Show - The PRQL: Year-End Reflections with The Cynical Data Guy: Ship Features Fast, Pay Later

Episode Date: January 6, 2025

The Data Stack Show is a weekly podcast powered by RudderStack, the CDP for developers. Each week we’ll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data. RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack visit rudderstack.com.

Transcript
Starting point is 00:00:00 Welcome to the Data Stack Show prequel. This is a short bonus episode where we preview the upcoming show. You'll get to meet our guest and hear about the topics we're going to cover. If they're interesting to you, you can catch the full-length show when it drops on Wednesday. Welcome back to the Data Stack Show. We're here for a very special show, a Christmas edition of the Cynical Data Guy. So I'm here with Matt. Matt, welcome back to the show.
Starting point is 00:00:30 Welcome back. I'm here to bring the Festivus to your Christmas cheer. Excellent. Yeah, unfortunately, Eric can't be with us today. So you're stuck with myself and Matt. But we've got some fun topics. We are going to start out the show, which is a record for us, by the way. We're going to start out the show talking about AI.
Starting point is 00:00:49 It's unusual. It's Matt's favorite topic. Oh, so much. All right. So one of the things that I've seen a lot, Matt, and I think you've seen it as well, is we keep seeing these various forms of it. OpenAI calls it the 12 Days of Shipmas. And there are several different software companies doing this, where they're attempting to brand
Starting point is 00:01:13 pushing really hard at the end of the year, essentially, to get product out. What do you think about that, Matt? It sounds like someone made promises for the end of the year, and they have not kept up with them at this point then some people that are like crap we promised investors we're gonna do something we haven't done it yeah and it's such it's in such a juxtaposition to like the other the what i would be used to i think what you're used to too is like we're gonna do like we're gonna do code freezes so we don't break anything while people are on vacation
Starting point is 00:01:45 and things are very stable at the end of the year. So how do you think this is going to work out? I mean, I think it partially just garners attention is one of the big things.
Starting point is 00:01:58 But the unfortunate thing is once one person does it and they get attention for it, then everybody's going to try to do it next year and it just becomes the new standard and nobody really gets any credit for it. Right.
Starting point is 00:02:10 It just becomes, oh, now we have to ship 12 features every December. Right. Why are we doing this? So here's an article. So OpenAI, let's see, their 12 days of shipments, they shipped the O 01 reasoning model, which if you'd like to spend $200 a month on chat GPT, you can.
Starting point is 00:02:31 I will pass. What else did they ship? Apple intelligence was day 5 with chat GPT. My iPhone isn't good enough for that. It won't support it. That's too bad. And a couple of other features day 4. Oh, and then Sorum was the other big it. That's too bad. And a couple of other features, day four. Oh, and then Sora was the other big one.
Starting point is 00:02:47 That's their, it can create video. I still don't have a use case for that in my personal life or professionally at this point. So good for them. Not something I'm going to be taking advantage of until I've seen it. Yeah, that's fair. All right, that's a wrap for the prequel. The full-length episode will drop Wednesday morning.
Starting point is 00:03:08 Subscribe now so you don't miss it.
