The Data Stack Show - The PRQL: From Code to No-Code: How AI is Reshaping Data Work with the Cynical Data Guy
Episode Date: March 24, 2025
The Data Stack Show is a weekly podcast powered by RudderStack, the CDP for developers. Each week we'll talk to data engineers, analysts, and data scientists about their experience around building and maintaining data infrastructure, delivering data and data products, and driving better outcomes across their businesses with data. RudderStack helps businesses make the most out of their customer data while ensuring data privacy and security. To learn more about RudderStack visit rudderstack.com.
Transcript
Welcome to the Data Stack Show prequel.
This is a short bonus episode where we preview the upcoming show.
You'll get to meet our guests and hear about the topics we're going to cover.
If they're interesting to you,
you can catch the full-length show when it drops on Wednesday.
Welcome back to the Data Stack Show.
We have our favorite monthly installment where we go deep into the bowels of corporate data America
and get some hot takes from your favorite cynical data guy.
Matt, welcome back.
Yeah, I'm back.
Nice to meet you.
Okay, this is going to be,
we're just gonna talk about AI the entire episode.
I was told there would be no AI in the episode.
Is that different than other episodes?
Yes.
I was told we would have AI enabled features
by the end of the quarter.
For the podcast.
For the podcast.
We don't have to do all this work.
Yes.
The AI just does the talking.
I mean, interestingly enough, actually, all
the transcription stuff and whatever,
it's actually been AI for a long time.
Well, there's a startup,
did you see there's a startup competing with ElevenLabs
that allegedly can take like a 15-second voice sample now
and it can generate, like, essentially-
I tried this recently.
Yeah?
I tried this recently, actually.
I generated a bunch of research using deep research,
which was outstanding by the way.
It was really, yeah, it was really good.
The ergonomics are a little bit weird
because it's so much text,
which, like, at that point the chat
just gets really unruly around that.
But the content was really amazing.
It was to the point where I, well,
first of all, in the mobile app,
you can have it read you a response.
But I tried that and it's really janky because it's such long text.
So it would have buffering issues, all that sort of thing.
So I was like, okay, I'm just going to go turn this into
a recording so that I can listen to it. Right. And there's tons of AI tools out there, which,
we're going to talk about the types of tools, but there's a service out there
where you can just upload the text. You can actually generate an MP3, like with GPT or
whatever. I need to try a couple other models, but the voice and audio is just, it's a robot.
It's like, again, I'm not gonna listen to 30 minutes of deep research with this.
Unless you have trouble falling asleep at night.
Unless I'm having trouble falling asleep at night.
Exactly.
But I think it's ElevenLabs
that's doing, like, audiobooks and-
Okay, maybe I need to check that out.
They have translations.
One of the major podcasts did like four or five translations.
Yeah.
Like, they did it in English, but then they also-
Yeah.
Had four or five other languages under the same brand.
So is this going to be where we're just going to have all of our audiobooks
be, like, the same four people?
Well, they're going to sell the rights to their voice.
That was interesting. I just signed up for some tools and I mean,
I actually hit the limit on the free tool and I was just trying to figure out if
there was a quick way to do it. And anyways, there's these services out there.
And I literally recorded like 10 seconds of
my voice and had it, like, read it back, and it sounded-
Brilliant.
So like scary.
Yeah, it was pretty wild.
The next thing you should try is where you purposely like pitch your voice really weird.
Yes.
And see what that turns into.
Yeah.
Yeah.
Although the interesting thing was I was like, wow, that did a really good job.
And I showed my wife and she's like,
that doesn't really sound like you.
No, not at all.
Interesting.
Anyways, the first topic I wanted to hit actually is not the spicy LinkedIn posts,
but we were chatting before the show and those transcription services,
there's so many of those that are just going to get completely wiped out
by the foundational models themselves. And we're starting to see that. I mean, doing
a bunch of tests even internally with some AI tools that we're using, the foundational
models are just now beating them like with a generic, like just completely vanilla, right?
Even for things that have been purpose-trained on documentation for technical questions, like, it's just really way better, which is wild. So this is me reading from my own, this is my internal LinkedIn feed.
If I was going to post on LinkedIn, this is what I would put behind the scenes.
Yes, is that I think just on a weekly basis,
we're going to see failures of these companies that were doing
something that was truly valuable
because of the limitations of the model, and now it's not anymore.
And so I think we're at the tip of the first wave of failures.
But what say you, cynical data guy?
Well, I think the thing that's probably interesting about that was, if you go back to the beginning
when all this happened, when we started seeing all these AI companies pop up, the assumption
was the people who were just simple wrappers around OpenAI, they're going to be gone in
four years.
Yep.
And now, and the ones that we thought were actually adding value were the ones that were
going to hang around. And now with all these foundational models out it's like, oh well all
I really want is a platform that's a wrapper that I can just choose. And they're getting better.
I don't really need your specially built one for this purpose. Yeah. Yeah. That is such a
good observation. Yeah. Yeah.
Well, and the interesting thing about that, too, and I've seen this in a lot of platforms,
is if you build your wrapper toward whatever industry or, like, use case, and
then you have that ability to hot swap in models, there's this perception, I think,
of like, well, I don't know which one to
take. I don't know which one's best. It's like, oh, well, these guys solved my problem. And they've
got five options. And like, as the weeks go by and one model's bad as another, I can just flip it.
Yeah. I think there's kind of a like comfort in that. Like, oh, I didn't accidentally like
pick the wrong one. That might not be the best. Does it also give the illusion that you're not in vendor lock-in?
Oh, yeah, a little bit. Oh, I'm not going to be locked into a vendor.
Yeah, well, you are. It's just the sub-vendor that's changing.
Yeah, yeah. Yeah. That's a really interesting point. I think one of the most amazing, actually,
I would say just in terms of the interface, but also the company that I think has done maybe one
of the best jobs of all of them of incorporating AI into their product is Raycast, which if
you use a Mac, you know, like Spotlight. So you do Command-Space and it pulls up, like,
the global search.
This would be for long-term Mac users. This is the new Alfred. This is the new Alfred, exactly. And, A, it's just an outstanding tool standalone.
It makes Alfred look like...
It makes Alfred feel so primitive.
But it is actually an interface with all of the AI models, one.
So, all of them.
You can do all this custom configuration
for various commands to use different models
and all that sort of stuff.
But they're now using extensions to integrate it
at the operating system level so that you can do,
you can do all sorts of stuff, right?
So you can run it against, basically, your day-to-day workflow.
And so it's pretty wild to see.
Which is essentially connecting an LLM, which they already have, at the OS level,
and it's doing actions.
Exactly. Yeah.
Interesting.
That's really wild.
So many people are going to wipe out their computer.
Yeah.
rm -rf.
Oh, yeah.
rm -rf.
It's gonna be the thing, what did you do?
I didn't do anything.
Yeah.
I was talking with the LLM,
then it closed me out.
Yeah. I still remember there was a junior developer,
this was probably 10 years ago, within two weeks of starting.
He had switched, he started out,
and he was using Windows to start.
Anyway, he had switched to Mac somewhat recently,
wasn't as used to the command line.
He didn't delete all his files,
but he had managed to take every single file
from, like, all of the separate directories
in the computer and put them all in one directory,
which is also just about as kind of stunning.
Wow, including system files, not just, like, Word documents.
Please tell me they were all on his desktop, please.
I don't know.
That would be amazing.
But it was one of those things, still learning.
Yeah, great person, good developer,
but just still learning the terminal and, like,
then they just all ended up in one directory.
That's also one thing.
All the subdirectories.
Look at them and you go,
I don't even know how you can do that.
Yeah, and there's not really an undo button from that.
Mm-hmm.
Okay, so look out for
companies to short because it's getting spicy. It's getting spicy out there as
the models become better and better. Anything else? I do have a couple of
great LinkedIn posts that I do want to get to. Alright, that's a wrap for the
prequel. The full-length episode will drop Wednesday morning. Subscribe now so
you don't miss it.