The Changelog: Software Development, Open Source - The state of homelab tech (2025) (Friends)
Episode Date: February 7, 2025. Techno Tim joins Adam to catch up on the state of Homelab for 2025, the state of AI at home and on-prem (AI Homelab) and where that's heading, building a creator PC, choosing the parts for your build,... GPU availability, Windows being user hostile, and why Tim is happy to be using Windows, Mac AND Linux.
Transcript
Welcome to ChangeLog and Friends, the weekly talk show about AI Homelab.
Big thank you to our friends and our partners at Fly.io, the public cloud built for developers who ship.
That's you. That's me. That's us.
Over 3 million apps have launched on Fly, and you can, too.
Learn more at Fly.io.
Okay, let's home lab.
Well, friends, before the show, I'm here with my good friend, David Shue, over at Retool.
Now, David, I've known about Retool for a very long time.
You've been working with us for many, many years.
And speaking of many, many years, Brex is one of your oldest customers.
You've been in business almost seven years.
I think they've been a customer of yours for almost all those seven years, to my knowledge.
But share the story.
What do you do for Brex?
How does Brex leverage Retool?
And why have they stayed with you all these years?
So what's really interesting about Brex
is that they are an extremely operationally heavy company.
And so for them,
the quality of the internal tools is so important
because you can imagine they have to deal with fraud,
they have to deal with underwriting,
they have to deal with so many problems, basically.
They have a giant team internally, basically just using internal
tools day in and day out. And so they have a very high bar for internal tools. And when they first
started, we were in the same YC batch, actually. We were both at Winter 17. And they were, yeah,
I think maybe customer number five or something like that for us. I think DoorDash was a little
bit before them, but they were pretty early. And the problem they had was they had so many internal tools
they needed to go and build,
but not enough time or engineers to go build all of them.
And even if they did have the time or engineers,
they wanted their engineers focused on building external-facing software
because that is what would drive the business forward.
Brex mobile app, for example, is awesome.
The Brex website, for example, is awesome.
The Brex expense flow, all really, you know,
really great external-facing software. So they wanted their engineers focused on that as opposed
to building internal CRUD UIs. And so that's why they came to us. And it was honestly a wonderful
partnership. It has been for seven, eight years now. Today, I think Brex has probably around a
thousand Retool apps they use in production, I want to say every week, which is awesome.
And their whole business effectively runs now on Retool.
And we are so, so privileged to be a part of their journey.
And to me, I think what's really cool about all this
is that we've managed to allow them to move so fast.
So whether it's launching new product lines,
whether it's responding to customers faster,
whatever it is, if they need an app for that,
they can get an app for it in a day,
which is a lot better than, you know,
in six months or a year, for example,
having to schlep through spreadsheets, et cetera.
So I'm really, really proud of our partnership with Brex.
Okay, Retool is the best way to build,
maintain, and deploy internal software,
seamlessly connect to databases,
build with elegant components,
and customize with code, accelerate mundane tasks, and free up time for the work that matters.
So, Tim, no breakfast? You're not a breakfast guy?
No, no. I don't know why. I just stopped eating breakfast a while ago. There's no reason for it. I just don't do it.
No health reasons? No optimizations? No biohacking?
No. I mean, you know, it kind of slows me down, I think. I think it goes back to, like, you know, I don't know, high school, not having enough time in the morning, and, you know, same with college, just rushing to class. And so I just never picked up eating anything along the way.
Gotcha. So you must be young enough to the point where you still reference high school and college
because I'm so far away from those two things. I'm far away. I'm far away. I'm just, I'm far
away too. I'm just trying to figure out like how it, how it started. And so I rarely ever do unless someone's
like, I don't know, talking about their kids. And I'm like, Oh yeah, I remember those days.
So you're not an intermittent fast or you don't intermittent fast or do your sort of eating
windows or do you practice a special diet of any sort? I mean, I intermittent fast, but not knowingly.
I've been doing it like half my life before it was a thing, you know, because I don't
eat breakfast.
I usually skip lunch and then I just eat dinner.
And so it kind of started happening this way a long time ago, because on weekends I would do it, right? On weekends, I would just be so focused on whatever I was working on, home lab, gaming, World of Warcraft, you name it, that I would just say, ah, I can make it to dinner, you know. And okay, so I used to do that on the weekends, and then
ever since like work from home it kind of carried over so right yeah so it's just dinner for me
a big dinner what do you do for I guess if you're not eating anything, are you just drinking water?
I am, yeah.
Just water only?
Water only.
No coffee?
Coffee, yeah.
Okay.
Yeah, yeah.
So wake up, two cups of coffee, Nalgene of water, and then probably two or three more of these throughout the day.
Okay. Then it's dinnertime. And then probably one or two more of these.
Wow. I mean, every now and then I grab a handful of nuts. Like I just did right before we started.
So yeah. Yeah. Right on. Nuts are good. Okay. So you're, you, you're a healthy eater then it seems.
I mean, I guess so.
Nuts are your snack versus a candy bar or, you know, let's say some gummies or something like that.
I don't know.
Don't get me wrong.
Like if we had candy in the house, it'd be gone.
Actually, when I went down to the cupboard just a second ago, my wife had chocolate covered almonds in there.
I'm like, what is going on?
Like if I knew about these, they'd be half gone by now.
Oh, man.
Yes.
I am a sucker for chocolate-covered almonds or pecans.
Yeah.
I prefer pecans because almonds, you know, they can crack your teeth, you know.
Yeah.
Unless they're roasted and they're like a little soft.
I just had chocolate-covered pecans.
No cashews.
Chocolate-covered cashews with sea salt.
The first time ever last week.
And I was like, these are so great.
But yeah, I've got a thing for candy.
I definitely have a sweet tooth.
Like, that's my wife.
She's like, oh, my gosh.
Oh, yeah.
She has to take it away from me.
Let me make a recommendation on pecans just in case.
Now, do you say pecan or you say pecan?
Pecans.
I see.
I do, but I don't enunciate it like you do.
So I say pecan.
Pecan.
Okay.
Yeah, yeah, yeah.
So is it pecan pie?
Yeah, pecan pie.
Yeah, no.
It's not pecan pie?
Because some people say pecan pie, and those are not in Texas.
No, pecan pie.
I live here in Texas.
But I'm a Yankee, as they say.
I'm from Pennsylvania.
And so, you know, I never really cared how you said pecan, but Texans really care.
And so you can't say pecan pie.
You have to say pecan pie.
Wow.
I thought pecan would have been a Southern way of saying it.
It's Southern, but it's not Texan.
Oh, gotcha.
So Texas is South, but it's not Southern.
That's right.
Southern is kind of like, you know, Louisiana and east, and not Florida. So basically Louisiana, Alabama, Georgia, the Carolinas, you know, that's what is considered the South. Kentucky, of course, Tennessee, those are the Southern states. Texas is South, but not quite Southern
in that regard, at least. No, it's funny. You mentioned that because like Indiana, I feel like is like half Southern.
So I'm from Indiana originally.
Right.
I'm from the northwest, near Chicago, but Indiana, you know, is, you know, this long.
Yeah.
It's like nine, nine hours to drive through it or something crazy like that.
I don't know.
But some people in Indiana have a Southern accent, and like, growing up, you know, some people had the Southern accent, and I'm like, dude, you live like two
blocks from me. How did this happen? You know what I mean? Like, right. Like it's, where did
it go wrong? No, no, I'm not saying it went wrong. I'm just saying like, you know, somehow
my family got the, you know, the, the, the Midwest, you know, the Chicago style accent
and two blocks down, some people got a Southern accent.
So, I mean, it probably has to do with their upbringing and their family.
But that's how diverse accents are in Indiana.
Because as you get around Indianapolis, you're either north or south.
I think that's the border.
Well, let me make a recommendation on these pecans.
Because I say pecans. and then we'll get into
some home lab stuff and maybe some ai home lab stuff we'll see you know if we get a little crazy
around here uh the brand i want to recommend is a texas brand it's a family operated brand called
Berdoll, B-E-R-D-O-L-L. They're, like, the best pecans you'll ever have. You can just get the pecans for Christmas
if you want. You can get a bag
or you can get the chocolate covered ones. You know, whichever
flavor you want to go with, but their
pecans are legit
the best pecans ever.
This is not sponsored, just a
tried and true, loved
beloved Texas brand called
Berdoll.
I'll have to check it out.
Damn you, Buc-ee's, for dropping them.
You know, Buc-ee's is a big...
Have you ever heard of Buc-ee's?
No.
Oh, my gosh.
Well, I can't take you there in this podcast.
We'll have to do it as an after show or something like that.
But Buc-ee's is not a gas station.
It is a destination.
Let me just say.
It's on most Texas highways.
It's a beloved Texas brand.
Buc-ee's.
Tim, I'm going to teach you some things, man.
I'm going to teach you some things.
All right.
Let's get into the meat of the matter.
You know, we're friends.
We're talking about Home Lab.
We're talking about some creator PC stuff.
Maybe some Linux workstation.
Some AI.
You know, you just did a hardware tour, a software tour,
what's practical to run as AI in your home lab slash home, whatever.
Where should we begin?
Should we begin at the state?
What is the state of home lab as you see it going into 2025?
Is it growing?
Is it stagnating?
Is it more diversified?
What's the state of Homelab?
No, I think it's growing.
And I think I said this last year
that I think it's growing.
And I think last year I said
that we'd see a trend towards mini,
you know, more mini PCs.
And I think that's right.
And we definitely will.
And we are still seeing it.
So I think it's still growing.
I think the trend is growing because a lot of people are saying, you know, either they want to host stuff at home. You know, I want to have a media server. I have this extra compute. What else can I do? What else can I throw on there?
And then all of a sudden you have a server rack in your basement. So I think it's still growing.
I think it is. And, um, you know, Geerling just a little bit ago, Jeff Geerling, you know,
had a video on, um, you know, 10-inch mini racks, which I have a couple up there behind me. And, you know,
I think that kind of opened up the doors for a lot of people to say like, hey, you know, I don't,
I don't need a full-blown PC. I don't need a big server. I can, you know, have a whole entire home
lab in four U of rack space that fits on my desk. As you can see, I have two mini server racks back
there. And so I think, you know, I think that's going to be a bigger trend.
It already is a big trend over in Europe, you know, 10-inch racks.
But in the U.S., they're just starting to catch on.
Yeah, I'm kind of tired of the massive rack.
Major power, too.
You know, having to have this massive UPS to, to deal with it. I feel like it's,
it's, um, it was a good start, right? Homelab kind of began by bringing what was enterprise
to the home to play with it. That's right. Right. Repurposing older enterprise servers,
uh, older enterprise workstations, desktops, et cetera, because there wasn't a lot happening.
Now there's a lot of new cases out there. Fractal is going crazy with all their different designs. There's so many others out there that can compare.
silverstone is one of my other favorite brands i think when i was watching your um your linux build
i think we are using the same what do you call that case i guess silverstone rm42 502 i believe
is what it was um well What are you using for that?
Well.
We're jumping the gun a little bit, but.
Yeah, no, that one is just, I don't even remember the brand name.
Like, it's like a kind of not a popular brand for that circle.
It wasn't a Silverstone?
No, although.
They look very similar.
Yeah, no, it's actually right there behind me.
I'm actually going to probably move that into a Sliger case, and Sliger cases are super awesome. Um, you know, they're one of the few that focus on rack mount
systems and yeah i i agree like you know i i over indexed in the beginning on compute i thought yeah
i need you know used enterprise servers you, decently priced, cheap in general.
But, you know, most of the time that's way overkill on compute and power and everything else, heat, noise, you name it.
And so, you know, I reeled it back, I don't know, two years ago and moved to mini PCs for my Kubernetes cluster and some of the things that I'm hosting at home. And even before that,
I got 1U servers and kind of, you know, toned it back a little bit,
even before I moved to those mini PCs. So yeah, it's, it's, it's wild. But at the same time,
like me personally, I'd rather have my stuff in a rack. And having my stuff in a rack doesn't make it, you know, high energy or high power usage. Um, cause I have a lot of stuff in my rack right now.
And I think I'm like at 500 Watts, which is a lot, don't get me wrong, but nowhere near like,
you know, what it could be or what it was, you know, 800 or 900 watts. And so for me personally,
I like putting things in a rack because it keeps
things organized. You know, it keeps things where they need to be. You know, cords, you know, have
to go in a certain spot. Power has to go in a certain spot. These machines have to stack in a
certain way, you know, and when I close the doors on the server rack, there can't be anything hanging
out. So for me personally, I like to compartmentalize all of that in one space.
Like, you know, that's how my workstation is right here. I mean, I have a Mac studio
that's racked just because I don't want to have to deal with cords and all of that. I want it to
look nice and, and, um, you know, give it, give it, uh, I don't know, a proper place to put it.
Otherwise it'd be sitting on the floor or some shelf and who knows but i made a
mistake actually when i mentioned my silverstone i think it was not that you might know this because
you said it's not a silverstone but it's a silverstone rm44 that's actually my plex server
case it's got three fans in the front uh great ventilation you know just in case i spin up that
4k footage and need to transcode it to 17
clients, you know, all that good stuff, to manage the cooling of it. But it's a silverstone
rm44 and it's a 4u case and i'm the same i've got a my preference is rack mount really and if i can't
rack mount it it might look nice like for example the uh the fractal design
uh north a lot of people are liking that but i'm like can i get a rack mount version of that yeah
exactly you know i mean i would all day yeah all day are you there with me on that one oh yeah man
metal wood and stone is all you need right a little a little wood is just so nice yeah i mean
if it says fractal and it has wood in it i automatically like anything that
they post so yeah for sure yeah they they have great designers there i i like it yeah definitely
minimal do you pay attention to case designers i know a lot of people that like really steep
themselves you're a youtuber you're a creator uh do you get into that realm of like i pay attention to a case designer and when they go
from xyz company to fractal and they design their next case i'm on it is that the kind of person you
are you just sort of like nah no i just i just know what i like when i see it and uh and and
you know embarrassingly maybe i i don't know the names of any of the designers who do this stuff i
i i know that there's a great designer behind all of this stuff, but you know, for the most part, I just see the brand, you know,
where it lands. So no, I couldn't tell you any designer's name.
Well, good for you. You're not that deep then.
What is it that motivates you to do what you do? I mean, I think you, did you begin this as a
sort of a side gig to a hobby? Cause you were a software developer and you still are, but you kind of like inched
into this and, you know, maybe did it well and got popular and had some good thoughts.
And then you're like, man, this is kind of cool.
Did you just, is that kind of your story or what makes you get up in the morning and every
day make content?
Yeah.
So that, that is kind of how it happened.
That is kind of how it happened.
Like I've been doing this as a hobby for a long time.
Instead of home labbing, I'm starting to call it infrastructure as a hobby.
That's what it has been for me for a long time.
I did infrastructure for work.
But then, as you mentioned, got into software development.
I've been a software developer for a long time now.
And so now infrastructure is now my hobby.
So, you know, everything I do at work is cloud, a hundred percent cloud, a hundred percent, you know, virtual. Like I don't touch anything except for my laptop. And so I always have this itch for infrastructure, and it comes from two things. One, the media server that I mentioned earlier, but two, just having an access point and a router at home. I
mean, you know, anyone can look at their access point router and say like, you know, either they
love it or they hate it, you know? And, um, if they say they hate it, it's probably rented from, you know, whoever their ISP is. If they say they love it, they put some time into
it and bought their own. Um, and that's kind of where, you know, I like to play is like a little bit in the networking and somewhat in compute and storage.
And so so even if I, you know, weren't totally in, I, you know, as I am now, I'd still be doing this kind of stuff.
But the thing that motivates me in the morning to get up, like this morning, I thought, ah, I get to play with this stuff. You know, as soon as I opened my eyes, I thought,
sweet, did my print job complete on my 3d printer because I need that stuff for, you know, these
mini racks behind me. That's one thing I got into in the last six months, or actually two months: 3D printing. I'm in deep.
Is that right? Oh man.
Yeah, it's wild. But, um,
that's what gets me up in the morning. I think just having a variety of stuff to work on,
you know, if it, and if it's not hardware, it's definitely software. Um, you know,
whether it's writing software or using software, I'm in this, I'm in this kind of weird space where
I can write software and I do, and I build that software and run it
in my home lab. But I also use software from other companies, whether it be closed or open source
in my home lab. I have hardware that I built and scrapped together myself for custom builds.
And then I have hardware from, you know, vendors that's full solutions that I use. And so,
you know, I'm,
I'm kind of playing that world. So if there's a Venn diagram of all of that, that's, that's,
that's what I like is, uh, you know, hardware, software, open source, closed source, it doesn't matter. Closed source, it has to be a really compelling story and a good solution better
than anything out there. Um, but you know, that's why some of that stuff might be closed,
because you know they're making a profit and they're making a good product and i'm not afraid
to support good products so but yeah that's that's what gets me going is is really just um
playing with stuff and um thinking about you know what i'm going to tell people about the
stuff that i'm working on that's that's kind of what goes through my head.
Do you go as far as writing scripts too?
What's your process? Give me a one-minute, two-minute version of, like, how you do what you do.
Oh man.
Can you compress it into two minutes, or is it too hard?
You can cut me off if I go too long.
You can go far if you want to.
I was just kind of giving you just a limit.
I could talk all day. I used to wing it. I used to wing it. And that was fun, except for, you know,
I was messing up every other sentence and it got super painful to edit later. And so for me,
writing a script was more of an optimization for later. So I didn't have to edit longer.
And I noticed the more I messed up, the longer I'd have
to edit so I started writing scripts so I I have two or three ways that I do it sometimes I'll have
a script, sometimes I'll have an outline, depends on the topic, right? If it's a tutorial, the intro has a script, the tutorial an outline, the outro has a script. That's kind of how I work now. But there are other videos where I have a full script. So how does it start? Well, I have a list of ideas. Every video, I kind of vote in my head what's next. I have external commitments,
maybe from brands. I have the things that I want to do. And then I have the things that I
kind of think will do well. And then I have the things that are relevant,
you know, super relevant to right now. And, uh, so I have to kind of juggle that. Uh, and so I usually pick one of those. Uh, I start depending, depending on the topic, I'll either start
experimenting, uh, so I can think through what I'm going to say as I'm doing it, or I dive right into writing because I know what I'm going to
say. I'll film it. Uh, I do use a teleprompter if I need to. Um, and then, um, you know, I'll start
editing, uh, while I'm editing, I kind of think of what the thumbnail is going to be. And while
I'm editing, I'm trying to think like, are there pieces I can cut out, uh, to be more concise?
Right. Uh, because sometimes even if I review the script that I'm
going to record, there are still things that slip in that I think later on when I'm editing and I
don't need to say, and then, yeah, then, you know, then it's a, then it's a, you know, polish it up,
uh, put effects on it, uh, upload, uh, you know, you name it, uh, uh, subtitles, um, thumbnails,
uh, three thumbnails now for ABC testing. It's, uh, it's,
it's tough. Yeah. YouTube's YouTube's a lot of work and I'm not complaining at all. It's just,
it's more work than people think it is. Oh yeah, it is. Uh, it's a lot of iterating,
you know, and I think what you mean by work is, like, even the ABC testing, that's iterations,
like which one is the one that makes people really excited about this video or connects with the people that should watch it
are you um are you tapping into the iphone camera at all are you strictly staying like mirrorless or
dslr like what's your flavor of how you shoot you got any preferences um i use kind of whatever
works so i have i have well now i have
two cameras that i work with actually three so i'm doing top down kind of stuff it's it's back
there that one's mounted because i replaced it with with this one this is a sony mirrorless
FX30. Okay. I'm not a huge photographer. I realized that a lot of people that are in YouTube are, like, former photographers or former, you know, camera people. Not me,
I'm a tech person who turns my camera on. So anyways, uh, I have that. And then my iPhone,
my iPhone is actually really good at shooting B-roll. I have some videos where 90% of it's
iPhone and no one ever balked at it. Oh yeah. So yeah,
being able to,
you know,
record this easily,
you know,
as long as I have enough light,
it has stabilization built in.
Like no one even knows.
I know because I know what to look for.
But no one knows or no one cares.
And to be able to do that is game changer.
I really wish like,
you know,
mirrorless cameras or DSLR cameras would
kind of catch up and not so much on the optics because they're way beyond, but more so on
connectivity, like, you know, store stuff, automatically send it somewhere when I'm done,
or even do, you know, NDI, send it over the network somewhere to my nas so i don't even have to worry about you
know sd cards or anything anymore so i don't know i i'm i'm hoping cameras take a huge leap
at some point i mean they're they're already there like super you know expensive high-end
cameras can do all that i just feel like i don't know the interface is just so old whenever you
look at a camera the interface is just so bad and so old yeah i agree with that it's primed for disruption uh self-disruption hopefully i'd love
to see maybe even ethernet cable you know on a camera you know like connected yeah that's real
time bandwidth 10 gigabit you know 100 gigabit 50 gigabit whatever you got on your network
straight to your nas you know that, direct record like a computer.
It is a computer.
Why not be a computer?
I agree.
I agree.
And it would be awesome.
So if we're going to go down this path, make them as dumb as possible and then put software
on my machine so that I can push software to it.
Basically a controller, you know, you make, you make the camera dumb and then, you know,
and then make some kind of software smart.
So you can just push settings and push everything to that camera.
It just captures it and dumps the video wherever you want.
But yeah, similar to access points, right?
They're pretty dumb.
Use a nice GUI to configure them.
Push settings to it.
Software-defined network.
I want camera-defined?
No, I don't know. Software-defined cameras. There we go. There you go. Yeah, I'm with you on that. I'll vote on that
if you can get that bill passed yeah no the camera makers will not have it because then
they become a commodity right yeah they gotta control the market it's all about my processor
and my you know yeah i mean it's I mean, it's a game for sure.
Yeah.
I don't blame them, but.
Do you, one more thing on the iPhone style shooting.
Do you do just a straight camera application or did you do something else?
Like the Blackmagic app, I think, has a pretty sophisticated one where you can mess with the F-stop and shutter speed.
Like, do you go crazy with it, or do you just sort of open up the camera app and you're done?
Yeah, so for me it's, uh, camera app, auto, and I'm done. I'm simple. You know, the one thing I do use, you can kind of see it back there, is my gimbal, the DJI gimbal
back there, and they have their camera app built in. But you know, I don't change the settings. I just hit record and
it's auto, you know, it might, it burns me sometimes on, on, uh, the white balance and
stuff like that. But, you know, I started thinking about like, like my goal this year is to turn
around content quicker, right? And so I need to stop sweating the small stuff. That's big for me. Like I was just talking about
white balance. No one is going to care about the white balance. And if they care about the white
balance being a little bit off, you know, they're, they're either super interested or maybe not the
target audience, you know what I mean? Right. Or it's that obvious. And so I need to stop sweating
the small stuff on, on a lot of my videos and just get them out quicker.
You know, that's a trend I see too.
Like people are starting to gravitate more and more towards less polish, you know, more informal, more spur of the moment type videos, you know. And so I kind of do a bit of all three.
And so, you know, just trying to figure that out for this year.
I just want to be able to turn videos around a lot quicker.
And I think the way that I do that is to make a lot more informal videos, you know.
Yeah, I'm with you on that.
I think, as I mentioned in the pre-call, and I might have said it on the show, I'm not sure, that this year, 2025, for ChangeLog proper is video first.
We're taking our podcast full length video.
That's ChangeLog News. Our two flavors of our, you know, large long, long form show, both on YouTube full length.
Our news show is nine minutes or less, and it's, it's already hitting close to 2000 views. Like
2000 views is pretty good. We just started this year and it's, it's approaching 2000 views for
that news show consistently. Our shorts and clips, those things are always like in the hundreds on the, you know, maybe
sub, sub one thousands.
There's some that are breakout hits in the 50, 60 thousands.
We've got one video out there from the, we're gonna actually recycle, uh, because it's just
that good about AI and IP law and stuff like that.
And music, it's, it's got all the right touches and feels, you know, that one has done more
than a million views.
Yeah.
So you got this really great piece of content.
Let's just say, what's in my hardware rack for 2025?
Let's say that's what you've done, right?
Let's just hypothetically, maybe you've done a video like that.
You put a lot of work into that video.
Maybe it's 20-ish minutes.
You got good chapters in there.
You got all the right things.
You don't just leave it there though, right? You can turn your camera, your iPhone camera, let's say,
vertical and do a short that is a side promotion of that content oh yeah a little informal you
know i feel like we need like this hub and spoke mentality you got this hub which is this larger
longer form you know kind of thought out content that's chaptered and,
you know, linked up in the description. Then you get these sidecars, you got to go on LinkedIn,
you've got to go into the shorts, you've got to pull some clips from it. Maybe you got to find
some way to promote the, the, the hub, do some spokes to promote the hub essentially. Is that
some things you're doing? It is, it is absolutely. And yeah, I agree. Shorts, you know, are always for me, super informal, right?
Sometimes I might write from a script just because, well, I used to only have 60 seconds.
Now you get three minutes.
YouTube finally, you know, folded into what reels and TikTok's been doing.
So you get three minutes now.
Yeah.
Which I think is awesome because before I was like trimming off seconds, you know, half seconds between words just to get it into fifty nine point nine seconds just in case, you know, because then they wouldn't consider it a short.
But but yeah, that's exactly what I do. I you know, I have my long form polished content on YouTube proper. And then I take it even one step further. I live stream on Twitch every Saturday, right?
Totally informal, totally Q&A, AMA style. I never know what's coming. I promote that other places.
And then, yeah, absolutely. I upload my videos to different, LinkedIn, X, Facebook, you know, you name it, um, upload those there as well.
And then, uh, you know, Instagram reels too. And, um, you know, and then there's content all the
time that, you know, I'm, I made from, uh, that week of, of, of shooting the video. So yeah, it's,
it's, it's kind of weird. Um, because you just, it's, it's a lot of sharing stuff, right?
And hoping that stuff sticks or gets shared again.
And it's kind of weird, you know, I don't know.
For me, it's sometimes it gets kind of weird because it's, it just feels like sometimes,
like, and I don't want people to ever think like I'm showing off, you know what I mean?
And that's kind of what, that's not me at all. Like I never want to show off stuff,
but if you see my YouTube stuff, you're like, that dude's a show off, you know,
what the heck is he talking about? Why is he showing me all this stuff? You know? Um,
I, I mean, maybe, and I, you know, some people have, have, have thought that who didn't know
who I was and didn't know that I do all of this,
but maybe just saw my video, the vertical one for the first time. Um, and so it's, it's just a weird,
it's a weird space for me to be in. Uh, cause I, you know, it's, it's, it's like self-promotion
on, on steroids all the time. And, you know, in my, my private life, that's not how I am.
Yeah, I do. I do feel that angst as well, you know, honestly. And there's, there's two pieces
of, I wouldn't call it advice, maybe just conversation. One is give them what they came
for. That's one of our pillars. Let's just say, you know,
core belief, so to speak around here, right? Give them what they came for. That's got a lot
of things like don't make the intro so long that it takes too long to get to the content.
Time to content is essential. You know, whatever is valuable, the hook, et cetera, you know,
give them what they came for. I think people come to you and they want that, you know, start of the year. They want that maybe mid year and, uh, kind of content from you, like,
which is show off content, show me what you've got to show me the choices you're making,
which is like that show off feeling, but really you're just giving them what they came for,
right? You got an audience who cares about your opinion and you kind of have to show off. And so I kind of feel that for you. And you know, when you do that, if your default gear in life as a
human being and your persona and your personality is not a show off personality, then that's going
to be a bit foreign to you, right? You're going to feel a little icky. You're going to feel a
little too self-promotional and you get some haters, right? And my second piece of advice is let them. Now that's not my thing. That's Mel Robbins. Do you know Mel Robbins?
No, I don't.
Oh my gosh. She's awesome. She wrote this book. This is non-sponsored. I can't wait to read it.
I've heard talk about it. I've been paying attention for a long time, but I don't know
how to describe Mel Robbins. I think she might be a psychologist. She might be a
motivational guru. I'm not really sure, but she has been through the ringer, let's just say in life,
and she's bounced back and she's got lots of life advice because of the way she's changed.
And she has this new book out called Let Them. It's called the Let Them Theory,
as a matter of fact, is what this book is called. I haven't read it yet, but I identify with the
premise, which is you got some haters. You got some people want to say
something about you that, Oh, Tim, yeah, you're just showing off. Let them, somebody doesn't like
you. Let them, let them not like me. I don't care because we care so much as human beings more than
anything about what other people think about us. And the reality is in almost every case,
they're not thinking about us at all. They think about themselves.
Right.
And they want to hate on you.
You want to change your ways or not show up for your audience or show up in the ways that you want to, to, to just be the, the person you are on YouTube or the different places you're at and be the human being you are.
Someone, someone wants to hate on that, man.
Let them, let them, let them hate on it.
Yeah.
You know?
Yeah, that's good, man. Yeah, I like it. It's, uh, you know, I just have that internal struggle. And even if people didn't say anything, you know, personally I'm just like, you know what, would I walk outside and be like, hey guys, come look at all this stuff I have, you know, running
in my basement you know that's kind of how I feel like sometimes on social media. And it's like, ah, yeah, you know, but, uh, but no, it's, uh, it's awesome. No
complaints whatsoever. Cause man, uh, you know, most days of the week I get to wake up and do
whatever the heck I want, you know, and it's awesome. What a joy, right? What an absolute
blessing. And yes, it's almost like if someone says, Tim, how's your day? Well, am I healthy?
Is my wife healthy? Y'all don't have kids, but I have kids. Are my kids healthy? Is my dog even,
you know, like, is my dog healthy? Are my people healthy? Do I have the opportunity to do what I
want to do this day? Man, it's a great day. That's a great day, right? I agree. I agree. Yeah. I've
been blessed for a long time with people being
healthy in my family and at the same time, having so many opportunities that I've had.
Well, at the risk of getting too deep in the details here, man, let's truly talk about HomeLab.
I feel like for me, there's a couple of things that I've been resonating with.
I would say the middle of last year, the tail end of last year, and the beginning of this year.
One, I'm like desperate to create a creator PC, but then I think I got to put windows on this thing and I'm just like, forget it.
You know, I want to build it and I want to, you know, to tinker with all the parts, the GPU, the CPU, the motherboard, all the parts.
Right. I want to have fun building it.
I want it to look cool.
I want it to be a showpiece, but I also want it to be performance.
But then I'm like, man, Windows?
So I'm a Mac guy, true and true.
So I'm like, I can go buy a Mac mini that's basically like,
sure, it's overpriced on some of the hardware,
but I don't have to worry about drivers and BIOS and configuration, which I don't mind doing in the Linux world, but forget it on a Windows PC.
That's resonation one: I really want to build a creator PC. And then my second thing is, we've talked around the software of artificial intelligence on our show for years now.
I mean, we have a show called Practical AI.
This part of the network, it's been doing it before it was artificial intelligence. It was data science then,
machine learning then. It was not artificial intelligence. Now it is. That's how long we've
been steeped in this world of AI, you know, pre-GPT, anything really. But I really want to run
AI in my home, like a lot of people. And I think thanks to Ollama, Open Web UI,
and a lot of these advancements around open source models, or at least openly available models, whether truly
open source or not, I feel like has now crossed that chasm that I'm compelled to build an AI
home lab. And right now I have one. It's so embarrassing. It is an Intel NUC. It runs all on the CPU because there is no GPU in it.
And Ollama will not recognize the iGPU, at least on that thing.
So it's just straight up CPU.
I mean, I can run some, you know, 1.5, maybe 5 billion parameter models.
Okay.
But anything beyond that is just like, forget it.
Right.
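As a rough illustration of that kind of CPU-only setup, here is a minimal sketch in Python. It assumes Ollama is serving on its default local port (11434) and that a small model has already been pulled; the model name below is only a placeholder for whatever small model you happen to have.

```python
import json
import urllib.request

# Minimal sketch: ask a small local model a question over Ollama's HTTP API.
# Assumes `ollama serve` is running locally and a small model was pulled
# beforehand (the name below is a placeholder, e.g. a ~1.5B parameter model).
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "qwen2.5:1.5b",   # placeholder; swap in whatever small model you pulled
    "prompt": "In one sentence, what is a homelab?",
    "stream": False,           # return a single JSON object instead of a token stream
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

print(body["response"])        # generated text; works on a CPU-only box, just slower
```

On a CPU-only machine like that NUC, the same call works; it is just slower and effectively limited to the smaller parameter counts being described here.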
So those are my two resonations is like creator PC and some form of AI home lab.
I feel like we're going to self-host AI in perpetuity at some point.
Someone's going to crack the nut on this machine.
So those are my two subjects.
Which one do you want to talk about first?
Let's go with the AI one because that's, you know, I mean, yeah, let's go with that.
I mean, so I've been doing it for a little while and it doesn't take as much as you think it does.
I mean, you've already been doing it and you've been doing it with this small 1.5 billion parameters and it's running okay.
And for the most part, you can run it in your home.
I mean, you talked about Ollama, which is fantastic, which kind of opens the gates for all of these other LLMs and gives you an easy way to swap them out.
And then OpenWeb UI really takes that to the next level where it's like, okay, now I have
GPTs and helpers and basically UI to do all this stuff with Ollama.
And so you could do it very easily.
If you have an old gaming card, if you had an old gaming PC that's, I don't know, 20,
30-ish series RTX. Perfect. You know, um, it's
probably only going to have about eight gigs of Ram, uh, but that's more than enough for some of
the smaller LLMs. And, uh, I think that's the most important thing I think in GPUs is, is your VRAM.
Uh, because if it's, if it's small, I can't fit the whole model inside.
And so it's going to be paging that.
And so you want to look for cards that have enough VRAM to fit models in them.
And so there are expensive options like even 3090, 4090, 5090,
but there are budget options.
I think the 3070, so in that Linux workstation build video I did,
and that was focusing on AI or really LLMs.
I wanted to build a workstation to play with them.
I think 3070 is the budget pick right now.
One, because it has 12 gigabytes of VRAM.
And two, because the price.
And so when I say budget, we're talking sub $300.
That's still a lot of money, don't get me wrong.
But it's not the $5,000, $8,000 that you're going to pay for cards where the compute end doesn't really matter.
And so with 3070s, if you can find one, especially used or something, you can load big LLMs on there.
And not only one, you could run multiple.
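To make the VRAM point concrete, here is a rough back-of-the-envelope sketch of the sizing math being described: quantized weights are roughly the parameter count times the bytes per parameter, plus some headroom for the KV cache and runtime. The numbers are approximations for illustration, not vendor specifications.

```python
# Rough back-of-the-envelope estimate of whether a quantized model fits in VRAM.
# Weights ~= parameters * bytes per parameter at a given quantization, plus
# headroom for the KV cache and runtime. These numbers are approximations.

BYTES_PER_PARAM = {"fp16": 2.0, "q8": 1.0, "q4": 0.5}  # common quantization levels

def fits_in_vram(params_billion: float, quant: str, vram_gb: float,
                 overhead_gb: float = 1.5) -> bool:
    """Return True if the quantized weights plus overhead fit in vram_gb."""
    weights_gb = params_billion * BYTES_PER_PARAM[quant]  # 1B params at 1 byte ~ 1 GB
    return weights_gb + overhead_gb <= vram_gb

# Example: an 8B model at 4-bit is roughly 4 GB of weights, so it fits on an
# 8 GB card; a 14B model at 4-bit (~7 GB) wants 12 GB; 32B wants a 24 GB card.
for size in (8, 14, 32):
    print(f"{size}B @ q4 -> 8 GB: {fits_in_vram(size, 'q4', 8)}, "
          f"12 GB: {fits_in_vram(size, 'q4', 12)}, "
          f"24 GB: {fits_in_vram(size, 'q4', 24)}")
```

The takeaway matches the point above: more VRAM lets you fit a larger model, or several smaller ones side by side.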
So I've been running LLMs at home, uh, for a little while now. Um, I've been running,
you know, open web UI, uh, to kind of have my own, you know, local chat GPT, if you will.
Um, and I've also done some stuff, you know, even as simple, um, maybe not so simple, but,
uh, voice transcription, you know, with Home Assistant.
And now Home Assistant has, you know, an AI voice assistant. But prior to that,
I hooked up Ollama to Home Assistant to make Home Assistant even smarter. And, you know,
the most basic example is, you know, I could ask home assistant, you know, how many
lights are on in the house? You know, a simple question. Home Assistant, prior to me hooking it up to an LLM, would say, I don't know what room that is. You know, it has no idea what I'm even saying.
So then I hooked it into Ollama, use any model you want really, honestly, and then asked that same question.
And it said, you have 17 lights on in your house.
And that just blows my mind. Hopefully most things we run will let you
plug in your own, you know, model or engine API endpoint, if you will, because that's
what it is.
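As a hypothetical sketch of the idea behind that lights question, the pattern is to hand the model the current device states as context and let it answer in plain language over Ollama's chat endpoint. The entity names and model name below are made up for illustration; this is not Home Assistant's actual integration code.

```python
import json
import urllib.request

# Hypothetical sketch: ground a local model with current device states so it
# can answer questions like "how many lights are on?". Entity names and the
# model name are illustrative; this is not Home Assistant's integration code.
light_states = {
    "light.kitchen": "on",
    "light.office": "on",
    "light.garage": "off",
    "light.basement_rack": "on",
}

payload = {
    "model": "llama3.2",  # any local model pulled into Ollama
    "messages": [
        {
            "role": "system",
            "content": "You answer questions about this smart home. "
                       "Current light states: " + json.dumps(light_states),
        },
        {"role": "user", "content": "How many lights are on in the house?"},
    ],
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",       # Ollama's chat endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    answer = json.loads(resp.read())["message"]["content"]
print(answer)  # e.g. "Three lights are on."
```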
And that's yet another thing I want to get into a little bit more is, you know, I can do this stuff in the GUI with open web UI and, you know, mid journey and generate graphics.
But for me, the next piece, besides trying to train, is having an LLM API, a REST API, that's on Ollama, right? So I can feed it text and do sentiment analysis or
ask it to summarize stuff, you know, through an API endpoint, which then I can build any tools I
want in, you know, in the code I want to write and then have that backed with an LLM of my choosing.
And so that's where I think it's super powerful. And that's why the whole DeepSeek thing is kind of blowing up too.
But being able to do some of this, and I don't even want to say at home.
I just want to say being able to self-host your own LLM, whether that's in the cloud,
in your home, at work, but having control over that LLM and not going out to open AI and using their API, but using your own API on your own model that's trained or maybe a public open source one is, I think, going to be a huge game changer for a lot of people, for a lot of companies.
A lot of companies first, I will say.
But it will impact all of our lives as it already is.
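For a sense of what building your own tools against a self-hosted model can look like, here is a small sketch using Ollama's OpenAI-compatible endpoint, so code written in the usual hosted-API style can point at your own box instead. The model name and prompts are placeholders, and the summarize and sentiment helpers are just illustrative wrappers.

```python
import json
import urllib.request

# Sketch: the same self-hosted model behind your own tools, via Ollama's
# OpenAI-compatible endpoint. Only the base URL differs from a hosted API;
# the model name and prompts below are placeholders.
BASE_URL = "http://localhost:11434/v1/chat/completions"

def ask_local_llm(system: str, user: str, model: str = "llama3.2") -> str:
    payload = {
        "model": model,
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ],
    }
    req = urllib.request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]

def summarize(text: str) -> str:
    return ask_local_llm("Summarize the user's text in two sentences.", text)

def sentiment(text: str) -> str:
    return ask_local_llm(
        "Classify the sentiment of the user's text as positive, negative, or "
        "neutral. Reply with one word.", text)

if __name__ == "__main__":
    print(sentiment("The new mini rack fits perfectly and runs silent. Love it."))
```

The point is the one being made here: the tool code stays the same whether the model behind it is hosted or running in your rack.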
Yeah.
I think on-prem AI is the necessary next step.
And we, I mean, I would potentially even pay for a license.
Like if OpenAI is the winner or if DeepSeek is the winner, like whoever is trending. Like we choose Intel, we choose AMD, we choose a brand. So whatever brand we as a
society or techies or geeks want to choose like if it's not going to be open source like literally
where i can download it myself i would love it where i can at least license it you know like
you would software, and say, okay, well, if o3 truly is the best or o3-mini is truly the best,
and you can give me an on-prem version and I have a licensing scheme or
something.
And I know that you're not hoarding my data or sniffing my stuff or training
other things on top.
Like if you can give me some,
some agency,
then I'm,
I'm for that too.
My preference really is,
but you know,
it's hard to have this as a hard preference,
because I know how much money goes into training these models.
There was speculation that DeepSeek was only, you know, a few million,
which everyone was like, there's no way.
How did they do this at such a cheap cost?
Well, that was actually just the GPU costs.
That was not the true actual cost of training it,
which was really speculated to be truly in the billions,
similar to OpenAI. So that's the one major trend this year. And thanks to Jared, my business
partner and co-host on the show, because he shared that in ChangeLog News on Monday, which was,
you know, really, I love that show. I love paying attention to our own content. I'm up on the
latest with Dev News when I listen to ChangeLog News on Mondays, every single Monday, by the way.
Well, friends, you can now build invincible applications thanks to Temporal, today's sponsor.
You can manage failures, network outages, flaky endpoints,
long-running processes, and so much more, ensuring your workflows and your applications never fail.
Temporal allows you to build business logic, not plumbing.
They deliver durable execution and abstracts away the complexity of building scalable distributed systems and lets you focus on what matters, delivering reliable systems that are faster.
An example of this is Messari.
They are the Bloomberg for crypto.
They provide market intelligence products to help investors navigate digital assets.
And they recently turned to Temporal to help them improve the reliability of their data ingestion pipeline.
And this pipeline collects massive amounts of data from various sources.
And then they enrich it with AI.
This process previously relied heavily on cron jobs and background jobs and queues,
and the design worked well.
However, these jobs were difficult to debug at scale
because they needed more controls and more observability.
And as they looked to rethink this ingestion flow,
they wanted to avoid cron jobs, background jobs, queues.
They didn't want to create a custom orchestration system
to oversee and to ensure these jobs and work was being done reliably.
Here's a quote.
Before Temporal, we had to code for dead letter queues,
circuit breakers, etc.
to ensure we were resilient to potential system failures.
Now we eliminate these complexities.
The headache of maintaining custom retry logic has vanished by
using Temporal, end quote. So if you're ready to build invincible applications and you're ready
to learn why companies like Netflix, DoorDash, and Stripe trust Temporal as their secure and
scalable way to build and innovate, go to Temporal.io. Once again, Temporal.io. You can
try their cloud for free or get started with open source.
Once again, temporal.io.
I want to self-host.
I'm seeing the future of where this is going.
I feel like I want to self-host AI.
And I want to go as far as I want to have quad GPUs.
Okay, Tim, I want to go,
I want to build an open rack with good ventilation.
I want to like, you know, I want to deck it out.
I'm seeing the future where this is going.
And that build is like 4,000, maybe five, you know,
four or $5,000 for that kind of build.
But so far, all you can do is do some of the things
that you're doing now,
which is like home assistant automation and stuff like that. What do you think about this
world where we're going to eventually have an appliance maybe even where we're self-hosting?
I feel like someone's going to be like, let me simplify this for most of these home users. Let
me one single button this thing, because we're not all techno Tims and Adams out there. Like
they're not going to build these machines. They may pay the price for it, but I kind of feel like the next major trend, even for non-geeks like us, people who are just like
everyday folks, they're going to eventually get to the point where they're saying,
you know what? I want to have some agency over my AI and I want to have an appliance in my house
that I know I can trust. What do you think about that? Yeah, no, I agree. I agree. And I see this slipping in a little bit. So it's starting to
creep in. So a lot of, I guess, NAS, I shouldn't say a lot, but some NAS vendors are now, you know,
positioning their NAS as, oh, it can also do AI because they can put a video card in there,
run an LLM and there you go. Although, you know, that's still, you know, developer mode for most people.
But, you know, if you install Open WebUI, Ollama, and it comes in a nice package in a, you know,
and a Synology or whatever, whatever NAS you want,
it has a video card in there and it's like, okay, cool. You know, that,
that works, that's a full product. Right. And so I, I think that, you know,
people wanting storage at home and also, you know, those companies seeing that, hey, AI at home is cool too, I think it's going to start going in that way. But I will say that, well, also NVIDIA had that, whatever it was, I forget what it is, H100 or whatever it is, it basically has like crazy-
That tiny thing?
Yeah, basically like crazy, crazy, you know,
compute for,
for a reasonable price.
I will say it would like destroy anything.
An H100?
I can't remember.
I think it was a Jetson nano.
Is that what you're talking about?
No,
no,
this is something different.
Oh yeah.
They announced this,
I don't know,
two weeks ago.
It was right before they had a press conference right before I think CES.
Okay.
You keep talking.
I'll do some Googling.
So there's that.
But again,
this is like a developer tool,
right?
And so it's going to take developers to do this,
but that thing is small and compact.
And so before you go spending,
you know,
four or 5,000 on GPUs,
I'd say,
you know,
look at that first.
Because this is, like, compute on steroids for anybody to do it themselves.
And it's, it's, it's way way beyond anything you could possibly build for the money. But I still think there's going to be, well, two things.
I still think it's going to take a killer app. It's going to take a company to put it together
in a package and create full product, right? It's not going to be, you know, hey, use this software along with this video card
and put it in this hardware
and then you're going to have AI.
No, I think it's really going to take a company
that's going to do full product.
And, you know, that could be anyone.
It could be companies we know already
or it could be companies that are, you know,
just starting out.
But I will say though that, you know, I think where this becomes, you know, just starting out. But I will say though, that, you know, I think
where this becomes, you know, super useful as soon as we get, I think we have it now, action support,
you know, action support is really going to like change our lives even more than just LLMs already
have. So if I'm able to tell, you know, a helper or a GPT to go do something for me,
that's where this is going to be huge. And I think ChatGPT just announced something about Operator. Yeah. Yeah. So they're already getting action support. And so I would love to
be able to, like I do in home assistant say, Hey, you know, turn on these four lights or turn off
these four lights. I would love to be able to say things that, you know, are kind of tedious to do,
but I want them done. And maybe it's, you know, I don't know, maybe it's Google and Gemini, but,
you know, hey, you know, I don't know, summarize my emails. Tell me the most important ones right
now. You know what I mean? Or do I have any emails that, you know, are super important or
critical that I should look at? I mean, this is all business use case.
But if you think like imagine if you had that helper on your desktop and this might be going too far for some for some people.
Right. But imagine if you had software running on your desktop that was your helper.
It was your AI helper. Jarvis.
Yeah, I mean, I mean, well, yeah, but I mean, like, it could do things. Like, it could launch VS Code.
It could launch whatever.
It could go and patch your server.
Like, this is getting probably too techie.
But there are so many implications of, like, having a helper and action support be able to do things for you that I think it's going to drastically change stuff.
Like, even right now with Home Assistant, well, Jarvis can, I think.
But when I hooked up the LLM to Home Assistant, it couldn't do any action through the LLM
because I don't know if they're awaiting on action support.
They were like, yeah, this only works with Google Home and something else.
I'm like, wait, like you can't turn off my lights on my thing because I told you to, but I have to go
through Google home to tell Google home to do it. Kind of silly. I'm sure they're working it all out,
but I don't know. And you know, I'm just thinking of just tiny actions here, you know, commands.
I'm sure the list goes on and on and on of things that people can think of or do.
Um, that I think, I think, I don't know. For me personally, I'm excited about that.
Having a helper to say,
yeah, go do these couple of things really quick
and let me know when you're done.
Because you probably can relate.
I'm a person of one.
You're a person of one.
You have a show and a company to run.
And I have a channel to run along with work on the side.
And I can use
all the help I can get.
And so if I can rattle off a few commands to an AI that I trust to do those things for
me and do them right, a lot of stipulations there, that would be great.
That would be great for me.
And so I'm really looking forward to, to some of this because yeah,
as you know, like, you know, I could use more help. And so far, having an LLM like ChatGPT, for me personally, or even a local Llama, has been, like, super helpful for the
person of one, uh, a creator, uh, just to have some ideas, just to have like an assistant, you know, that understands
everything you say to it, that has maybe, maybe or maybe not the same perspective that you do.
To be able to ask it questions and get feedback on exactly what you're doing is, has just been
like game changing for me personally. Yeah. It's, I don't even Google anymore half the time
because
I don't know if I should go this far, but I'm
just kind of rambling about AI, but I
rarely Google anymore for things because
I don't want to see advertisements.
I don't want to see Google's
Gemini slowly typing things.
I don't even want to click on the link
that they suggest, even if it's right,
because then I'll have to sift through that website to find what I want.
And so I've gotten a lot more efficient by using GPTs because I can get, for the most part, a really good answer really quick, and I don't have to shift focus.
I shift focus once to say, chat GPT.
I'm back to doing what I was doing.
If I go to Google, so I went to Google.
Now I'm getting ads.
Now I'm clicking on this thing.
That wasn't the right link.
Oh, it took me to Reddit.
What are all these people saying?
Let me fish through these comments.
No.
You know what I mean?
And so like-
I'm so with you.
There are so many chances for it to steal my focus.
And that's marketing in general.
That's what they want. That is exactly what they want. And so I'm glad that ChatGPT is here to
kind of disrupt that because I don't want to be served up advertisements. I just want to focus
on what I want to focus on. And if I need some help from an expert, that's what they're there for.
That's what I feel like.
I think it is the next frontier.
That's why I bring it up here in this, you know, home labby friends conversation is because
I feel like once you have, you know, I know the GPUs are even hard to find, right?
Like they're hard to find.
They're expensive.
I feel like it's a racketeer system.
Like there's something happening there where, I get it, everybody needs some GPU to do the next big thing they want to do. But you can find them on eBay pretty easily. 3090s are in the $900 to $1,500 range for a decent used one. And if you know how to eBay, then you won't get scammed, or you won't buy the wrong one or buy somebody's junk, basically. I'm thankful that eBay has gotten
better at weeding out the poor sellers. And you still will have someone who doesn't know how to eBay well
buy from a poor seller. They don't understand how to use the ranking system or look at feedback to
make sure they're a proper seller. And even that is still hard. You can still kind of not get
scammed, but you just buy something that's less than what they say it is. Thankfully,
the eBay guarantee, this is not an ad for them, but I use eBay a lot for aftermarket products.
I usually win auctions I involve myself in.
I can tell you how I do that if you want to know.
But you've got to learn how to look at a seller and evaluate it well.
The eBay guarantee does say, like, if they say it's new or it's an open box, or if they misidentify what it is, let's say they say it's a version two but it's actually a version one, well, the eBay guarantee protects you as a buyer from that. And eBay has gotten so much better at enforcing this that, for me, it's a fairly trustworthy place to buy things. Albeit you can still buy the wrong thing from the wrong person and maybe have some challenges getting your money back or getting a replacement. But my adventures in eBay lately have been mostly positive. Mostly positive. And that's my caveat to say, like, GPUs are hard to access, but I feel like the next frontier really is, okay, I can self-host. Even the geeks like us, for now, we can begin to flesh out this world of running AI locally on a decent box or a really beefy box. To me, that seems like the next frontier.
Ollama seems to be the centerpiece of enabling most, if not all of this, because it's the vehicle. It's the index online, ollama.com. You can go there and find the latest models. You've got downloads, you've got the parameters, you've got clipboards where you can copy and paste to your terminal or inside of Open WebUI. They're making it really easy to find models you can play with locally. And the API that Open WebUI offers, or even Ollama offers, like that whole API scenario, being able to tap into Home Assistant, like your video on that opened my eyes up, because I was like, wow, you can now voice command Home Assistant where once before you could not.
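(For reference, a minimal sketch of what that local API scenario can look like, assuming Ollama is running on its default port 11434 and a small model such as "llama3.2" has already been pulled; the model name and prompt here are placeholders, not anything from the show.)

# Minimal sketch: send a prompt to a locally running Ollama instance.
# Assumes Ollama is listening on its default port (11434) and a small model
# such as "llama3.2" has been pulled beforehand.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3.2",   # placeholder; use whatever model you pulled
        "prompt": "How many lights are on in the kitchen? Answer in one sentence.",
        "stream": False,       # return one JSON object instead of a stream
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["response"])   # the model's text completion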
And all you need is probably a really simple model that understands you, and it could be a very small parameter model. So maybe that Home Assistant box is a very small box. Maybe it's all CPU even, because it's such low-pressure AI. It's just, how many lights, turn on kitchen, you know. It gives me this smarts where the smarts were not there before, because the API is there. Then, for me, I'm thinking, as a person who wants
to simplify my time: we quote somebody on working with us at the sponsorship level or partner level every single day. And it is a time-consuming task, because I haven't found a way to automate it with the level of high touch I want to give everybody who works with us. I could easily just, you know, copy and paste from something, but I feel like everybody needs a little bit of extra attention and detail. I would love it if I can train an AI, or just RAG it essentially, and tell it, okay, these are our proposals, this is our pricing scheme, this is how we do things. And then I just tell it what I need and the format I want back, and it gives that back to me pretty much errorless.
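(For reference, a rough sketch of what "just RAG it" could look like against a local box, assuming Ollama's embeddings endpoint at /api/embeddings plus an embedding model like "nomic-embed-text" and a chat model like "llama3.2" already pulled; the document snippets, question, and model names are placeholders, not real proposal data.)

# Rough local RAG sketch against Ollama (assumptions: Ollama on localhost:11434,
# "nomic-embed-text" and "llama3.2" pulled; the docs below are placeholders).
import math
import requests

OLLAMA = "http://localhost:11434"

def embed(text):
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text}, timeout=60)
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# Your private "source of truth": proposals, pricing notes, past contracts, etc.
docs = [
    "Sponsorship proposals always include three tiers: partner, sponsor, supporter.",
    "Pricing is quoted per episode, with discounts for multi-episode commitments.",
]
index = [(d, embed(d)) for d in docs]

question = "Draft a partner-level proposal for a four-episode commitment."
q_vec = embed(question)
best = max(index, key=lambda item: cosine(q_vec, item[1]))[0]  # top-1 retrieval

r = requests.post(f"{OLLAMA}/api/generate", json={
    "model": "llama3.2",
    "prompt": f"Context:\n{best}\n\nUsing only that context, {question}",
    "stream": False,
}, timeout=120)
print(r.json()["response"])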
Pretty much.
I'm still going to review it.
And, you know, those are the really good applications I can see myself doing, but ChatGPT is not trying to optimize for that, for me.
You know, maybe I can do it.
Maybe they're trying to do it,
but at the same time,
now I got to give them all my data.
And sure, I know we've been Googling forever
and they've got all of our information anyways.
And sure, they know exactly where you're at in the world, according to Changelog News this past Monday. But I feel like this locally run, you know, privacy-focused scenario, with the things we're asking AI these days, is sort of the next frontier. And I'm hopeful for you, because you're steeped in this homelab stuff.
So people are probably just pouring into your channel
thinking, how do I run this stuff?
Is that what you're seeing?
Yeah.
You know, it's an odd topic on YouTube.
I'll say that.
It's an odd topic.
Odd?
Yeah, odd.
Yeah, yeah.
Because it has the potential to go, I think, either way.
Either people are burned out and don't want anything to do with it. Well, there are people who don't want anything to do with it. There are people who are burned out about hearing it. And then there are people who might want to hear about it and want to do it. So, you know, it's always like every video. It's a crapshoot talking about those things. But there are people. Yeah, absolutely. I mean, that for me, that video, you know, is performing pretty well. And people have asked and there are other creators talking
about this too, especially the whole DeepSeek thing. That really got people interested in self-hosting AI, because they wanted to figure out and play with this DeepSeek thing. And the easiest way to do that, without sending all your data wherever it would be online, was to host it yourself. So oddly enough, I think DeepSeek helped people realize that they can self-host LLMs probably more so than Ollama has ever done. You know what I mean? Because it did two things. One, it let people know that it's a possibility. And it also let people know that, hey, it's private. And those are two things that I don't think a lot of people knew were possible. And nothing against Ollama. It's fantastic. It's great. But it'd be hard to market that product or service in the way that DeepSeek marketed it. All the DeepSeek hype really was marketing for other stuff too, which drove so many people to get interested in this. But yeah, I think that's what people are going to do.
I honestly still feel like there's an opportunity for a company to put it in a bow. And there's
going to be that opportunity, I think, for a long time and do whole product. But in the meantime,
we can do this ourself. I just hope that companies continue
to allow us to plug in our own models. And one of the things that I talked about in that video too, I kind of hinted at it, but I use MacWhisper for my subtitles and stuff, to get them transcribed right, because I care about them. And I think a lot of people should too. That's my pitch for, you know, accessibility in general. You should care about them, because there are a lot of people who can't hear, or are hard of hearing, or just don't speak English, and they want to be able to read English. So anyways, MacWhisper is an app. It's freemium. You can get it, you can download it, run it on your Mac. It does great with the small models. And I paid for it. I paid for it myself, so I use it.
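(MacWhisper is built around OpenAI's open source Whisper models; a minimal sketch of doing the same transcription locally with the openai-whisper Python package looks roughly like this, with the file name as a placeholder and ffmpeg assumed to be installed.)

# Minimal local transcription sketch with the open source openai-whisper package
# (pip install openai-whisper; ffmpeg must be on the PATH).
# "episode.mp3" is a placeholder file name.
import whisper

model = whisper.load_model("small")    # the small models mentioned above work well
result = model.transcribe("episode.mp3")
print(result["text"])                   # full transcript

# Each segment carries timestamps, which is what subtitle files are built from.
for seg in result["segments"]:
    print(f"{seg['start']:.1f}s - {seg['end']:.1f}s: {seg['text']}")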
And no relationship to them whatsoever. But I kind of talked about them in the video, where I said it would be great if a lot of companies let you plug in your own endpoint or plug in your own models, you know. And I showed a screenshot of them only allowing ChatGPT. Well, sure enough, shortly after, now it allows Ollama. So I mean, I probably had nothing to do with it. It was probably on the roadmap. But that's what I want to see. Instead of application developers thinking, hey, how can I hook into ChatGPT so everybody can now use ChatGPT, think about Ollama and things too. Think about how you can offer that option.
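(A hedged sketch of what "plug in your own endpoint" tends to look like in practice: Ollama exposes an OpenAI-compatible API, so an app that lets you change the base URL and model name can point a standard OpenAI client at a local box instead; the URL, key, and model below are assumptions about a default local setup, not any specific app's settings.)

# Sketch: the same OpenAI-style client code, pointed at a local Ollama server
# instead of ChatGPT. Assumes Ollama's OpenAI-compatible endpoint on the
# default port; "llama3.2" is a placeholder for whatever model you've pulled.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # local endpoint instead of api.openai.com
    api_key="ollama",                       # required by the client, ignored by Ollama
)

reply = client.chat.completions.create(
    model="llama3.2",
    messages=[{"role": "user", "content": "Summarize why local models matter."}],
)
print(reply.choices[0].message.content)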
And I think that's a huge differentiator too.
If someone developing software right now says, yeah, we can plug into Ollama, you know, that's a differentiator.
Every software company right now is saying, yeah, we can hook into ChatGPT.
Just put in your API key.
Here you go.
You know, but I also think that they should be including Ollama, for sure. For sure, because Ollama can be in the cloud too, right?
You can spin up an Azure server.
I was even tempted, like, okay, if I want
to play with some H100
or just some sort of really thing
I can't afford or have access to
in terms of a GPU, I can go to
Azure. I can spin up even a Windows box, which is kind of crazy. You can spin up an actual Windows machine in the Azure cloud and play with it as if it's a local desktop.
You can, you know, VNC into it, whatever, remote desktop into it.
And you can install things on it.
Obviously, you can give it access to GPUs.
You can do a lot of cool stuff.
And you might spend a hundred bucks on that run because you're maybe renting a really expensive GPU, but it's better than the thousands you may not
have or want to spend on a GPU you can't even get access to, right? Yeah. Yeah. Yeah. That's for
sure. Yeah. But I feel like that's, um, that's kind of cool that you can do that. I feel like
Ollama is, and maybe thanks to DeepSeek... you know, we've been talking about Ollama for a while, but I've just never actually been curious enough to play with it. I just, for whatever reason, haven't, let's just say. But now I'm on this kick of,
like I mentioned, like I got two things resonating with me. I really want to build some sort of
AI home lab. And thus far, I'm just using what I have. Cause I think that's where you should begin,
right? If you want to know where to begin, what do you have?
Play with that.
Even if it can't run anything super powerful, begin where you are basically.
But now I'm curious enough to be like, well, would it make sense for me to take my hard
on dollars and invest in hardware?
One, for just curiosity.
Two, maybe we can get a sponsor to pay for it.
Three, maybe we can make some content from it.
But four, like just have this AI as a service with models I want to swap out as this becomes more and more popular on my own network.
Tie in a home assistant like you've done.
Maybe even side train a model where I can take this really small model and say, this is the way we propose things.
And these are all of our contracts over the last two years. And let it have that source of truth. And now it's just super smart with the way we do
business. I don't want to give that kind of, sure, I could probably do that with ChatGPT,
but man, wow, what an exposure point, right? Here's all of our contracts over the last two
or three years. Could you imagine that? Like, no, I would feel much more comfortable doing it. And I think that's the angst is like,
everyone has to give up some version of privacy
to play with artificial intelligence
or you have some version of plagiarism.
Well, I'm sorry to tell everybody,
like AI is here to stay.
Like there's no putting that genie back in the bottle
and you can be against it if you want to be
and you can ignore it if you want to,
but you will be behind. You will, because young folks, and I don't want to say just young folks, but people that are born into the world today, with the way technology is, they don't know any different. You and I, Tim, grew up probably in the dial-up days. You probably remember how AOL sounds, right? Goodbye! Yeah, I've got that down. We've got that. They have no idea what that is, right?
No, I agree.
Yeah, I totally agree.
And, you know, like, you know, I feel like there's this stigma for the people who are against AI.
I feel like there's a stigma of those things you talked about.
Plagiarism.
It's not your own thoughts.
You're not being unique.
You're not using your brain.
You know, all of these things I hear people say every now and then, you know, your brain's going
to turn to mush. You know, the things we've heard about TV for, you know.
Truth.
Or metal or rock, you know.
That's right. Yeah, exactly. But what I see is I've always, and I think I said this last time,
but I always think of, like, when I use a GPT or a helper, ChatGPT, whatever, Open WebUI, when I use an LLM to ask it questions, you know, I'm usually thinking of it like a rough draft, like a rough draft.
Anything that it spits out to me is a rough draft.
It might be ideas.
It might give me new ideas. Rather than it doing the work for me, I think that's the perspective I have.
I kind of relate it to 3D printing, and not at home, at least.
Let's say you're a machine shop, right?
And you want to be able to produce something really quick
and you have some ideas and you want to test something out, you 3D print it. You look at it, you test it out,
you think, will this work? You make some adjustments. And then if you like it,
that goes to production and you produce the real thing. Well, for me, at least,
ChatGPT is kind of like that for me. Help me out, give me some ideas. Maybe there's things I'm not
even thinking about, but that's my rough draft. I'm going to take that and use it probably in my final product,
but it's not the final product. And that's the way I look at it. And you could get so much done
so quick just by having the right answers almost all the time. I tell my wife, because she's just very early on. She'll ask me questions about it. She's only used it a handful of times. But I say to her, imagine if you had a friend on Slack that knew everything you were saying to it, that understood almost everything you could possibly say to it, that could help you out with anything you ask it, and is usually right, and will give you a pretty concise answer every time. Imagine if you had that friend in Slack that you could DM on the side. You know, that's how I kind of treat ChatGPT, or any LLM, because it's just so refreshing to have. I don't know. I work on really weird stuff.
I work on home lab stuff.
I work on things where the same question has probably been asked twice ever. Will this GPU fit here, because I only have this amount of clearance in my server rack? I'm not going to find that answer on Reddit. And it's going to take me a long time to go and get the dimensions, go and measure, you know, do all this stuff. Or I could ask an LLM that, and it knows. It's like, oh, here are the dimensions. Here's the height on this case.
Yeah, yeah, that'll fit. It's fantastic. And so, I mean, I'm not trying to talk it up, but... You are. You should. I know, it saved me so much. Don't feel bad about it, man. I'm with you. We're simpatico. All right, man. Because it saved me so much time. So much time from not going to Google, and not wasting my time on Reddit, and not going somewhere else and getting advertised to.
It saved me a lot of time.
I mean, even simple things.
My wife and I started saying, give me interesting pizza recipes.
Because we were like, hey, we make pizza every week.
And we have the same kind of pizzas. but we're like, let's branch out.
Give me interesting pizzas, you know, whatever. Obviously it's going to go to the web. It's going to look, but it's going to find all of them, you know, spits them out. There's one that's
barbecue cauliflower. And we're like, yeah, that's kind of interesting. We made that pizza the other day.
And it was fantastic. You know, and it's kind of like, hey, tell me more about this recipe. And then it gives you the whole entire recipe. So, you know, I'm to the point now where I turn on voice, and I've gotten kind of bad with it a little bit lately. Not bad, but I've been relying on it a lot while I'm working on stuff. I can just be like, hey... A perfect example for me:
We were talking about the UI of cameras earlier.
I cannot find menu items in a camera.
If you have a Sony camera, you can relate.
So many pages.
Where's the format?
Where is the image format?
Like, I have no idea.
So, you know, I'll have voice on sometimes and I'll just be like, hey, can you tell me how to get to, you know, this setting in the camera?
Here's the camera hand.
Sure.
Go here, here and here.
Awesome.
You know?
And then I'm like, if I change this setting, is this going to affect the frame rate?
No, it's not.
You know what I mean?
And so I will have a pretty in-depth conversation about one thing I'm working on, you know, with an LLM.
And it's, it's so great.
It's so great for the things you don't understand. I'm a huge fan, I'm huge. And I talk to it, you know, I talk to it because it's faster than typing.
Have you heard it be called a word calculator yet?
Word calculator? No, not yet.
That's what I call it. That's what we call it around here.
It's become the way I think about it is it's a word calculator.
Yeah.
Right?
Yeah, it's guessing the percentage of what, yeah, it's trying to determine what to say based on percentage of what it knows.
Well, the same way you use a calculator, you're trying to figure something out, right? But it calculates with, it's a word. It uses words. It uses understanding, reasoning even, you know, in the latest models.
I think of it like that.
It's like, okay, here is, I'll just paste in a bunch of stuff.
Like, tell me all the numbers in here and add them up.
Yeah.
Right?
Yeah.
Like, if I copied, you know, 15 lines from my bank statement, for example, online, and I'm like, I want to know what these add up to, but the copy and paste on my Bank of America web UI is just terrible.
Yeah. Right? And I'd have to pull that into Sublime Text and pull out all the... like, I'm not going to waste my time with that. I'll throw it into a GPT. There's no information there that's really, you know, shareable. I would much rather do it locally, yeah, given all the things we've just talked about. But like, hey, you see all these numbers here after the dollar sign? Those are all figures. I want you to add them up and dedupe and tell me the right answer. It's going to do it.
Yeah. In a second. You know? And that's what I mean by word calculator. You could probably gush
about AI for forever. Let's not, let's not do that. Let's talk about two more things.
Especially if you have time, do you have more time? Yeah. Dude, I have all day, man.
I'm yours.
Sweet.
Let's go deep then.
Okay.
So let's close with, I think I would like to see some AI builds from you.
I'd like to see low-tier, mid-tier, high-tier AI home lab builds that might give people a gateway into this world, right?
Yeah.
That'd be cool.
You know, might be hard.
I think that's really up your alley.
That's where I'll leave that at.
Let's talk about the Creator PC.
And so you mentioned the dev workstation, which I watched your video on that.
So you built a Linux workstation the ultimate Linux workstation
78,000 views
I think people are like resonating with what you're saying here
I mentioned I'm resonating with building a creator PC
I kind of just want to like
I just want to build another machine
and I don't have a need for another machine
but I got this itch
I want to build another PC from scratch. I love it. It's just
so much fun, honestly. But I don't want to put Windows on it. And I don't think Windows, or sorry,
I don't think the year of Linux desktop is here for video editors, audio editors. It may be for
developers. It's here and it has been here forever. So my PC building, my creator PC building has been,
you know, stifled until I can figure out how to put Mac on it, macOS on it.
Not going to happen.
No.
Apple.
If you're listening,
somebody at Apple,
if you're listening,
I would love it.
If you can just make it so that if you love open source,
I think you might make it so that Mac OS can live on,
you know,
a PC that is built for Linux.
Treat it like Linux.
You know, I want to build your hardware too, but gosh, I want to build my own hardware sometimes.
And I don't want to give up Mac OS.
No, you got to pay for the dongle, which is the whole machine.
That's right.
Or the RAM.
You see that speculation about the Mac Mini basically being free, like the lowest tier basically being free?
Because if you, I think, added a couple of things to it, it's like double the price.
Yeah. Yeah. Even change the storage. Yeah.
That's interesting. Yeah. So creator PC, what would you do here? What are your thoughts on this?
Yeah, so creator PC. So to me, I think, okay, gonna do video editing, video capturing, you know, might do Photoshop editing, might do raster graphics or vector, whatever. But I mean, it's the same thing I think of for a gaming PC, and it's the same thing I think of for, you know, an AI machine. It's all going to boil down to your video card. So, I mean, your first choice, most people's first choice, is going to be CPU, right? And so you have two choices. I think most people right now are going to pick AMD.
I picked Intel and that was my choice in that video because I wanted a couple of things.
One, I wanted QuickSync because QuickSync is awesome for transcoding.
And when I say transcoding, I don't mean just for Plex or anything like that, but you're able to take advantage of that on Windows
while you're editing, for decoding and encoding.
So the CPU can go either way.
You can't go wrong right now.
But a lot of people are leaning towards AMD
because Intel has kind of been, I don't know, all over the place.
But I did choose Intel.
I think that-
The Core Ultra.
Yeah, the Core Ultra I think is great for everything but gaming.
And let me say this, like, it's still great for gaming,
but if you're going to build a gaming PC,
you might as well go with AMD right now.
But it's great for multitasking.
Low heat, low power.
Like, Intel kind of turned it around with the CPU,
even though it's doing poorly.
It uses less power, less heat, whatever. But you're going to choose your CPU. RAM, it's going to be DDR5. Don't choose DDR4, like a lot of people want to because it's cheaper. Don't. If you're building anything new today, it's DDR5. So that means you've got to find a motherboard that has the right socket and DDR5, which is pretty easy to find. I'm an Asus fan. I'm a big fan of Asus motherboards.
So I usually choose from Asus. I'm like, what do they have that I can buy? It's not like,
let me look at all motherboards. It's similar to when I'm brand loyal to stuff, that's what I do.
Like Samsung flash drives, that's all I'll ever buy. So I'm like, what does Samsung have right
now? Same with TVs. I'm brand loyal to Samsung. What TVs does Samsung have? It's not just
like what's on sale, what, you know? So anyways, I'm, I'm the same way with, with motherboards.
It's always going to be Asus for me. And it doesn't mean I'll never buy something else, but that's what I prefer. On that board, I will typically look for a chipset, possibly, that's Intel, that has an Intel NIC on it. Mainly because those play with Linux a lot better than Realtek chipsets. So, I mean, if you are thinking about it, yeah, definitely look for one with an Intel NIC built into the board. That's the way I look at it because it does better with Linux.
And then you're going to want 2.5 gig networking on there.
Honestly, if you're building a creator machine, you're going to want 10 gig.
2.5 is borderline pretty good.
I mean, you could probably edit 4K with maybe a couple of stutters, but you want to look for 10 gig.
And if you can't get it on board, you know, you could do aftermarket parts and add it there.
So that leaves, you know, that's a lot of requirements up front.
I know that Asus has their, I don't know, media art, creator art, ProArt. I knew art was in the name. But they have those that are dedicated, they say, to creators. But really what it means is a lot of space for fast storage, fast network cards, and we support the latest RAM and CPUs. Yeah. And so, you know,
honestly, I would say, like, I know you're against Windows. I run everything. Like, I'm not against it, I'm just sad that's the only option. So I play with it. Before I did my AI home lab on it, because it's the only extra machine I have to play with something on bare metal, I installed Windows 11. And thankfully they let you install it and play with it for free. They don't make you have a license. Now, you can't change the desktop, and there's limitations to what you can do, and I think they nag you a little bit. But then I was like, I mean, there's a lot of extra software on there. It's not the worst ever. Yeah, I just think, like, gosh, if I'm
going to build a creator PC, I've already got so many efficiencies and workflows in the macOS world. Software I've bought, things that I just can't live without. Like Raycast, I know, is something I use on Mac. I think they have a plan for a Windows client,
but they're not there yet to my knowledge.
Maybe they are.
I don't know for sure.
But I'm thinking like,
what do I have to give up to move to this different world?
Okay, well, Adobe Creative Suite,
at least my license there doesn't limit me to platform.
I can use Windows or Mac.
So that's cool.
But then like, man,
everything else is just like these weird hoops.
And then Windows seems to be,
I think user hostile,
like absolutely user hostile.
Like to go and have fun and build and spend the money on a really awesome creator PC
and have to install Windows to be a creator on that same machine
and have a user hostile operating system.
Not saying macOS is that much better, but it is.
It's that much better hostility wise.
It's at least not doing all sorts of crazy stuff
like Windows does and like AI-ing everything.
Like I think Apple intelligence might be
the next frontier for them
and hopefully they don't push that button too hard.
They're sort of backing off a little bit.
It's already here.
It's on my Mac studio.
I mean, you could tell me what the experience is like.
I don't have that, you know, but I'm hoping that they don't, you know, Apple intelligence
me too much.
You know, I feel like Windows is just user hostile.
Really?
Yeah.
I mean, they have a long history of like introducing features that people don't want.
They say they don't want, and then they prove they don't want because then they get taken
away.
And then at the same time, uh, taking away features that people still want because they think that's what's best for the user.
And it ends up not being what's best for the user.
And so they backpedal start menu, you know, classic example.
So, yeah, it's I don't know.
Personally, I use Mac, Windows, Linux.
Like I have Windows here, too.
I actually enjoy Windows after you. After it's installed,
you strip out the stuff, you make it exactly the way you want, and then you get WSL running on it.
So Windows Subsystem for Linux. I feel like at that point, it's almost everything that I need
outside of iMessage or Messages, right? Outside of that, like being able to text people while I'm on my machine, it does everything. Like the WSL side lets me run Linux in a terminal, you know, basically I'm in a terminal for Linux and I can do everything I want to do.
Don't even have to think about PowerShell or anything like that.
You know, I can be a developer and run developer tools.
But then if I want to launch, you know, Adobe, whatever, Create-A-Suite, boom, I'm there.
I'm editing.
It's working great.
Windows drivers with NVIDIA or even QuickSync or whatever works fantastic, right?
Because they're getting drivers and iterations so much faster because just the volume of people.
Then if I want to launch a game, it's right there, right?
Launch a game, boom, I'm in a game.
The other thing is compatibility with hardware. Like, you cannot match Windows compatibility with hardware. It works with anything you can plug into it, and most things are built to work with Windows. And again, like, this is brought to you by a Mac right now, and I've had to jump through a lot of hoops because of Apple. Because dongles, because no PCI Express, because whatever the reason. And so, you know, the things that you think are getting taken away from you, imagine what people lose when they go to a Mac. So I would flip that around and say, you lose so much freedom as a tinkerer, as a builder, as a custom rig builder, as a gamer, as whatever, going to Mac. You lose so much, you really lose a lot. Oh yeah. Oh yeah. I mean, I can't even plug in a PCI Express card, right? I mean, Mac doesn't allow that. They don't allow any video card. You've got to use theirs. And so what I've had to do is get this USB-C dongle that powers this thing that allows me to plug in, you know, PCI Express cards so I can capture video. And you can see me right now, that's what I do. I mean, that's the lengths people have to go to to get things to work that just work in Windows. You get them on the board, you get PCI Express, you plug the card in, boom, it works, the drivers are already there. So I've gone both ways. Like, I use both and I flip-flop on both all the time. So it was honestly harder for me to go to Mac than to go to Windows, obviously, probably because I ran it for 20 years. But making that switch, I realized, you know, Mac is great, stable, fantastic.
I never have to worry about it waking up or it staying on.
Like, it just works.
Like, apps are so stable all the time, with the exception of editing every now and then.
And I don't know.
I kind of have a beef with Apple's biggest release, their software release, with everything. I feel like every single thing I use from Apple right now has a bug, and we just need to get past this point. But, like, having gone to Mac, I kind of, you know, sometimes miss Windows. If that's the thing for me, it is. Sometimes I do miss Windows, to be able to just, I don't know, do whatever I want, because Windows works with everything. So free. Yeah. Yeah. Well, you're encouraging me to bite the bullet, as they might say. So, to give some context to why I'm in this struggle: one, I like to build machines.
And two, I'm running an M1 Max machine that literally is maxed out: 64 gigs. It's the M1 Max, the initial M series MacBook Pro. I've got four terabytes of onboard storage, just because we wanted to max things out when we purchased these, I think back in 2021, I don't even know. Right. And so, you know, I've obviously been a Mac user for a while
and I, I feel like I want the freedom to build my own machine. I want the freedom to choose my
own video card. I want the freedom to choose AMD versus Intel. I want the freedom to choose
DDR5 and not spend $10,000 on Apple RAM.
You know, I'm being facetious there,
but like it's expensive, right?
RAM is expensive, storage expensive.
And so you can go to Samsung and get, you know,
what are the 900 series, the NVMEs, what are those?
Yeah, 980s.
The 980s, yeah.
Like the Samsung 980, M.2 NVMe. Super fast.
You can get those so much cheaper than you would to even try and double your storage on a Mac build.
It's very expensive.
You could put 10 of them in a machine if you wanted to.
Precisely.
So you pay this Apple tax.
And the tax is, to some degree, simplicity in the fact that it just works.
It's a pretty stable system.
I really haven't had a lot of issues, but you've got to pay that dollar tax.
And then, obviously, my family is an iMessage family.
So I've got to have that somewhere.
I can't just have it on my phone.
I've got to have it on my desktop, too.
Who wants to text only on their phone?
Forget that.
That's a terrible world. I live that world my desktop too. Who wants to like text only on their phone? Forget that. That's a terrible world.
Yeah, I lived that world for a while. And that's why I was saying like,
if I ever left Mac and went 100% back to Windows,
that's the only thing I'd miss, to be honest.
That was the only thing that I'd miss.
I mean, notes too.
Now that I'm in the ecosystem, I'm like,
yeah, notes are super easy.
They're right here.
You know, I'll type a note in here. Photos, man. I referenced photos on my desktop frequently.
The syncing between photos app on my phone to the desktop is just, you know, and then there's
the other tax there, right? You get your iCloud tax, right? And, you know, I don't know about you.
I like to back up my photos, but then I also like the Apple cloud for those photos too, because I share a lot of photos with my wife. We got kids and we just have
history. We love to look back at photos that are five, 10 years old, you know, frequently,
because one of like, as a dad, one of the things I do with my kids is we will, you know, we'll do
story time at night, but we'll also look through some photos of things we've done like a year ago
or two years ago, or have a memory with somebody. And like for kids, that's grounding, right? That's,
that's, that's their identity. Who are we? Why do you love me? Dad, it's a reminder. Like,
I know they know I love them, but like, it's a reminder of the fun things we did just because
we didn't get to do something fun this week. Doesn't mean we haven't done it before. And it's
just this remembrance of where they've been, loved ones that are not here anymore, or someone we don't get to see too frequently, just reminding them how much they matter to them.
And that's how we use photos, not just obviously for B-roll like you're doing.
You know, it's more than that.
You know, we live in it.
It's a life thing for us.
So maybe I just need to be a Windows and Mac family.
You know, maybe it's like, why one or the other?
Why not both?
That's what I say.
Like, why not both?
Exactly that.
Because I know a lot of people who use a Mac all day, all day.
I mean, me personally.
I used to be Windows at home, Mac at work.
No, sorry.
Other way around.
Mac at home, Windows at work.
That's how it was for a long time as a
developer. Every enterprise had Windows, don't bring those Macs in here. And then it flip-flopped.
Then it got to be, well, now I'm a developer, they just hand me a Mac, so I'm using a Mac at work.
Well, I'm going to use Windows at home because I game. And so it can be like that if you think of the computer you're using like a tool, you know, which is kind of what it is. It's hard to think of it like a tool because it's so versatile. But if you say, here's my editing machine and here's my everything-else machine, then it kind of might make sense. But I mean, you're still going to have a MacBook or a laptop, right?
And so you'll have your laptop plus, you know, a workstation. It is hard going that way,
but I honestly think it's hard going the other way because you give up so much.
Yeah. You give up so much to, to, to move to Apple.
Well, friends, I have a question for you.
How much of your personal information, your private data,
how much of that is out there on the internet right now for anyone to see?
I think it's more than you think.
Your name, your contact info, maybe even your social security number,
your home address, potentially even information about your family.
It's all being compiled by data brokers and it's being sold online. And these data brokers, they make a profit off your data. They sell it as a commodity.
They don't care. Anyone on the web can buy your private details. And this can lead to identity
theft, phishing attempts, harassment, unwanted spam calls. I get those so much. But now you're
able to protect your privacy with Delete.me.
That's today's sponsor.
I recently found Delete.me and they sponsored the podcast and they offer a subscription service that removes your personal info
from hundreds of data sources online.
Here's how it works.
You sign up and you provide Delete.me with exactly the information you want deleted
and their experts go and take it from there.
They send you regular personalized privacy reports showing you what information they found, where they found it, and what they've removed.
And it's not just a one-time service.
Delete.me is always working for you,
constantly monitoring and removing the personal information
that you don't want on the internet.
To put it simply, Delete.me does all the hard work
of wiping your data, your family's data,
from data broker websites.
So take control of your data and keep your private life private by signing up for delete me today.
Now at a special discount for our listeners: today you get 20% off your Delete.me plan by texting CHANGELOG to 64000. Once again, text CHANGELOG to 64000. And as you may know, message and data rates may apply.
See terms for details. Enjoy.
Humor me. Let's build a creator PC. I know you just shared your video and I know you're running
Linux on it. You're not running Windows on it, but that's okay because the build itself, the hardware itself is probably very similar. So let me tell you what I
would like to build and let's compare it to what choices you've made and why. And this is not an
exhaustive of all the components. It's the core things, you know, it's the case, it's the
motherboard, it's the CPU, it's the cooler, things like that. You can go into Ram. I think that's
pretty, you may have opinions about which brand, maybe it's cheapest, whatever. Obviously you're a
Samsung lover on the NVMe storage section, but the case I would like to use is the ProArt PA602.
Yeah. ProArt. That's what I was saying.
The motherboard, ProArt again, the ProArt Z790. CPU so far is the Intel 14900K, 14th gen. You're going with last gen. Last gen? So, okay, I didn't even know. Is that last gen? I'm not like you, Tim. I'm not on the edge. That's why you're here, man, keep me on the edge. No, it's totally fine. But yeah, if 14 is last gen, Ultra is the new hotness then? It is? Okay. It's their latest. I wasn't sure what Ultra was.
It was new.
I was like, who is that for?
Okay, so Ultra is the new hotness.
It's new, but not really new because they launched it on laptops a while ago, but now they launched it on desktops.
And it's super confusing because you'll see like, you know, laptop reference with Core Ultra, but then desktop processors now are out that are Core Ultra.
So I don't know.
There's a lot of cross-checking before you buy stuff, because you don't want to buy the wrong thing. So that's one thing to look into. But yeah, they're great. They run hot, run power hungry, and then there were all those problems with them, but that's all fixed in microcode and firmware updates now. So, okay. Okay, so the jury's out then on the CPU. I'll take some advice. Maybe the 14900K is the old hotness and I need the new hotness, who knows, you know. Cooler, I was advised on the Arctic Liquid Freezer III 420, 420 millimeter, you know, all that good stuff. That's the biggest thing you can put into this, and I think it fits in the ProArt case as well.
Thermaltake Toughpower, 1200 watts, 80 Plus Platinum.
And then the GPU,
I mean,
aside from the conversation we had earlier about AI home lab stuff,
I think GPU is just hard to find.
And so I'll pay through the nose for this.
If I can get this one,
but a 4090,
I can't buy a 5090, because, one, it's not available. And two, it's probably just like five, seven, ten thousand dollars because of, you know, scalpers or whatever. So the jury is out on the 5090, but I do want the 90 series of the 30 or the 40. And so I was thinking the MSI Gaming GeForce RTX 4090. I could also go with the Asus TUF version of that,
the RTX 4090 there, or the 3090.
I think the 4090 and 3090 kind of compare pretty well.
So if it's availability and price,
maybe the 3090 TUF edition from Asus
or this MSI GeForce RTX 4090 if I can find it.
But that's going to be pricey.
It's going to be like two grand, 2,500 bucks. And so that's what I'm saying.
Like,
if I have to spend this much money to build
for fun,
this machine,
I got to put Windows on it.
Okay,
fine.
You've made me think
that maybe there's a world
I can live in
that has both
Mac and Windows
in my life.
That's the rough
of what I'd like to build.
I wouldn't mind AMD. I do have an
AMD AI home lab, but your video on your Linux workstation made me think maybe my AI home lab
should be this creator PC workstation. Maybe I can blend the worlds. I don't know. What do you
think? Oh yeah. It absolutely could because you could use that GPU for Ollama or for anything else while you're not using it
or even while you're using it, right? Because it's just going to use the CUDA cores if you're
using NVIDIA. And so it's going to use CUDA cores and VRAM. But if you're typing a document,
you don't need that. You don't need any of that. So yeah, you could do both. You could
have it run both and then you could keep the AI local.
And I think there's desktop applications that you could just install it and do it all local, local.
Not even on a server in your home, but on the machine you're using.
So yeah, that would totally work.
It'd totally work.
3090, yeah.
I have a 3090.
I got pretty lucky because right before the pandemic pandemic or right as it started, that launched.
And everybody wanted the 3070 and 3080, I think.
And I wanted the 3080.
But it just so happened like Best Buy had one of those in stock, the 3090.
And I thought, oh my gosh, like I'm going to spend like $1,000 on a GPU. Turned out to be like the best purchase ever because you couldn't find them after
that. And I used it all through the pandemic for all of my videos and everything. It was, it was
great. It was great. But I still think they're solid. I like the founder's edition. Like if
you're going to go, if you're going to buy something, I feel like the founder's edition
from directly from NVIDIA.
Just the design and everything is fantastic.
I understand why people don't choose that, but that's just me, my personal preference.
Yeah.
Yeah.
This world of GPUs is hard to understand as an outsider coming in.
Founders Edition, OC, Tough Gaming.
I know these are sub-brands of certain brands. Then you've got MSI, and you've got, you know, EVGA, I believe, is another prominent brand that you can get. And I think that might be availability maybe pushing that brand, because MSI or somebody else might not be available. I don't know. It's just, from the outsider
coming in, someone who's never been a
gamer, I've never been a gamer, I've never built a
a PC to build
to game on. So
building even a machine to utilize a dedicated GPU is foreign to me. I've never done it. So
even selecting which GPU or having this history, thankfully, we've got people like you out there
and my other good friend, I don't know his name. I can't remember his name. Tech notice. Great dude.
He's always on it with like the latest.
And he's strictly creator PC guy.
Like he's not gamer PC guy.
Now he will talk about how it may influence or not influence if you're a gamer too.
But he's primarily giving advice generally on the tip of technology.
And he's got lots of videos out there.
So he's covered most of everything really from RAM to storage to whatever.
And it's all from a creator PC or a creator lens,
not a gamer lens.
Not that that's a bad thing,
but like a lot of people are trying to build their best possible rig for creating because the Mac has limitations or they really want to push the
boundaries or they need,
you know,
more than one GPU for whatever reason.
Like for me,
if I really had unlimited funds,
I would love to build an AMD Ryzen Threadripper Pro machine
that has a workstation-level motherboard,
tons of PCI lanes,
and I would love to have multiple, if not four.
I think once you get to five, it's kind of hard,
but especially on power and cooling. But at least two, maybe four GPUs.
Like if this AI theory, I know you're laughing.
If this AI theory plays out, I feel like, wow, I can build this rig.
It might be expensive, but long-term it might play out because I think AI will become the centerpiece of home labbers here soon enough. If, you know, models become, as we've talked about before,
remain or become more open source or available on Ollama, and Ollama becomes a first-class citizen when it comes to integrations for platforms. I think if it's like, I integrate with ChatGPT and Ollama, if that becomes a real thing,
you know, well then that kind of machine will pay its dividends over time as AI becomes more and
more advanced. And as we allow it to inject itself more and more into this private world, we have
into our home lab world. So, I mean, I probably wouldn't build an AMD threadripper pro machine
for my own personal creation, like creator level, I think the
Intel Core Ultra or the Intel 14900K would be just great in that world. But AMD has some
compelling things about it. It seems Threadripper, that's a cool name, right? Threadripper's cool.
Ever since I heard it, the first time I'm like, heck yeah, I want to rip some threads, man.
Yeah. Who doesn't want to rip threads? Yeah. So yeah, you touched on a good time. I'm like, heck yeah, I want to rip some threads, man. I want that. Yeah. Who doesn't want to rip threads?
Yeah.
So yeah, you touched on a good point.
And I'm glad you mentioned that because it totally oversight on my part.
Something that I always think about when building a machine is PCI Express lanes. And so if you're going for a creator machine, yeah, there is a huge market, I think, for
workstation-level machines and Threadrippers, which we have from AMD. I don't know what Intel did.
They had one.
Now they don't have one.
And who knows what's going on.
But workstation-level machines, I think, should be a focus for, I think, both platforms.
Mainly because you're limited in PCI Express lanes.
I think on Intel, now you get 24.
So they caught up with AMD.
But what does that mean in reality?
It means you put a video card and you get two NVMe drives.
That's not enough for most people, especially creators.
And it used to be 20.
So you'd put a video card in there.
That's 16 lanes. Then you'd put an NVMe drive in there. That's four lanes. And you're maxed out. As soon as you build a PC, you're maxed out. So that's something to keep in mind, too.
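(To make that lane budget concrete, a rough tally using the figures discussed here: on the newer 24-lane desktop parts, a GPU at x16 plus an NVMe drive at x4 plus a second NVMe at x4 is 16 + 4 + 4 = 24, every CPU lane spoken for. On the older 20-lane parts it was 16 + 4 = 20, gone after a single GPU and a single drive. Anything else you add, a 10 gig NIC, an HBA, a capture card, ends up shared, as discussed below.)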
Any desktop class processor
you go with AMD or Intel,
you're going to be limited to
24 lanes. I think they're both 24 now.
And so that means a video card and two NVMe drives. You could bifurcate those, though, right? Like you can drop it down to eight lanes, you just get less bandwidth. I think it's not speed, it's bandwidth, is what it is. Well, bifurcation, yeah, you can do. It's really just dividing up the lanes. So yeah, you could maybe turn that 16 into two eights, right? Or maybe even go down to, you know, four fours to get the 16, depending on the cards, depending on the motherboard. There's a big if in there, but at the end of the day, you're still getting a total number of 24, right? And so your video card is going to take 16, and, you know, one NVMe drive, your OS, probably is going to take one four of that. So then you're left over with another four for maybe your media, and then whatever else you want to plug in there, yeah, it's going to be shared, or who knows, maybe not work. Most likely be shared. And so, yeah, I don't know, I feel like gaming influenced this whole thing, why we don't have PCI Express lanes.
but but uh and so i yeah i i don't know i I feel like gaming influenced this whole thing to why we don't have PCI Express lanes.
I kind of feel like it.
I don't know.
This is my theory is that manufacturers saw, hey, most people just want to put a video card in and one NVMe drive and call it a day.
And so I think motherboard manufacturers started seeing that and they're like, OK, well, we're going to chop our motherboards down and make them smaller.
We're just going to give you two slots, you know, and then case manufacturers saw that and they're like, OK, we're going to make our cases smaller, you know.
And then, you know, CPUs, they were trying to see how many lanes the minimum they could get away with.
And they're like, 20 sounds good.
I don't know.
I feel like they optimized for the wrong thing.
They didn't optimize for me.
And maybe I'm the outlier.
Maybe we're the outliers, right?
Like we want PCI Express lanes.
We want to plug in add-in cards.
That's the whole point, right?
Yeah.
I mean, yeah.
I want a NIC. Maybe my motherboard comes with a decent one. That's okay. But if it doesn't... Yeah, you're going to throw in 10 gig. If they're not pushing 10 gig or more,
I want to put a card in.
Maybe I want to do an HBA
because I want to have,
maybe I'm building an NAS.
Maybe I also want a GPU in there too.
And I want room for it.
Not just ability to put it in there for the lanes,
but like, give me some room.
You know?
I think you're going to want to do some things like that.
And we're not the outliers here.
I think that, I don't know, maybe,
well, everything that is GPU related, even motherboards lately, until ProArt and things like this, they started to push creators, not gamers.
Like almost everything is,
and I know you love RGB.
I know you do.
But you're also a gamer too.
So you got that in your blood.
I'm not a, I love games,
but I've never been a PC gamer.
I like to play Nintendo Switch with my kids.
We love, you know, Mario Party.
It's all the rage in our house.
We love that.
Mario Kart, of course, too.
But, you know, gamers really influenced, I think, PC builds
because everything's gamer edition.
And it's all gaming influence.
It's all pushing what can happen in gaming.
But I think you need that though, right?
You need some sort of killer application
or killer thing to happen.
I think what happened with Ollama
is that DeepSeek was the killer app for Ollama. It was, let's make Ollama more useful. Well, now we have a model that's comparable to others.
But in the PC world,
gamers kind of push that world for a while.
Like you don't have anybody who has to have access to cloud docs and stuff like that needing a GPU.
That workstation, it's not your everyday person.
Maybe you have somebody who's got a spreadsheet-itis, and they've got spreadsheets out the wazoo, and they need a better CPU for that.
But that's the limit. You know, they're not pushing GPU stuff. So all of that PCI Express lane innovation and GPU innovation was happening because gamers were pushing the innovation, really.
Yeah, yeah, absolutely. To get the fastest machine, to get the best frame rates, to get the lowest latency. Yeah, it pushed a lot of things forward.
But I still, you know, I also feel like, man, that kind of like pigeonholed what a machine is possible of.
And everyone optimized probably for cost to get less lanes, smaller motherboards, you know.
I mean, it's been happening over time for a long time.
But yeah, it's crazy to think like everything depends on the GPU now too.
It's like, you know, no matter what you want to do. I mean, you know, when you think of AI,
when you think of, you know, Bitcoin, when you think of video rendering, you know, creation,
it's, it's like everything depends on the video card. So, and everyone's competing for them, at least with the 5080, I think. The 5080 was, you know, what NVIDIA did
was basically say, no, this one's specifically for gaming. They basically, you know, gave it
gaming performance like the previous generation for the cost, I think, less than the previous
generation, but without the CUDA cores for the ML and AI. So I think, I mean, I don't know if it's the right thing
to do, but it was a smart thing to do
for NVIDIA to say, nah,
this one's for gaming.
If you're going to do AI, this is not the one
to get. So I think that kind of
segmented their audience. I wish
it would go wider. I wish it would go wider.
Say, hey, here's the creator edition.
Here's the AI edition
or Bitcoin edition, whatever. Call it whatever you want. And here's the gamer edition. And then optimize for those things. Honestly, it's hard to discern between media creator and gamer because, I mean, really, you just need, you know, a GPU. You need that encoder, right? You don't need the 3D aspect, but you need that encoder and you need video RAM, you know? And so you don't need any... well, the funny thing is, you still need some AI capabilities, even for creation today. You know, if you think of Photoshop,
when you say like remove or blur or anything like that, I mean, they've been doing it for a long
time, right? And so that's offloaded to your GPU if you have one, you know, to be able to do Gaussian blurs, or even replace, or, you know, auto select this person or cut this person out of the photo. So those are still needed even outside of video, if you're just doing, you know, static art. But yeah, I don't know. I wish they would. I don't know. Yeah.
I don't know. I think my pushback would be,
or my response would be on that would be,
I think,
especially with the GPU scenario,
literally at the creation of the hardware,
there's a limitation.
From what I understand, I can't recall the company's name, but there's one particular company that can do the fabrication of the, it's not a CPU, it's like the chip on the GPU card. And forgive my lack of familiarity, because I really haven't played with GPUs much, honestly. But from my understanding, there's a limitation on that, because there's such a demand and there's only one, you know, tried and true company who can do this well.
And there's a bottleneck.
And so maybe the lack of SKUs, which is kind of what you're hinting at, like, you want a gamer edition, you want an AI edition, a creator edition. The lack of SKUs might be, especially now, due to the pressure on the ability to crank these things out.
That might be it.
I would think like maybe you just need
levels like they have like the 70, the 80
and the 90. It's like well the 70 is for
budgetary
less needs and the 90's got AI
it's got gaming, it's got creator in it.
I kind of feel like you can blend those worlds.
It's like almost tiers of
type. I want to do
gaming, I want to do AI and I want to do
creation stuff. So the 90 may be better
for you. And the 90 guarantees you'll have 24 gigs of VRAM or more, maybe you can add more to it.
And the 80s sort of put you in this VRAM scenario with certain technologies in it. And the 70s,
budgetarian, more limiting, but still quite capable for dedicated GPU scenarios. You know, I don't know. As an outsider who's just learning, that's my, that's maybe how I'd skew it.
You know, that's probably how I would think about it personally.
Yeah, no, that's a good way to put it.
It's just odd now, though, because this is the first time I think that I've seen where they're like,
yeah, the 80 is specifically for gaming because we're taking out a lot of the AI and ML stuff, you know,
which was, I think it was probably a good move on their part to say, nope, gamers, you're going to get this card.
You know, this is the one to get.
But yeah, it's interesting to see.
Yeah, it is for sure.
And honestly, on media creation, you could get by.
You could get by with a lot less than you think, a lot less than you think. And I will say, like, just doing media creation, a lot of the time you don't even need to transcode anything or convert anything until the end, you know. You just have to have good encoders and a fast network to be able to process that video, you know. And so I edit in 4K. I used to, when I had a worse machine, create proxies, this kind of insider term for transcoding down to a lower resolution so I can edit at a lower resolution, because my machine's not that strong. But nowadays it's like you can get by with surprisingly very little for content. There are some people who even just use the iGPU, and that's enough to be able to do the encoding and decoding as they edit.
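For anyone curious what that proxy workflow looks like in practice, here is a minimal sketch of generating a lower-resolution proxy with ffmpeg, assuming it is installed; the resolution, codec, and file names are placeholders, not Tim's actual settings:

    # Generate a 1080p editing proxy from a 4K source clip, then relink at export.
    # Paths and settings are illustrative; assumes ffmpeg is on PATH.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-i", "clip_4k.mp4",       # original 4K footage
        "-vf", "scale=1920:-2",    # scale width to 1920, keep aspect ratio (even height)
        "-c:v", "libx264",         # cheap-to-decode codec for smooth scrubbing
        "-crf", "23",              # proxy quality matters less than decode speed
        "-c:a", "copy",            # keep the original audio untouched
        "clip_proxy.mp4",
    ], check=True)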
Yeah.
Yeah.
I'm going to link out to your "Building my Ultimate Linux Workstation" video. But because I said let's create a creator PC, give me your spec list.
Give me the motherboard, the CPU.
Give me a rough build list for this machine you built.
Oh, the one I built?
Mm-hmm.
Okay.
So, oh, man, I'd have to kind of look at my notes.
You don't know it by heart?
Oh, man.
Come on, Tim.
Dude, that was like three, four weeks ago, five weeks ago.
Oh, that was like yesterday, basically.
I'm like on to five more things besides that.
And I'm terrible at specs, but I can tell you probably off the top of my head.
I know it's an Intel Core Ultra.
I think it's seven or five.
It's Core Ultra 5, not the latest.
And that was really just for budgetary constraints, because I didn't see much use in going with the top tier, which is totally against what I've done my whole entire life. I've always been a "buy an i7, an i9, the max one, figure it out later" kind of guy. Then I went with an ASUS motherboard. I
couldn't tell you, uh, the model name, but I went ASUS Prime Z890-P, yeah. Yeah, sorry, I got your video up. I've got your back.
Probably pull it up too. Uh, Corsair RAM, I know I went with Corsair RAM, DDR5 I think it was. I don't remember the speed, 7000 maybe, and now I'm getting... uh, making up numbers. We'll use your videos as a backlog, and we don't have to be perfectly accurate. I'm just curious, like, can we build a creator... Like, I should share my spec with you, only because I had it listed and I've been dreaming about it and thinking about it and deliberating and hemming and hawing, as people know that I do. Whenever I think about, you know, change or something new, I write it down and I, you know, marinate for a while on a lot of the choices that I make in my life. And building a multi-thousand-dollar machine is not easy from the dollar point of view.
You know, so like,
I'm going to think about this thing for a while.
I'm going to survey my favorite creators.
You're one of them and see what their choices are
and compare and contrast.
And the only change I'm making personally
is this, you know, this Core Ultra consideration,
but maybe AMD.
So I thought maybe you could rattle off your dream list,
so to speak, for your workstation. Yeah. I mean, I kind of built it with that. I'd probably bump
it up if it's my dream list. I mean, if it's my dream list, it's a CPU that doesn't even exist, you know, a workstation-level processor. But for my Linux workstation, yeah, it was a Core Ultra 5, which I think is great.
It's great for multitasking.
It's great for coding.
It's great for compiling.
It's great for the things I'm going to do as a developer, right?
Is it the best for gaming?
No, I think we talked about that earlier.
But it can still do it great.
It's just not the leader in that space anymore like they used to be.
But great for multitasking.
You know, it's DDR5, the fastest
DDR5 I can find. Motherboard to me, again, doesn't really matter. I generally don't want Wi-Fi and
Bluetooth on it, but it comes with every single one. You know, I need four slots for DDR5 and
it supports up to 192 gigabytes, which is such a weird number. And at the same time, now RAM kits come in weird numbers now to get to that 192.
Weird, weird times we're in.
Yeah, and then it's, you know, I already talked about this, but it's super fast NVMe drives.
For me, that's, you know, Samsung 980s.
It's the Pros.
And then one's going to be for OS and one's going to be for everything else, you know.
And then I want 10 gig networking because I have a 10 gig network backbone, even though I don't even need it.
Like, honestly, if this is my dev workstation, I don't need it at all.
I'll stick with the 2.5 gigabit that comes on it.
And that'll be fine because I'm rarely going to transfer things, you know, to and from this machine.
You never saturate that.
No, it's for writing code, man.
And I mean, maybe models, when I download stuff. But no, not even that, because my home network is... gigabit. Sorry, my ISP is gigabit. So I won't put any spinning hard drives in anything I ever buy anymore, except for my NAS. So that's off the table. And even the NAS, I'm kind of like still
questioning it, like,
why are we still using spinning drives?
I think,
uh, 'cause they're big.
I know, but why?
They're big and less expensive than something else that's big and really expensive.
But I mean, are we... yeah, I don't know. I don't know.
I kind of feel like, is this... is this real?
Are there big hard drives that are holding us back on purpose, you think?
It's a conspiracy?
Big storage.
I don't know.
Big storage.
Big storage is out to get us.
I don't know.
You know, I mean, it's like, do we need to stay on spinning? I get it that there's more capacity, but could we make that more capacity on SSDs?
Is it possible? Yes, it's physically possible.
Is it cost effective? I don't know.
Maybe if we did it more, I don't know. But anyway, spinning drives, well, I heard from someone that
spinning drives will never go away because they'll always be more dense, right? And they'll always
have more capacity. But I feel like that doesn't always have to be true. But I don't know. Maybe
that's me not understanding flash and NAND flash and all that.
So what do you gain, though, from... I mean, obviously there are challenges with rust disks, because you've got, you know, vibrations can cause read/write errors. You've got lots of things that can happen. But generally, if you have a pretty good machine and a good build, those aren't true challenges. They can be challenges if you're not proactive in making them not challenges.
And if the density is always there and you really don't need... maybe you need more than, what is it, six gigabit per second per disk? Is that what... that's usually what it is. Like, if you have a decent backbone, your PCIe lanes, then you'd be hard pressed to saturate that in a lot of cases,
unless you're doing some major transfers and maybe your home lab is super
enterprise and maybe mine is less,
but like the main thing I'm moving on my network is Plex movies.
And it's usually when I rip it to the NAS and never again.
And then obviously whenever it comes off those discs to stream,
but I don't need that level of saturation.
So disks for me work. Yeah. For me it's, uh, it's heat, it's, yeah, noise, power, you know. I could do away with most of my fans, and, you know, they're loud in general, they make noise. My flash SSDs make zero noise. They give off almost zero heat. You know, they take up a quarter of the space, you know.
And so because of that, like, my NAS is, you know, 4U because it needs to fit these drives.
And, you know, I don't know.
I wish we were just all SSD, all flash storage.
One day, Tim, one day.
I know.
I know.
But it's nothing more than me just like kind of wanting to be done with it.
And for those reasons. But they're efficient, they're large, and they're cheap.
So what's left to say?
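To put rough numbers behind the earlier point about being hard pressed to saturate a six-gigabit-per-second disk link, here is a quick back-of-the-envelope comparison of theoretical line rates; decimal units, ignoring protocol overhead, and the figures are generic, not measurements from either home lab:

    # Theoretical line rates in MB/s, ignoring protocol overhead.
    def mb_per_s(gigabits: float) -> float:
        return gigabits * 1000 / 8

    links = {
        "SATA interface (6 Gbit/s)": 6,
        "Gigabit ISP / 1 GbE": 1,
        "2.5 GbE NIC": 2.5,
        "10 GbE NIC": 10,
    }
    for name, gbit in links.items():
        print(f"{name}: ~{mb_per_s(gbit):.0f} MB/s")

    # A single spinning disk typically sustains far less than its 6 Gbit/s
    # interface (roughly 200-280 MB/s sequential), so one disk can already push
    # a 2.5 GbE link near its limit, while it takes several disks in parallel
    # to fill a 10 GbE backbone.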
I've got to go in like two minutes here.
I got a hard stop personally.
I'd love to keep just going deeper if we could.
We'll have to do this more frequently.
Something like that.
Who knows?
Maybe more than once a year.
I'd guest on Techno Tim Talks, but I don't think that's your style.
You don't do that there.
Do you have guests?
I don't know.
Yeah, I can absolutely have guests.
Yeah, I have before.
I'll geek out with you.
Yeah, it's usually on Twitch.
I mean, I've kind of been switching stuff up a little bit.
I don't know.
Twitch accidentally banned me twice.
I saw that.
Dude.
And I'm like...
The API.
Dude, yeah.
I'm kind of like over it in my head.
Because I'm like, really?
If I got to find...
That gave me the opportunity to stream on YouTube Live and realize the opportunity there.
Yeah.
And the audience there.
Well, you're already there.
So you can just tap
into your existing subscriber base.
Exactly, whereas Twitch, it's,
hey guys, come check out my Twitch.
That's why I don't hang out with you there, honestly.
I would probably at least lurk in your lives,
whereas I'm not gonna go to Twitch, personally.
I get it, if you don't type in that URL
and you don't go there or have someone to watch,
you just don't go there. It's just not in your routine ever. And so I totally get it. It's just like, that's kind of where I started. I started out live streaming before YouTube and it was on Twitch and it was playing games. So I just have a soft spot in my heart for it. Yeah. Well, I mean, you, you have done some cool stuff there, but you know,
whenever you disrupt somebody's normal habit and flow, you give them a reason to ponder change.
And sometimes that means the negotiating goes the opposite way and they leave your space. And so
maybe Twitch is in your past and YouTube Lives are in your future. Um, but either way, I'd love to talk to you more as it makes sense. I think we can geek out quite a bit about this stuff.
I think it's fun
I think it's fun to just dig into it with somebody else
because
as you can tell I make my dream lists
and I ponder them myself
and I might pay attention to people
but I'm not having a conversation with anybody
really deeply about my choices
or why I'm making these choices
and it's just, maybe after this conversation, I might be okay with having both Windows and Mac in
my life.
Maybe.
I think you'll be okay.
I think you'll be okay.
And then, uh, maybe your kids one day will have a gaming machine.
They'll be like, yeah, I got a gaming machine.
Dad gave me the hand-me-down.
It's got the 4090 in there, Dad?
Oh, my gosh.
Yeah.
The 3090.
What did you do, Dad?
The 3090, really?
You couldn't get the 4090?
Well, son, let me tell you what happened, okay?
Yeah.
AI changed everything, okay?
And GPUs were hard to find and super expensive.
Yeah.
Back in my day, we didn't have this AI thing like stealing all of our GPUs.
It was Bitcoin.
That's right.
That's right.
Well, Tim, it's been fun geeking out with you, man.
Thank you for hanging out for a bit.
Anything left?
Anything else?
No.
Any self-promotion?
Any plugs?
Anything?
I mean, no.
I'll link it all up for you.
Don't you worry.
Oh, yeah.
Thank you.
No, I appreciate being here.
I appreciate the time to talk about it.
You know, I say this on my live stream on Twitch: I rarely get to talk about this kind of thing to people in real life. It's either on my live show or to people in chat. And so it's nice to be able to talk to someone who understands what I'm talking about. And so I appreciate it, man.
A real human. I'm not AI. If you thought I was, I'm not. This is real. I'm the real.
Yeah. I appreciate it, man. All right. Bye friends.
Bye friends.
Well, friends, as you may know, and you heard it in the show, we are now shipping
full length episodes of our podcasts with bonus visuals, chapters and all to YouTube.
And I've personally been enjoying it because I have been watching our shows on YouTube, not just listening.
And I encourage you to do that at youtube.com/changelog. Subscribe. News on Mondays, there's interviews midweek, and of course, this show, Changelog & Friends, on Fridays.
Some would say it's the best thing ever.
Well,
speaking of that,
it's better.
Yes.
changelog.com/++. Bonus content.
Drop the ads, directly support us, and the joy of receiving a sticker pack in
your inbox,
your real inbox,
your mailbox.
Again, changelog.com/++,
10 bucks a month, 100 bucks a year.
It's better.
And we appreciate all the support.
And of course, a big thank you to our friends
and our partners over at Fly for supporting us.
Fly.io, the home of changelog.com.
And to our amazing friends over at Retool,
retool.com, and Temporal,
temporal.io
and of course,
our friends at Delete Me.
Make sure you text
changelog to
64000
get 20% off
and that's awesome.
And to the beat freak
in residence,
Breakmaster Cylinder.
Those beats are banging.
Love them every single week.
Thank you, BMC.
That's it.
The show's done.
We'll see you on Monday. Game on.