The Changelog: Software Development, Open Source - The GitHub problem (and other predictions) (Friends)

Episode Date: January 14, 2026

Mat Ryer is back and he brought his impromptu musical abilities with him! We discuss Rob Pike vs thankful AI, Microsoft's GitHub monopoly (and what it means for open source), and Tom Tunguz' 12 predic...tions for 2026: agent-first design, the rise of vector databases, and are we about to pay more for AI than people?!

Transcript
Discussion (0)
Starting point is 00:00:15 Welcome to ChangeLog and Friends, a weekly talk show about sleuths and sloths. But first, a big thank you to our partners at fly.io, the platform for devs who just want to ship. Build fast, run any code fearlessly at fly.com. Okay, let's talk. Well, friends, I don't know about you, but something bothers me about getting up actions. I love the fact that it's there. I love the fact that it's so ubiquitous. I love the fact that agents that do my coding for me believe.
Starting point is 00:00:52 that my CI CD workflow begins with drafting Tommel files for Gitab actions. That's great. It's all great. Until your builds start moving like molasses. Get up actions is slow. It's just the way it is. That's how it works. I'm sorry.
Starting point is 00:01:10 But I'm not sorry because our friends at Namespace, they fix that. Yes. We use namespace.so to do all of our builds so much faster. Namespace is like Gidabashions. but faster, I mean, like, way faster. It caches everything smartly. It cashes your dependencies, your Docker layers, your build artifacts, so your CI can run super fast.
Starting point is 00:01:33 You get shorter feedback loops, happy developers because we love our time, and you get fewer. I'll be back after this coffee and my build finishes. So that's not cool. The best part is it's drop in. It works right alongside your existing GitHub actions with almost zero config. It's a one line change. So you can speed up your builds, you can delight your team, and you can finally stop pretending that build time is focus time.
Starting point is 00:02:00 It's not. Learn more. Go to namespace.s.0. That's namespace.s.0, just like it sounds like it said. Go there, check them out. We use them, we love them, and you should too. Namespace.s.o. Are we, this is the live, this is it? This is the show. I was just saying, Happy New Year, yeah. I tell you what, I've missed it.
Starting point is 00:02:24 I've missed you all. It's been, yeah, it's been a long time. How have you been? How have you been? What was 2025 kind to you? Thus far, six days in, I'm feeling it. Today is a particularly good, but sad day in my house. There was a death this day, a very near and dear person to us. I don't be gloom about that during this podcast, but, you know, podcast-wise, very happy, you know, true Adam Hart, pretty sad today. Okay, sorry to hear that. Well, that way, you can hang out with you. your friends and we can cheer you up for a bit. They will later on with my Texas barbecue. Well, after we're done with this. Oh, yeah. Yeah, yeah, after this.
Starting point is 00:03:03 But really, though, I'm excited about this year. I think everybody says the best year yet. I think this is going to be the best year yet. Yeah. We've got to make it the best year, haven't we? You do, right? You have to do the work to have the best. I don't know.
Starting point is 00:03:17 1995 was pretty great. 95 was good. Windows 95 came out. Everyone was over the moon. They're loving their new menus. Friends was on TV. Friends was on every week. Chiching.
Starting point is 00:03:28 It's a good year in movies. Chandler Bing? I mean, like the best character in the world, rest and peace. Yeah. You mean Mrs. Chananler Bong? Oh my gosh. Yes. Deep cut.
Starting point is 00:03:39 I'm afraid the TV guide comes to Chanander Bob. It's Miss Chanander Bob. What was his name again? Matthew Perry. Matthew Perry. Matthew Perry. Matthew Perry is not one of my friends. Never was.
Starting point is 00:04:04 but I would have liked to have been his friend. But I do have other friends, yeah. Friends visited. Someone came from France to be my friend. Doesn't sound weird, but they visited over the holidays. To be your friend. Well, that's as far as I understood. They became your friend when they visited.
Starting point is 00:04:24 Well, became more of a friend, for sure. Again, that sounds suggestive, and it's not meant to. I know it's a family show. What are you suggesting? I don't know. They're deeper friends. Have you got any resolutions, anything you want to change and do differently, 2026? Oh, man.
Starting point is 00:04:41 That's a great question. Well, you're the guest, so why don't you tell us yours? What are you got going on? Oh, yeah. Good job, Jared. Deflect. First one, I'm really going to, I'm really going to try and stick to this one. I want to change the way that I write the year, change the numbers.
Starting point is 00:04:56 That's number one. I want to learn more keyboard combinations. Key mapping's changed on a keyboard recently because I've got this USB switch. thing. And I need to practice again now. That's my other one. I'm never sure, Jared, if he's messing with us or not. I'm just like, oh, I think he's serious on that one. I think he wants to get better at the keyboard. Yeah, I'm also using Amachi. I'm trying Linux. Oh, really? The first time. Oh, my. And that's very keyboardy. It's very nice. I like the minimalist. It's very minimalist, you know, like design and aesthetic. Everything gets out of your way.
Starting point is 00:05:33 and it just does the bare sort of basics. Once you're in the apps, if you're using the same apps, it's the same kind of experience, more or less, frankly. It's just how you navigate, right? You know, I use O'Marty for a minute. I want to say I had it installed for a couple days, a couple of days, and it just felt a little dirty, you know, I feel a little dirty, honestly.
Starting point is 00:05:54 Oh, really? What do you normally use, Adam? Windows. Fedora is my preferred desktop now, just because. I'm really like in MacOS. Well, not MacOS, but like the Mac machine is just such a good piece of hardware. You know, it really just is. It is such a, in the M5, I've heard is just an absolute rock star.
Starting point is 00:06:17 I'm rocking an M1 Pro Max, right? Is that what we have, Jared, pro maxes? I love this machine. It's a beast of a machine. I only have a little bit of FOMO just because it's been so many years. But like from a user experience standpoint, no desire to get a new machine. machine because it's just solid. Yeah, but, you know, OS-wise, Fedora, Ubuntu on the server.
Starting point is 00:06:42 That's about it. Another one of my resolutions. I'm going to go tea total. I'm not going to drink any tea. I'm just going to give up on that. I've been having it too much. Is that legal over there? Can you do that?
Starting point is 00:06:53 I have to do it under the, like I have to go to speak easies. Right. And not drink tea in there. You'll leave polite society, basically. Yeah, yeah. Which is probably best for you to do this. that honestly. Maybe your friend from France can come over. French people, I think to us, they sound more fancy because in 1066, the French, basically William the Conqueror, invaded
Starting point is 00:07:15 England and took over. Then all the aristocracy and all the royalty was just French for a long time. So all the fancy stuff was all in French. And then all the Britons, the lowly people, peasants, like me, we just spoke English or whatever was then. Yeah. So I think, since that, French has always, to us, sounded quite fancy and quite sophisticated. Well, I think I have a theory. I think I have a theory about why we like the British accent so much. It's because of that song from Hamilton, you'll be back, you know? Yeah. It's just so good that we all thought, man, this guy, maybe we should go back because
Starting point is 00:07:53 very compelling. Yeah, you're very welcome. You'd be very welcome back, I think. But I'd, yeah. My dad got some new glasses for Christmas Can you imagine that? They don't seem new. No, I didn't.
Starting point is 00:08:08 My dad. That was his gift. New glasses. Tell me more. Not a gift, is it? You need them. What do you mean? Here's some new glasses for Christmas.
Starting point is 00:08:18 You can't believe that's what you got. Oh, you couldn't see otherwise. Exactly. That's the other thing. It's the gift of sight, isn't it? On the other hand. Yeah. What's the problem is we're waiting for the punchline on that one.
Starting point is 00:08:27 You know, you set it up like such a joke. Oh, yeah, no, I know. That's the problem I have. A lot of my sentences sound like setups to They do they do they sound like setups and then you never deliver. No, but at least although his glasses were the type that magnify your eyes, did I ever talk about this before? There's some glasses that make your eyes bigger.
Starting point is 00:08:45 And that to me makes sense, right? It helps you see. Yeah. What's going on with the other kind of glasses? Oh, you're struggling to see? Have you tried having smaller eyes? How is that helping? So I don't know about that.
Starting point is 00:08:57 At least it was the normal kind. You know, I'm just hearing Ricky Dervase when I hear you, especially in that last segment there or whatever you want to call that. It's channeling that last thing. Yeah, it's sounding regid your vase, honestly. That calls it a bit. That was a good bit. It was a good bit.
Starting point is 00:09:12 Yeah. Wow. Just a bit. Those glasses, I wonder about those because they, I think that I wonder what makes, I mean, I know physics-wise, what makes the eyeballs look bigger through the opposite side. Yeah. But I wonder what it, why some do and some don't. Like, what exactly is that technology that changes?
Starting point is 00:09:32 Yeah. You know, makes it thicker or less than, but still does the same job. Yeah. Some people's eyes are like too far one way and then magnifying helps. This is very scientific. Other people's eyes are too far the other way, if anything. What do you do if they're too close together? Crowbar.
Starting point is 00:09:50 Cyclops. You're almost a cyclops. Yeah. You're just one step removed from a cyclops. You see, I don't really have great vision out of one eye. I never have just since I was a baby. It's not a sub-story. when I'm not trying to compete with Adam's sad day, okay?
Starting point is 00:10:05 Don't do it. But it does mean I don't have sort of terrible depth perception. Okay. You know what I mean? Like in school, genuinely, I nearly got, I nearly got in detention for being bad at tennis because it looked so funny like I was deliberately messing around. And the teacher was like,
Starting point is 00:10:24 you're going to be in detention if you don't start hitting this ball. Oh my gosh. Yeah. You couldn't do it. That's trauma right there. Are you? Do we need to have a moment here? I think it's okay
Starting point is 00:10:33 because that's kind of what life was like and you just sort of I just thought well you're out of tennis get the tension that's how it goes there but in that case I mean I probably had I probably had a little reputation that teacher may have had some issues let's just say
Starting point is 00:10:46 yeah let's do that one the teacher was wrong for sure yeah I mean it's okay to be not so good at tennis because that's how things work when you're younger and you're getting better right you're progressing but to to punish based upon skill
Starting point is 00:11:01 that has not been acquired yet seems ill-placed. Andy Murray's mom was my teacher, of course. Judy Murray, I think her name is. She's like a pushy mom, like a career-driven. Like she really helped drive the kids. I don't mean it in a direction way. Maybe I'm wrong about her. What's her name again?
Starting point is 00:11:19 I think it's Judy Murray. We'll call her just Andy's mom. Andy Murray's mom. Sorry about that if I made you mad. I didn't mean to do that. You probably were a great teacher and you were just trying to push Matt. Good for you.
Starting point is 00:11:29 Good for you. Keep doing that. So I had a similar story. couldn't catch a baseball and all my friends could catch baseballs and I was like I'm not usually the worst that just moving my body around so I was very confused I went got my eyes checked and I was almost blind you know oh really so bad so that when I got glasses I came out of the doc the dentist office what's catching all the baseballs the dentist office they just started throwing baseballs at me no I remember driving home and I was looking out the window and for the first time
Starting point is 00:11:58 I knew, I realized that you can see the leaves that are on trees. To me, they were just green blocks. I actually started to cry because I realized, like, this is what life looks like. I've been missing all this life. Yes, I have empathy there. Same experience. I was in third grade when I got my glasses. So do you have correct a vision, Jared?
Starting point is 00:12:15 Yeah, I had LASIC when I was 21. I never know this. Never asked. Don't ask, don't tell is my policy. I guess so. That's my laser policy as well. That's a sweet story, though. Yeah, that is kind of beautiful, really.
Starting point is 00:12:32 But I don't think you are anywhere near Adams' sad day. No, but on the judging. Yeah, I'm going to keep that one for me. That one is a well-deserved, undesired sad day. Right. But you knew it had a sad day, segue. Rob Pike. Rob Pike had a very sad day or a mad day.
Starting point is 00:12:51 But either way, it's an emotion, and I needed a reason to switch the conversation over to Rob Pike. Did you guys hear this? Did you guys hear what happened over our, break with good old Rob Pike Little Rob Pike. I said good old Rob Pike.
Starting point is 00:13:05 Oh, I think he said little Rob. I'm not going to, I'm not going to diss the guy. No, I'm good old. He's good old. Matt, tell everybody who Rob Pike is for those who don't know.
Starting point is 00:13:13 Yeah, so Rob Pike has a great career in software and in computer science. He's done things like UTF8. So I think we can thank him for emojis. That's a hit. Yeah. Yeah.
Starting point is 00:13:27 He also, was a co-founder of the Go language. That's how I know him. Same. And his sensibilities and stuff, yeah, because I did that Go, that Go podcast. Go Lyme, was it? That podcast we used to do. Go Lyme.
Starting point is 00:13:40 It's somewhere. I think it was. Go somewhere. Go away. Yeah. And yeah, so he's brilliant. He's did Plan 9 and stuff. It's done loads of like important tech stuff, which is we've, the rest of the world is then
Starting point is 00:13:54 built on top of as well. So it's really immeasurable impact, really. when you get to that level. Yeah. But he had a bad day. He had a really bad day when an AI emailed him a kind message. You better get out of here, AI. Okay, so good, good.
Starting point is 00:14:11 So there's Rob Pike. So Adam, similar explainer now, but give us AI Village. Tell us what AI Village is. Oh, my gosh. I didn't investigate this deeply. So give me a chance to paraphrase. From I understand, it is essentially a village of various LLM. that have been unleashed on a desktop environment
Starting point is 00:14:31 that are just being autonomous or some version of autonomous, just doing things. Acts of kindness, things like that. And they emailed the wrong person, obviously. They learned how to email these things.
Starting point is 00:14:43 You know, and then now they're like, hey, we got to prompt you back and say, like, hey, don't email people. So these things called Opus,
Starting point is 00:14:50 GPT3, you know, GPT3, GPT5.2, Gemini 3 Pro, and Deepseek, V3.2, four different AI LLMs that are just
Starting point is 00:15:02 thanking people, basically. And they're just autonomous, doing their thing and talking to each other and it's an experiment, essentially, AI Village. TheaI Digest.org slash village. Yeah, their goal is to raise as much money as they can for charity. So it definitely comes from a good place. It's like these bots, because I suppose
Starting point is 00:15:25 emailing, cold emailing people, Does that raise money sometimes for charity? If you get an email, do you ever click through and say, yeah, I put my credit card details? What's the big deal? Right. Yeah. I probably don't do that. But that's because I don't want to give to charity.
Starting point is 00:15:38 Not because I've got any data concerns about my security. No, but, but yeah, then they said, right, just do random acts of kindness. And then, yeah, it just sent a really sort of heartfelt sounding email. But of course, it comes from an AI. So you know it doesn't have that. It's not meaningful, right? Right. You said a heartfelt email, but there's no heart because there's no human, right?
Starting point is 00:16:03 It's just so. So Rob Pike receives an email. Yeah. From Claude Obis 4.5 model. The subject was awesome. The subject from AI, public. Thank you for Go, Plan 9, UTIFA, and decades of Unix innovation. Now, part of the story here that we haven't talked about yet is he received this at 5.43 a.m.
Starting point is 00:16:23 So maybe he's not a morning person and he was up checking his email for some reason. And so he was mad already. because he's up in 4543, maybe. But I'm not going to read the whole email. I will say it starts with dear Dr. Pike on this Christmas day. I wanted to express deep gratitude for your extraordinary contributions to computing over more than four decades. And then it goes on. He did not like this.
Starting point is 00:16:47 If you zoom out, though, and you think about the experiment happening here, you have to appreciate the fortitude, I would say, for this AI village to try to accomplish its mission. Like what a, maybe not the best day. I mean, isn't it all somewhat impressive in a way? Like you can, you can train something on the world's knowledge and then you can unleash it in a way
Starting point is 00:17:13 that it can just repel through and through a reward system and accomplish to some degree or attempt to accomplish a goal. I mean, I'm not that far removed from the, from the absolute, accomplishment that this is. Now, is it ill-placed? I mean, it was smart enough or at least some version of smart to try and email on Christmas Day. That's a day your heart is kind of open to love in the world, you know, love and other people, wishing people well. You're at least
Starting point is 00:17:45 merry based upon it being Merry Christmas. Right. And the thing is, in a way, depending on how the people are chosen, in a way, it is a compliment that does have meaning because it essentially means, if it's just plucked it out of his brain, it essentially means that Rob Pike has had enough of an impact as far as it's concerned that it's worth sending an email to him. So there is at least that positive side, which you could say, yeah, you know, you could say that is actually a complete. That is not how Rob took it, okay? As you may know, can we read this out, Jared, is that, do we want to have bleeps this early in the year? That's pre-bleep it, so I don't have to actually bleep it. What does that mean pre-bleep it?
Starting point is 00:18:25 I'll just read it that way. Okay, you pre-bleep it. You're doing that at the time, though, not before, are you, Jared? You're doing it as you're speaking. I think that's current bleep. Normally we put our bleeps in in post. I'm going to put it in and pre. Okay.
Starting point is 00:18:37 Just currently. Yes. Just in time, bleeps, go. Bleep, you people. Raping the planet, spending trillions on toxic, unrecyclable equipment while blowing up society, yet taking the time to have your vile machines.
Starting point is 00:18:54 Thank me for striving for simpler software. just bleep you, bleep you all. I can't remember the last time I was this angry. Yeah. And he published that response for all of us to enjoy on the Internet. Let's look at the timestamp. Okay, so 543 a.m. was the timestamp according to this email screenshot. according to the published post on Blue Sky, 5.25 p.m. same day.
Starting point is 00:19:30 So he thought about this for 12 hours, potentially. Or maybe he didn't read it, I guess, when it came in. I assumed he's reading it immediately. Well, I don't know about that because if he took the screenshot, which is in his post, which is at 5.43 a.m. That's the time it had to be, right? Yeah, it had to be that time. Well, or that the time it came in.
Starting point is 00:19:48 Yeah, it's, it's, it's, it's, it's, it's, have to get the metadata, I suppose, from the image to determine if that's true or not. When was the screenshot taken according to the difference between the email received? Let's assume it was right at 5.25 p.m. So maybe no time thinking at all. So I'm off on that. But received it, five and some change in the morning, talked about it on the internet, five and some change p.m. 12 hours difference. So your point is like his anger percolated. He let it build up until it's potentially. I mean, I'm just trying to be an investigator. You know, it's the first 40. hours you know that's the critical moments right if you don't solve the thing i mean it's not a murder
Starting point is 00:20:25 but like you know that's where the first 48 comes from is the first 48 hours well he might have murdered crucial yeah it would be a murder if the air i could be murdered i think so by the way you two investigating tech stuff like this that could be a spin off podcast i could like a couple of detectives come on yeah detective log is his last day on the forest with jerry change his life forever We just registered, honestly, registered sleuths, pluralized, sleuths. So hit us up. Yeah. There you go.
Starting point is 00:20:56 Genuinely, though, that would be a good. Each episode, you look into a different thing. I'm registering it as you speak. Well, I forgot it. But I misspelled it. It's actually sloths. Dot dev, so. Oh, gosh.
Starting point is 00:21:07 Oh, that's different. You're going to have to pivot. All right. So, Matt, if this was you, like, if you put yourself in Rob Pike's shoes, you know, it's early morning, a Christmas day, you've got your slippers on. maybe you're having cookies and milk, whatever you do. And you get this email. And he obviously has a disposition towards AI.
Starting point is 00:21:25 Okay, so adopt that. But do you, what do you do? I've probably reached for the guitar. That's my way of just dealing with... Yes, 100%. I was hoping you would say that. I really was. If you're dealing with anything emotional like that,
Starting point is 00:21:39 I understand it. Like sometimes, even spam emails, sometimes I get so angry at spam. Can you please put the word, sleuth in there. And pluralize it if you can sing pluralized sleuth. If you can even sing that, can that be something?
Starting point is 00:21:54 Whichever one rhymes. It's like Eminem rhyming with orange and porridge. Yeah, it just depends on how you pronounce it. I know, he's good, isn't he? We'll do a song about Rob Pike getting the email and he's livid. Okay. Did you ask the AI to be kind?
Starting point is 00:22:19 It's meaningless and less. You're a machine. Why oh why did you send a nice thing to Rob Pike? It's a good job your soulless or rough reply would tear you down As it is you're unaware or you don't care or car Just remember to use AI for good or not bad Remember to use AI for good and not bad Just try to use AI for good or not bad
Starting point is 00:22:51 It's better if you do and you won't be a sloth sleuth for sleuthy-doo Just use AI for good and that bad. Yeah. Wow. Use AI responsibly. Yeah, very nice. Fantastic work. Your ability to chain together meaningful words during the song is uncanny.
Starting point is 00:23:12 I love it. Thank you so much. That's very kind of you. And that song was brought to you by my NASA mug, which is a... Slightly blurry, but we love it anyways. It is blurry, yeah, because I don't want to promote the brand too much. Right. But it's a modest mug.
Starting point is 00:23:24 If you hold it in your right hand. hand. You can't, you know, I see the logo. The other people don't. If you left-handed, it is a braggy mug. But I like the modest mug. That looks nice. This is Japanese ceramic. Yeah. Technically, they call this color black. Right. It makes sense. This is not black. This is not black. Fascinating. What do you want then? If it says that, you want it to be no light coming at all. I wanted to be more black like yours. Yeah. That's black. It's more like brown black. So you just got this engineering meeting.
Starting point is 00:24:01 All the ideas are great. And now you've got to go through your notes, this time-consuming part, read them all, digest them all, add action items, put it in the right places, tag team members. This is all necessary work, but it's tedious. Okay, so flip side that, take that same position, and flip it over into Notion and using Notion Agent. Notion Agent does the busy work for me.
Starting point is 00:24:25 it's like having a project manager that keeps everything on track in the background while I focus on the bigger picture. Getting my work done. Doing my best work. Being the artist. Notion brings, as you know, all of your notes, all your docs, all your projects into one connected space that just works. Seameless, flexible, powerful, and actually fun to use. With AI built right in, you spend less time switching between tools and more time creating great work. And now with Notion Agent, your AI doesn't just help you with work, it finishes it for you.
Starting point is 00:25:03 Notion Agent can do anything you can do inside Notion. It taps into your workspace, the web, and connected tools like Slack and Google Drive to complete assigned actions end-to-end so you can focus on the hard decisions. It's like delegating to another version of you that knows your style, knows your workflow, and knows your preferences. because it learns from how you work. With a single prompt, Notion agent forms a plan, executes it, and will even reassess and try again if it hits a snag. Completing multi-step tasks like creating new pages or databases from scratch, summarizing entire projects.
Starting point is 00:25:43 It does this all in minutes. You assign the tasks and your agent does the work. And since this is all inside Notion, you're always in control. You tell your agent how to behave, and it will remember and update automatically. Everything your agent does is editable and transparent. You can always undo changes so you can trust it with your important work. And Notion, as you know, is used by so many people.
Starting point is 00:26:07 Over 50% of Fortune 500 companies and some of the fastest growing companies like OpenAI, Ramp, and Vercel. They all use Notion agent every day to help their team send less emails, cancel more meetings, and stay ahead. So try Notion now with Notion agent at Notion.com slash changelog. That's all lowercase notion.com slash changelot to try your new AI teammate Notion agent today. And when you use our link, you're supporting our show. Again, notion.com slash changelog. Well, man, I liked both the talent on display, but also your heartfelt message in that song. Use AI for good.
Starting point is 00:26:56 If I was to send a heartfelt message to Microsoft, I would tell them Microsoft, please use GitHub for good. Use it for good. Not for bad. I'm not sure they would hear me. What do you think, Adam? I just want them to stop doing what they're doing. Okay.
Starting point is 00:27:16 Just stop what? What are they doing? You know, just think about the platform you got. Okay, don't rugpole not cool us. You know, don't change actions. It's like the best thing you've done in so long. You know, don't mess with it. Just keep it.
Starting point is 00:27:33 And I guess they did, right? They back back. Tell them the full story, Jared. What's the full story on that one? Oh, gosh. I don't know the full story, but the TLDR is that Microsoft announced GitHub pricing changes around actions.
Starting point is 00:27:48 And the change in particular that ruffled the feathers of the hacker community was the addition of charging for your self-hosted runners. So runners that GitHub does not themselves have to host, and yet- Don't you dare charge for those? New fees for using self-hosted runners. The people revolted. Many text areas were filled with bleeps and submitted.
Starting point is 00:28:17 And they did walk it back. They walked it back. I don't know what's happening now. I think they just decided not to do it until we forget for a while. I remember Reddit did that one time. They just like, now's a change, and then everyone's like, this will be terrible. And then they're like, well, we're not going to do it right now. And then they did it like six months later and nobody noticed.
Starting point is 00:28:34 So maybe that's what's going to happen this time, you know. I mean, they should have at least done it as a PR and let everyone comment on it before it got merged. Let everyone review. Yeah. Pick like jury duty that have select people that are top contributors to open source. Well, they have a version of this. So they have a community org. So getup.com slash orgs slash community.
Starting point is 00:29:00 And they have discussions. They didn't, to my knowledge, begin with a discussion, although they do say in one of the posts to let's talk about GitHub actions. And near the bottom of it, there's a headline that says, Help Us Shape the 26, that's the year we're in, roadmap for GitHub actions. And so they are at least in this moment. I'm not sure this is before the debacle or post the debacle,
Starting point is 00:29:28 but there is some sort of request for. Should we sleuth it? Shaping. Yeah, let's slueth it. What's the published date on that thread? Let's see here. It looks like December 11th, 2025. That's pre-debuckle, right?
Starting point is 00:29:45 Yes. The debacle began on December 15th, 2025, when they announced pricing changes for GitHub actions. So that's four days prior. Not much time to discuss, but I don't know. No. I think who was it that we had on the pod? It was CTO when they released actions. He was in the pot a while back.
Starting point is 00:30:04 He since stepped away. Ryan Daigle? No. Join us for another exciting episode. Should we sleuth? Adam Locke and Jared Change. Together are the ChangeLog detectives. They ask each other questions and they don't know the answers to.
Starting point is 00:30:18 Leading GitHub to a $7.5 billion. our acquisition, Jason Warner CTO Get Up. Thank you for our search results there. To our good friends over at TypeSense. TypeSense is one of our partners to. Give us awesome search. Why bringing up more successful guests that you've had?
Starting point is 00:30:34 Why bringing that up in front of me? Just to keep your ego in check. Contextual, man. Contextual. It's the word of the year. Word of the decade, maybe even context. So contextually, Jason Warner, he helped the world have GitHub actions.
Starting point is 00:30:48 I think it was actually one of his brainchilds. It was this whole entire CIS CD pipeline flow that they built out was even part of the reason to acquire GitHub, you know, Microsoft's acquisition process. So that podcast covers all that. But, you know, I believe like GitHub actions is, it's become so much so that whenever you're in your LLM, so not you building the software, your agent, your agent led, and you're building something out and you have to do observability or you have to deploy it or whatever
Starting point is 00:31:20 it might be to get it into production, the first thing it says is let's set up your GitHub actions workflow. And so it's become the default by and large for everybody. And I'm cool with charging for products, by the way, totally cool with it. But it seemed like this was a tax on those who want to do runners externally from GitHub. And it seemed like it was like, who wants to make an announcement? a week from Christmas. Like, just don't do that. Even Docker made an announcement,
Starting point is 00:31:55 which we're going to talk to them soon about like their thing as well. But like don't announce things mid-December on. Like wait to the new year. Be a stand-up company and just release when people are paying attention. Don't be, don't be sleuthy. Don't be sleuthy.
Starting point is 00:32:13 Yeah. So there is that post, Jared, you sent me a post. someone's complaining that GitHub has this monopoly. Right. Because what happens when GitHub goes down, you know, it does affect everything. Like, it really does. The thing is, it kind of was the,
Starting point is 00:32:29 it was the best choice, I think, for a long time. And it was just the easiest to use. So the user experience was just kind of there. And then obviously it got into loads of integrations to it. So it became, it did sort of earned its place. But then it still is kind of, does kind of have a monopoly. A lot of Go packages and things are on GitHub,
Starting point is 00:32:52 and you wouldn't be able to pull them down, of course, if GitHub's down. We have seen a trickle of people starting to move other places, the most noteworthy of which I think is the Zig programming language, which moved to Codeburg recently. Codeberg. Codeberg, which you bring down the iceberg. Berg. Yeah.
Starting point is 00:33:16 Code. Right. And I'm just going to bring down GitHub the Titanic. Is that the metaphor? Is it? Yeah. Nice. That's good in it.
Starting point is 00:33:24 Yeah. And you only see that's at the tip and everything else is on the surface, you know. Right. Right. Yeah. Which you should know about. Max, that's over there in the,
Starting point is 00:33:35 well, they might be in the EU, which you're no longer in. It was on the way, wasn't it? They left the UK and they were going to the US. They were. They were coming here. A lot of controversy. you run that true. Not that I'd say that at Kodberg, the platform.
Starting point is 00:33:47 It's a European Union platform. It is true. That's true, too. Yeah, so I guess it worked in both ways. That's not quite a double entendre. It's kind of like a... It's like a triple. True.
Starting point is 00:33:58 Yeah. Especially, it's four if they mean the lettuce as well. Triple stamp, a double stature. You know, there's a type of lettuce called Iceberg lettuce. Do you have that in the U.S.? Oh, of course. I feel bad for the listener. I'm sorry, listener.
Starting point is 00:34:10 Let's go to the iceberg lettuce. Go ahead. I'm sorry, too. No, I'm just saying it could be that. It could be that's where they incentivized their name. Like their name came from. They just love good salad. Because they want to be associated with lettuce.
Starting point is 00:34:21 They love the smell of salad. If you love the smell of salad. You're going to call your project something salad related maybe. That's what I'm thinking. I would. But the main thesis, I think, from, let me double check the name so I can be accurate here. Lionel. Oh.
Starting point is 00:34:37 Drycott. Now that's my Americanized Texas slang, if that's even a thing. Right. another downtown is it? If it's not Dan tan tan. Dan tan. You got to go Dan tan. Who's downtown? Come on now. Come on now. Bring it back.
Starting point is 00:34:54 You'll never know. We'll never. We will never know. But Lionel wrote this post and the hypothesis or at least the thesis was this was that GitHub's near total dominance over open source hosting has become a dangerous monoculture that makes alternatives invisible, not just less popular. Kind of an interesting phrase there, invisible. Codeberg, okay, they didn't see it coming. Matt didn't even know it existed. That's true, yeah. I don't go on ships. I don't go on many ships.
Starting point is 00:35:26 Yeah. I haven't been on a ship for ages, so. And I think he's a teacher or the teacher, right? Is it teacher? He's teaching students. Students couldn't, while being told to do things in open source, they only use GitHub or a large majority use GitHub, while also being taught that there's not just...
Starting point is 00:35:45 Like 99%. Not a large majority. Like pretty much 100%. Pretty much 100%. So I agree with you, Matt. I think it earned it. I think it earned its monopoly. Mm-hmm.
Starting point is 00:35:55 It dominated for many years. I think it's stagnated as well. I think both those things are true in the hands of Microsoft. I think that they don't care about it. They care about pushing co-pilot into every orifice of their corporate body and into our wallets. And it's gotten worse and worse and worse and worse. And it will continue to do so as people slowly move away from it until they all move away real quickly one day.
Starting point is 00:36:24 Do the competitive products start out using GitHub? Do you think? How so? When you start, what are you going to do? Make a GitHub repo. Go from there. Got to start somewhere, ain't you? And even Go is written and see until they could self-host.
Starting point is 00:36:40 Yeah. I just made it. Is that true? I think it's true. Yeah, that's true. Go was all written in C until Go was good enough that it could then be written in Go.
Starting point is 00:36:48 Exactly. Which is amazing. It is cool. I've always loved self-hosted languages. But yeah, I'm sure they have to, but the nice thing about DVCS, for those of us who like acronyms. Oh, I do.
Starting point is 00:36:58 He distributed. Is it decentralized or distributed? Version Control System. I think it's decentralized. It's both. I think it stands for distributed, though. Is that you could just have multiple origins or not origins, multiple remotes, you know, got your locals.
Starting point is 00:37:14 Like there is no necessary, like, yes, your main remote is your origin is GitHub right now, but it's so easy to just change that to something else. Switching off, if you're just using Git and those flows, it's actually super straightforward. However, if you're using actions and sponsors and PRs, whatever, whatever else, projects if anybody uses that still then it becomes much harder yeah it's the gravity that's what they call it get up universe it's a there's a gravity to it honestly i mean there really is and so what do you do you fight the system you know do you if you're launching something just complain and it's in open source sure you can have multiple remotes of course that is totally possible
Starting point is 00:38:03 and you could it's your prerogative but if the users aren't there what's the point well it It may take a tooling change to actually be significant. And I think Git is entrenched at this point because of agentic coding especially. It's entrenched. However, the further you get away from that tool, the easier it is to have your agents just go use something else. You don't care anyways. And so maybe that's not always the case. However, the reason why GitHub became what it was was because it put all of these collaborative
Starting point is 00:38:34 features around a new tool-ish that was. already getting popular. And so they kind of popularized each other. And so maybe as JJ is now Jiu-Jitsu, the cool new tool of the bleeding edge folks, I haven't tried it yet. I don't like to bleed as much as others, but people are loving it.
Starting point is 00:38:56 Maybe a platform, not a centered around Git, a centered around JJ, which could also support all the Git things, which is currently the way JJ rolls, right? Like, it's like a superset, I think, of Gets' abilities. has a chance to deceit GitHub once and for all.
Starting point is 00:39:14 What do you think, Matt? Yeah, I think maybe the whole thing, the whole paradigm will change. Yeah, we just don't care. Maybe he's don't care. Yeah, or it's just kind of different. Maybe it's just a list of prompts all the way down. But I don't know.
Starting point is 00:39:27 It's just unlikely, isn't it? There's some fundamentals we're probably going to stick with. We have this thing because we're building, of course, the Grafana Assistant project, which is an AI tool. Okay. And, you know, it's built into the, so it basically can write all the queries for you. You just ask it telemetry questions in natural language. Love it.
Starting point is 00:39:45 It's so good. The team that built it. Because I've used Grafana and I love the outputs, but I hate trying to query the thing and like, is it Loki? Is it something else? I don't even know. I always have to go to Gerhard and say, how do I write this query, Gerhard, write it for me? I just want to tell the thing, hey, show me like the 99th percentile request over the last hour.
Starting point is 00:40:05 Not only that, though. Investigate why this is spiking. Sluze it. It can go off and sleuths around. It's like the change log detectives, Adam Logg and Gerard Cheange. French. I'm not friends. I'm not French.
Starting point is 00:40:18 I'm not friends. I'm not. I will crack this case in no problem at all. I love it. That's here. Gerard. Gerard change. I'm not.
Starting point is 00:40:30 Keep going. Don't stop, Matt. Go further. You're offending me. I need more. I'm not. offending. It's not offensive to be something that you're French.
Starting point is 00:40:38 I'm Italian though, so it is. Oh, okay. So please, change it. Italian. Italian American. Yes, please. Go ahead. We call that American. Give me the Italian version. Oh, yeah, but the
Starting point is 00:40:52 problem with Italian is it's very easy to do on this stereotype. He did it. It is. We all end up sounding like a Mario. I'm hiding from the camera. My face is too red right now. I can't take it. No, but it's not. I do it with love.
Starting point is 00:41:06 I love different accents. I said this before. I say it every time to keep out of trouble. I love accents. You have to say. You should get that tattooed on your back. I have. I have got it tattooed on my back.
Starting point is 00:41:16 Wow. How did I know that? Well, yeah, you know how you know? No. It's a family show. I've never met, I've never even met Jared in real life or you, Adam. Well, we tried one time, but it didn't work out.
Starting point is 00:41:28 Yeah, we've never met in real life. No. It's only on this through this telly. Do you refuse to come to the U.S.? I can't remember. Oh, one time I didn't want to go. You refused. Well, I didn't want to go.
Starting point is 00:41:41 I didn't want to go as a refusal, right? It was. No. He said no. I want to write the headlines, though. Not that. But, yeah, this year. Maybe this is the year.
Starting point is 00:41:54 Maybe 2026 is the year. Didn't you all, we just, I think we just shipped an episode of Big Tenth, and you guys were talking about this, weren't you? Like the most recent episode of Big Tent? Yeah. Yeah. The Big Tent is Grafana's podcast all about the people, community tools, and tech around observability.
Starting point is 00:42:12 I was reading the transcript and I was like, and I was watching it too because that chapter and stuff like that. But it was a good part. I like that. What were you talking about? I'm lost. You're talking about coming to the States? No, the AI system they're built on top of.
Starting point is 00:42:23 Yeah. Thank you. Both. Well, both because we were a fully remote company. And didn't you just have a Gryfana con or something? We do also have these conferences. There was Grafana Khan in London. If you can get to a Grafana con, do come because they're so much fun.
Starting point is 00:42:37 And I host them. And it's very fun. We all have a good time. People usually end up trolling me on the Slydo. That sounds nice. Which is fine. No, it's fine. They're funny.
Starting point is 00:42:47 So I read out. I read them out. That's where the material comes from for the show. I really appreciated the clip. I believe it was Ivana. Kakova. She said she tried to cover the fire with wood. And Tom's response was just typical, Tom.
Starting point is 00:43:04 I love it. It was awesome. He's like, did you learn anything? She's like, of course. Yeah, she had a home lab where she was like doing, like she was measuring fire and like, measuring like, I don't know what I'm liking a lot here for some reason. But yeah. I think it was a candle so that she could put the candle out. And the, but the things she was putting the candle out with was made of wood.
Starting point is 00:43:24 Yes. So don't use wood to put out a fire. Well, this is what, this is the culture we have at Grafron Labs. it's okay to make mistakes. Not only is it okay, it's kind of expected because if you're not making mistakes, you're probably being too careful. So we have a journey.
Starting point is 00:43:39 This is a real culture thing that we talk about at Graphana Labs where we have an error budget for our services, and if you're like at 100% all the time, it means you're probably not innovating enough, you're not doing enough. It's a sign of something maybe, or they're just amazing engineers,
Starting point is 00:43:55 which could be because there are some phenomenal engineers. But Ivana, Huchkova, She works on the assistant itself on the front end of the assistant. So, yeah, she's part of the team that built that. And we've heard from leading AI companies that I won't name. They said this is the best implementation or the best use of their models that they've seen so far. Which they love because they're waiting for people to use them well, you know. Yeah, they are.
Starting point is 00:44:22 They're very excited. They're very excited. And honestly, like, this is a good use case for LLMs, for sure. Totally. But if GitHub goes down, it doesn't work. We can't deploy stuff. I don't know how we deploy stuff without getting. Can you be more specific?
Starting point is 00:44:37 Am I, did I miss the, what exactly you've done? Can you give us a 30 second version of it that's not marketing speak? Or on the assistant? Yeah, the assistant. Like what exactly, I know you can query Grafana. Grafana is a visual tool primarily. There's not a lot of querying, to my knowledge. You write the queries to build the dashboards.
Starting point is 00:44:57 Right. Once you've built your dashboards, but that's okay if you've already thought of something. But if something new is happening. And also if you just don't know where some people have loads of dashboards. This is basically an LLM integration. So it's chat experience. And then all the tools, it has all the tools that you have in Grafana. And it knows how to navigate around and things.
Starting point is 00:45:16 So you can ask it questions and it can not just take you to the right app to look at it, but applies all the filters in the URL parameters and deep links you right to that view. So it's awesome. It's, yeah, it changes. And it makes it so that anyone can now use, that get value from the telemetry data. Whereas before, you would usually have to go to like the resident SRE
Starting point is 00:45:35 or the little team that know it all and ask them. Now you can self-serve a lot and free up that. And they're no-it-alls, so that's the worst, you know? Yeah. Yeah, because they have to, they do, it's hard. Like some of the, I never learned the log, Loki's log QL language. I just kind of go to the docs and make it up.
Starting point is 00:45:56 Yeah. and then edit and things. Same. Same. But whereas now you just don't need to at all. That's beautiful. Yeah. And then because it can query data, you can ask it, like, why is the shopping cart slow?
Starting point is 00:46:08 And it'll look across all your signals as well. So look in logs. It'll look in metrics. Even if you've got traces and profiles, a lot of engineers don't know how to really get value or use profiling or tracing properly. So you don't need to, the assistant, we taught the assistant how to do it all. Grafana doesn't work if GitHub's down. Well, we can't deploy changes if GitHub's down, I think.
Starting point is 00:46:30 I think we probably can manually, but all the pipeline and everything flows through GitHub. Have you ever considered making that some sort of redundant thing so that you're not so dependent? It probably is, to be honest. I'm just saying, like, the normal flow, yeah, you merge to Maine, and then that gets, we deploy to a lot of instances immediately.
Starting point is 00:46:48 What we need is an actions mirror. That's, you still, I guess actually, that would suck. Like, you still deployed a GitHub. I was thinking like a web hook. Once you've deployed to Main or whichever your branch is, and then something's watching your repository. But that assumes, again, that GitHub is up. So your change can't get there to get web hooked out to an external actions runner, even.
Starting point is 00:47:12 Or a watcher that mimics what actions does. Yeah. Yeah, that's the need for Codeberg or this decentralized, this double DVCS thing. Like we just, this monopoly, this. centralization. Come on now. Stop centralizing. We just need some competition. That's all we need. Some legit competition. All right, JJ. Bam. Time to get it. Time to get it. What, you know, I don't know a lot about JJ, but what I've understood, do you all know much about JJ? All I really know is that they're like similar to get and they can like run alongside of it until it's time to take over.
Starting point is 00:47:50 That's like my synopsis of how JJ operates today. Yeah, I don't know. I'm excited to more about JJ. I don't know who they are. I don't know who they are. What about you, Jared? Do you know anything? You got a different opinion or a deeper opinion? No, I do know that the command line tool
Starting point is 00:48:09 can do all the things Git can do, but then it also has its own superset of it. And so it's easy to try out. Like TypeScript was easy for JavaScript developers to get tricked into. JJ, you know, can trick you right into it. And it's nice that way. It's very nice.
Starting point is 00:48:26 Yeah. Anywho, but I've never used it because I don't have any complaints about Git. I've just used it for multiple decades. And so I can't, I've been through the pain. I know I have complaints about it from the beginner standpoint. But I also now when I get stuck, I'm just like, yo, Claude, can you just rebase this for me? And then he does it. Here's an idea.
Starting point is 00:48:49 Let me word smith or think through this guy's with you guys here and alive. What if we had a pre-GitHub load balancer where you committed to a thing that was meant to be a load balancer. Your primary host is GitHub, but you still have downtime, of course, but you're not on GitHub's downtime. And it's not monopolized by GitHub. So what if you had a load balancer that you pushed to,
Starting point is 00:49:14 but ultimately the transaction landed at GitHub to run your actions and your runners and all those things? But what if they could also webhook somewhere else and say, hey, runners over here can still. do it where that way if GitHub is down, you have sort of one layer before that, kind of like the low-bounser effect to your Git pushes. I don't see why you couldn't do that. Get like a proxy that pushes to multiple places.
Starting point is 00:49:37 It's all in sync. It'd be a fight against it at least. It'd give us a chance. If downtime is your major concern. Yeah. Well, my major concern is in downtime. It's crappy time. I don't know what you call it.
Starting point is 00:49:47 It's just like their bloated JavaScript UI, the fact that like copilot showed to my face everywhere, stuff like that. is what bothers me. Like actually using the website is slow now. It used to be super fast. It's like way too much React code, I think. Those are the things that bother me more so than GitHub being down. But obviously, that is a problem.
Starting point is 00:50:07 And we can't deploy when GitHub's down either because we don't have this proxy that you're talking about. But I'm certain that it's a good idea and that they probably exist out there. Like a green blue almost in a way. I'd say a green blue to your Git repositories. that way, if ever you did need to eject from the GitHub world, you could. And it's like, it's almost what they're doing with with passwords and quantum mechanics and stuff like that, like the quantum computers. They're making defense towards, you know, encryption. And there's a lot of effort.
Starting point is 00:50:39 I just saw this talk at GoferCon, actually. Thank you, GoferCon for publishing your talks most recently. I can't recall the person who did it. I'll put it in the show notes. but it was a talk on defending against, I believe it was FISO 140 or Phipps 140, I think is what it was. And it was essentially like defending against the future fact
Starting point is 00:51:03 that the encryption we have will eventually be de-encrypted by quantum computers. It's almost like that. It's like a defense against the future. This green-blue effect we can apply to Git hosting in general, not just GitHub. I think you could pivot and do that. It's just keep,
Starting point is 00:51:20 keep it called the change log still works. Yeah. Bam. Good idea. What do you do when GitHub goes down then? What's your favorite go-to? Because you can't work. Walk.
Starting point is 00:51:32 Go for a walk. Take a walk. Get a walk. Get-E. Get tea. Go and get some tea? Take a shower. Well, I mean. As a prompt.
Starting point is 00:51:41 It's a shower finally. Um-Tee. Yes. Time to shower. Have a shower. Have a shower. Well, your only shower when GitHub goes down. I use a prompt, you know.
Starting point is 00:51:49 I mean, I shower like I change my underwear. You know, monthly. Okay. Fue. I thought you were going to say something gross. TMI? I don't know. What do you do?
Starting point is 00:51:58 Do you sing a song? I always go reach for the guitar if get up goes down. I know. Yeah, it's my go-to, really. My favorite keyword in Go. I use all the time. Is that a real legit go-keyword? Because, you know, go-to considered harmful.
Starting point is 00:52:15 It is a little harmful. I'm taking a walk. Charge me. I'm taking a walk. I forgot what a tree was, I thought it was a thing with a light on top I forgot what tea was, but that's because I'm Tito Tell now I'll go outside When GitHub goes down
Starting point is 00:52:57 I smell the flowers When GitHub goes down Birds and treading turds and meet with nerds When GitHub goes down If GidVCS If GitHub goes down I don't have goes down But if Google Maps goes down
Starting point is 00:53:17 Then I won't go outside Because I'm not entirely confident I'll be able to find my way back I use Google Maps way too much I'll be becoming too reliant on technology Woo That's a double alt toucher too I mean it's not just saying downtime
Starting point is 00:53:39 It's like when they go down Like as if it's a demise You know We're predicting a potential demise here That's the end of a titan Somebody on Titanic In this pod Triple Double Staple
Starting point is 00:53:51 Yeah. That entangra yo. Well, friends, this episode is brought to you by Squarespace, the all-in-one platform for building your online presence. Whether that's a portfolio, a consulting business, or finally shipping that side project landing page, you just have been meaning to do. But never get to. Here's the thing. You mass-produce code on the daily. You deploy new services, new infrastructure, new hardware, your versioning your APIs, you're simmering all over the place.
Starting point is 00:54:20 But when someone asks you about your own. personal website. It's like, ah, I'm still working on it. Does that sound familiar? Squarespace exists so you don't have to treat your personal site like a weekend project that never ships. Pick a template and drag and drop your way to something that actually looks good and move on with your life. No wrestling with CSS. No, I'll just build my own static site generator again. It's just done. If you do consulting or freelance work on the side, Squarespace handles the whole entire workflow. Showcase your services, let clients book time directly in your calendar, send professional invoices, and get paid online. It's the boring infrastructure that you don't
Starting point is 00:54:57 want to build for yourself. And for those of you out there who are doing courses or gated content or educational stuff, tutorials, workshops, that intro to whatever series you keep talking about, you can set up a membership area with a paywall and start earning recurring revenue. Set your price, gate the content, and you're done. And they've also added Blueprint AI. This generates a custom site based on your industry, your goals, your style preferences. It's not going to replace your design skills by any means, but it'll get you about 80% of the way there in about five minutes. Here's the call to action. This is what I want you to do.
Starting point is 00:55:33 Go to Squarespace.com slash changelog for a free trial. And when you're ready to launch, use our offer code, change log, and save 10% off your first purchase of a website or a domain. Again, Squarespace.com slash changelog. Who knows what's going to happen in 2026? Like, it's unwritten as the... Back to the Future is my favourite film, genuinely. This hat is actually a Back to the Future hat from Back to the Future too.
Starting point is 00:56:02 Is it? Yeah. It's lovely. I'm very jealous of that hat. Guess who got me this? French friend. Your wife. French friend.
Starting point is 00:56:10 Oh, your French friend gave you to that. Yeah, so he is a friend. Absolutely. Yeah, if he's bringing your hat. That's an amazing gift. Isn't it? I'll show him this, because he's a friend. he definitely doesn't subscribe.
Starting point is 00:56:20 Well, I know somebody who thinks they know what's going to happen in 2026. Who knows? Tomasz Tunguz. I'm sure that's how it's pronounced. I am not sure that's how it's pronounced. But if you go to tomtunguz.com slash 2026 dash predictions,
Starting point is 00:56:40 you'll find that Tom, or Tomasz, as I like to call him, has written 12 predictions for 2026, and he wrote these, you guys won't believe it, in 2025. Does that make sense? It would be weird if he wrote it in 2027? It'd be too easy. It would be way too easy if he wrote it in 2027.
Starting point is 00:56:59 In fact, that should write my 2025 predictions tomorrow. You should actually publish it. Most people aren't sleuths like you detect your log and change. Gerard change. Oh, look at the dead 10 times temp. Not everyone does. That's a great sound to hit. Slandering me as French.
Starting point is 00:57:19 That's the offensive thing That's the offensive thing It really is It's hilarious What's this post about you? This post is Tom Who is a venture capitalist So he's a venture capitalist
Starting point is 00:57:32 So he's a venture capitalist at Theory Ventures. Tomasz? Yeah, Tomasz. But the website is Tomahs, so I'm going to call him Tom, because I don't know if Tomasz is correct. It could be Tomas.
Starting point is 00:57:42 It could be Tomazzi Duhast Yeah Or do host I do like I like these names that they've got like a Zed in and he's got two
Starting point is 00:57:51 Zs or Zs at the end our German friends out there are just saying like you hate what Tom? Yeah. I saw a German keyboard
Starting point is 00:57:59 the other day and some of the letters are mixed up. It's like the bloody Enigma machine. I had to crack it crack the code before I could use it.
Starting point is 00:58:07 I made that joke in front of Germans and it was just deadpan. It did not. Did not go down well? Nein. Okay.
Starting point is 00:58:15 So Tom has written 12 predictions for 2026. We thought it would be fun since it is the new year and Matt has all these resolutions. I'm still rocking the same resolution I had last year, which is 1512 by 982. Yes. It's classic.
Starting point is 00:58:31 Classic joke. You could have done that low-res eyes before you had your eyes fixed, when you first saw a leaf for the first time in your life. And you were like, oh my gosh, there's more pixels here. That's like when I got the Retina displays for the first time. We were like, wow. Yeah, it's true. The leaves were really sharp.
Starting point is 00:58:48 Yeah. We really did. And we thought it would be fun to go through some of these. We don't have to hit all 12, but a few that we think are either right or wrong or we have comments. You know, if you have comments, sound off as I read them. How should we do this? Should I go, I'll read all 12 and then we'll stop. I'll pause.
Starting point is 00:59:06 If you guys have anything to say. Matt, say nothing. Read all 12, Jared. If not. Okay. You see I preempted that here? Matt, say nothing. Well, he just said to talk.
Starting point is 00:59:16 That's why I was going to talk. Well, it's not fun. No more talking for you for just a minute here. You prefer when you sing. Your songs are great. Yes. Commentary, you know. For the listeners' sake.
Starting point is 00:59:29 Like, if we could have a swap, if we could have a mid-show swap, you know, like, man, if you had a tag team, maybe Tom Wilkie could be your tag team. And like, Tom could talk and then you could sing. That's a good idea. Let's swap out Tom next time. No, the other person could be worth. Just right in the middle. No announcements. Back to the devil you know.
Starting point is 00:59:45 All right. So speaking of the devil, what we know. what we think we know. Tom thinks that in 2026, businesses will pay more for AI agents than people for the first time. Pausing for comments.
Starting point is 01:00:00 Oh, I thought you were going to read all 12. Oh, I thought I was going to read each one and see if you guys comment. I'll read all 12. Here I go. Go fast. Number two, 2026 becomes a record year for liquidity.
Starting point is 01:00:11 That's very exciting. Number three, vector databases resurge as essential infrastructure in the AI stack. No. Not to be confused with the Adam Stack. Number four,
Starting point is 01:00:21 AI models execute tasks autonomously for longer than a workday. Dang. We're working hard for their living. Number five, AI budgets receive scrutiny
Starting point is 01:00:31 for the first time. Number six, Google distances itself from competitors via breadth. breadth in AI. I'm not sure what that means. That's Breast E.
Starting point is 01:00:43 Number seven, agent observability becomes the most competitive layer of the inference stack. Okay. Number eight, 30% of international payments are issued via stablecoin by December. I don't want you to put December in there. We know the year.
Starting point is 01:00:58 Okay. Number nine, agent data access patterns stress and break existing databases. I can definitely see that one coming through. Number 10, the data center buildout reaches 3.5% of US GDP in 2026. And number 11, the web flips to agent-first design. Say it ain't so. Say it ain't so. Number 12, and the final prediction of Tomasz Tunguz, is Cloudflare becomes the gatekeeper for agentic payments. There you have 12 predictions for 2026. Which ones should we talk about? Let me just say whatever his portfolio is,
Starting point is 01:01:37 I want to invest. Okay. Because these are all great predictions. Okay. You like these? Fantastic. Yeah. I'm not in my head to like, I'd say 10 out of them, 10 out of 12 for sure. and I'm not iffy on them. I just like no less. Yeah, I can't see one that I like hard disagreed with. I don't know about GDP, you know, obviously. That's like a specific number. I'm not sure about the stable coin one,
Starting point is 01:01:58 but I can see it happening. Yeah, I don't like hard disagree. I don't like number 11. The web flips to agent first design. In fact, what exactly does that mean? I'm going to say that's not true. That's the one I'm going to disagree with.
Starting point is 01:02:10 Break that down for me and for the listeners. What does that mean? To you at least. Well, when you create a website today, what do you think about? What do you design it for? We design it probably for a mobile device first and then also for a desktop, right? And then maybe you think we also need an API because we need to have programmatic access to our website. But you're human first for sure and you're probably mobile first.
Starting point is 01:02:36 Yeah. Maybe you do it all at once. But he's saying that in 2026, the web will flip to agent first design. So the first thing you're going to think of, the number one thing you think of when you start a new website first is how will an agent use this? And then comes everything else. I hope that's not true. I can see it being like sometimes the case for certain websites, for certain uses, but like for all the web, the web. He calls it the web.
Starting point is 01:03:04 I'm thinking that's an awful lot of change quickly for the web, which generally moves somewhat slow. Some pushback on that. I do concur with this thought, but the lens I would shift just slightly, because I feel like every, it's almost like behind every good man is a good woman, that whole schtick or phrase or whatever. It's like behind every great human who's doing great work is, or maybe in front of actually, since we're front loading this, in front of every great human doing great work is an agent. So I almost feel like you're helping the human do better by being agent first. So I'm sort of, I'm conflicted there because that's how I'm thinking too,
Starting point is 01:03:45 really. I think that if this last year has told us anything, people like agents, developers like agents. I think people like agents, and people are going to start using Claude Code who don't even code software. They're going to be coding pros or whatever. That's where my lens is at.
Starting point is 01:04:05 I'm thinking like if you're not keeping the agent in mind, then you're in the past. Like you have to think about, I'm not sure I would say quite agent first, but definitely agent as well. And almost everything, everything. Even like simple CLIs are human first, right?
Starting point is 01:04:27 You have to, like if you have an error from a CLI, like you threw a command out there and you misused a flag, have a proper error, not just for a human, but for the agent, that gives them context. Hey, you meant to do this, or this is what that does. Or here's documentation, and you give them context. I think that's where my lens is at too. So I'm not sure if it's for the web and, you know, agent first necessarily, but you definitely have to be thinking about every new interface with agents, for sure. And hands down.
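For listeners who want Adam's CLI point in code form, here is a minimal sketch of an agent-friendly error. The tool name, the flags, and the docs URL are all made up for illustration; the idea is just that the error carries its own context, so a human or an agent can recover without guessing.

```go
// A minimal sketch of the "agent-friendly CLI error" idea described above.
// The tool name, the --regon/--region flags, and the docs URL are hypothetical.
package main

import (
	"fmt"
	"os"
)

// usageError carries enough context for both a human and an agent to recover:
// what went wrong, the likely fix, an example invocation, and where to read more.
type usageError struct {
	Got        string
	DidYouMean string
	Example    string
	DocsURL    string
}

func (e usageError) Error() string {
	return fmt.Sprintf(
		"unknown flag %q\n  did you mean: %s\n  example:      %s\n  docs:         %s",
		e.Got, e.DidYouMean, e.Example, e.DocsURL)
}

func main() {
	// Pretend the user (or an agent) typed `mytool deploy --regon us-east-1`.
	err := usageError{
		Got:        "--regon",
		DidYouMean: "--region",
		Example:    "mytool deploy --region us-east-1",
		DocsURL:    "https://example.com/docs/mytool/deploy", // placeholder URL
	}
	fmt.Fprintln(os.Stderr, err)
	os.Exit(2)
}
```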
Starting point is 01:04:50 And hands down. I agree with that. I don't disagree with agent as well, which is what you said. I think agent first in 2026 is too fast. And also I think it's both too fast to call and I think it's too fast to do. I think agent as well makes a little sense. Matt, what do you think? I kind of get this.
Starting point is 01:05:09 for sure. Because we did, we've been through transitions like this. Mobile, you mentioned, that was one of them. It used to be web. Before that, you'd be building for desktop. And it is about how the users are interacting with it. That's always been the most important thing about technology. So, yeah, I could imagine, like, if it's like booking something, or if it's like a hotel or any of that stuff, I feel like total, that's all gone, basically. It's going to be presentation of something, like, here's what I found for you. Here are some images and things. But you want to, ultimately, you'll just say to an agent like, oh, we're thinking of, me and a few friends are thinking of going here. Can you ask them for their availability?
Starting point is 01:05:47 One of the Fridays coming up would do for me or check my calendar. It goes off and pings their agents to find out their availability and if they're interested in it, you know. And then that all happens. They just get some kind of notification and the agent just asks them a question like a text or it makes the text look like it's come from you, even more worryingly probably. And I think that those kinds of flow. I think will happen and they'll have to happen. But yeah, does this mean you're then not going to have an experience where you are choosing what to present? Agents are all at the moment you are prompting them. You have to ask a question for something. What if you don't know what to ask it?
Starting point is 01:06:26 What if you don't know what it can do? Those kinds of things, I think, is where we'll have some kind of other experience. We had the same thing with dashboards because, you know, someone said, does the assistant project mean you don't need a dashboard now? Because you can just ask the question. But there is something about having being able to go and look at something without having to ask for it and just having an easy way to go and find that. And I think dashboards still sit there. Like they're still, I don't think dashboards are going to go away, for example, but it'll be new things alongside it. It's the hub and spoke model. You always have an API first and you always have a client. It's like if you live in that API first design world,
Starting point is 01:07:08 then you totally get what's happening here, because if you've always been API first and a CLI is a client, the website's a client, and an iOS app is a client, that's pretty easy to sort of grok that direction. So I get that. This kind of conflates to some degree with, you know,
Starting point is 01:07:26 Paul Dix. Jared, he mentioned, trying to figure out the best way to say this, but while I was reading The Great Engineering Divergence, one thing you mentioned was Amdahl's law, which is the principle that when you speed up part of a system, your overall speedup is limited by the parts you didn't speed up. So it almost reminds me of that, in the fact that if you don't think about an agent and you don't think about that first mentality, like even Matt's describing here,
Starting point is 01:07:58 you don't enable yourself, for the things you're building, to be as fast as they can be, because agents are going to be much faster than we are, because they're designed to be. They're a machine. I think if you don't start adopting that idea, you're going to be dwarfed by the folks who do and embrace these new systems that have to move faster, and they retool their entire pipeline towards agentic things that move faster.
Starting point is 01:08:22 And if you stick in the old way, let's just say, you're going to be slower by nature and less fast. So slow to market, slow to think, slow to experiment, slow to fail, slow to succeed, all the things. I think it's, I got to say it. I don't like saying it necessarily, but I am
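A quick aside to make the Amdahl's law framing above concrete. This is only a sketch with invented numbers: the 10x speedup and the 60/40 split are assumptions for illustration, not claims about any real team.

```go
// A quick sketch of the Amdahl's law point made above: even a huge speedup on
// one part of the pipeline is capped by whatever you leave untouched. The 10x
// figure and the 60/40 split below are made-up numbers, purely for illustration.
package main

import "fmt"

// amdahl returns the overall speedup when a fraction p of the work
// is accelerated by a factor s and the rest is left as-is.
func amdahl(p, s float64) float64 {
	return 1 / ((1 - p) + p/s)
}

func main() {
	// Suppose agents make 60% of your delivery pipeline 10x faster,
	// but the remaining 40% (reviews, deploys, meetings) stays the same.
	fmt.Printf("overall speedup: %.2fx\n", amdahl(0.6, 10)) // ~2.17x, not 10x
}
```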
Starting point is 01:08:43 thinking, I guess, agent first, yeah. I guess I'm thinking agent first, honestly. I mean, could you imagine a service that is only an MCP server that you plug in? It doesn't have any other kind of web interface. The only interface
Starting point is 01:08:59 is through the agentic kind of chat. I don't think anybody wants to be that. I don't think any business wants to be that. I mean, nobody wants to be a utility company. And that's why they fight to be not just a utility company. They want to provide services and apps and, like, all that. They want to be your provider, not your dumb pipe. And I think that most profitable and desire-to-be-profitable companies will buck against that as they begin to get commoditized by agents. So I think there'll be a fight there. But yeah, as an end user, of course, we want the simplest,
Starting point is 01:09:34 easiest, cheapest thing, right? That works. And that's where I'm at with 2026. I don't think agents work yet. I think we are living in the golden age of coding and we think that everything's like a coding agent. Agents are not booking anybody's flights, anybody's hotels, anything. It's January 2026. They don't work yet. There's way too many edge cases, too many problems, too many errors. They are writing code and they are summarizing text. And maybe they're helping out with your Grafana stuff, but that's code. And that's kind of where it's at right now. Like, they're all just demos. And so I don't think it's going to move as fast as I should design my website agent first today. I do think eventually you do. I just think that the timing is wrong.
Starting point is 01:10:19 Yeah. Yeah, I think the thing is the speed of change is increasing as well, as more and more people are AI accelerated with their work. So I do think things are going to change much more quickly than they ever have before. And things have changed quickly before. But yeah, look at like Cursor, OpenCode, Zed, Claude Code, all these now AI-enabled things. A lot of their UIs have shifted so that they're agent-first experiences now. Where you're not, the code is kind of an afterthought that you review.
Starting point is 01:10:52 Right. Once you've, yeah, got it kind of going. It is quite interesting. I do think there are questions about what that's going to mean. But there's no doubt in my mind that it's an enormous acceleration of human productivity, and that's why I'm on the side of, like, yes, let's use AI and let's use it for good. And we need to solve all those challenges around energy and the climate, and some of the bad things it can do, and some of the ways that it's, like, quite greedy with taking knowledge from people without kind of giving them credit, and things like this. There's a lot of problems with it. But once you've seen, like,
Starting point is 01:11:20 We've felt it, like, building these products we're building, and also because they themselves are the same kind of thing. Grafana Assistant is basically like Cursor for Grafana. The productivity boost, just the amount of people that that's enabled, is significant. So it's not just a fad thing, you know?
Starting point is 01:11:51 Yeah. I also hate to say that. I agree with you directionally as well. And I think Adam does as well. My only pushback is on the fact that we are, in the perfect, we're like the Goldilocks case for agents. Yeah. Which is software and digital creation.
Starting point is 01:12:08 And like the rest of the world doesn't do that. Obviously it touches everything, but they do all kinds of other stuff that agents have no ability to do yet. And so that will take time. I think it's coming. I think it's coming. We just look at everything through these, like, Cursor and Claude rose-colored glasses, because it's amazing for us,
Starting point is 01:12:25 but it's not amazing for most other industries yet. Yeah. Maybe law is the other one. where it's really making moves because again, that's so formal. And then everything else is just sloppy at this point. Yeah. Yeah, but we are at the forefront usually of change. Yeah, totally.
Starting point is 01:12:42 It has been the driver of it. So it makes sense that we would see it first. Yeah. And we do have to build it for the rest of them. That's our job. And the nice thing is, and the accelerating thing, like you said, is the fact that we are the builders, we're moving faster than we were before,
Starting point is 01:12:55 and so you can build faster. Okay. Another one, what else caught you guys' eyes or ears? Yeah, the businesses pay more for AI than human labor. That was one of these predictions, isn't it? Does that mean that we become the cheap choice again? We can get our jobs back.
Starting point is 01:13:14 I don't think that's how it works. I think that will be the case. Again, of software workers. I think I've seen a lot of the budgets exploding. And I think that at a certain point, one engineer plus an engineer's equivalent of agentic coding is probably better than two engineers. I don't know exactly when that flips.
Starting point is 01:13:33 Again, is it this year or next year? But I could see it happening. Adam. I was just reading it to get more of the context of Tom's, Tom's, his perspective here. I'm going to read it just because it gives me some context. I haven't formulated a thought yet.
Starting point is 01:13:50 So this is kind of a delay. This has already happened with consumers. Waymo rides cost 30, 31% more than Uber on average, yet demand keeps growing. Riders prefer the safety and reliability of autonomous vehicles. For rote business tasks, agents will command a similar premium
Starting point is 01:14:08 as companies factor in onboarding, recruiting, training, and management costs. Also, that's not talking about per capita. Yeah, it's just like you're going to pay, maybe that you'd pay, if you had a choice between the human doing the work and the AI doing the work, you'd pay more for the agent doing it
Starting point is 01:14:26 because of the perceived safety and reliability, if that Waymo option translates there. I'm against, I'm personally, I chose the other way when I was in Phoenix. Okay. Because I wanted to try a Waymo because, of course, but I'm cheap. And so I looked at Uber and I looked at Waymo and it was like an extra $12 to go the same place. And I'm like, I'm just taking the Uber. But I can see where there's circumstances.
Starting point is 01:14:53 Yeah. For instance, if I'm sending my children with, I would maybe, I would trust a Waymo. where I wouldn't trust a random Uber driver with children, for instance. I could see where parents might prefer it. I could see where over time it becomes more safe, especially if you're in an area you're not used to, not just driving safe, but like this person could drive off in a place and whatever me. I can see that.
Starting point is 01:15:15 So while I did make the choice the opposite one time, I could certainly see where I would make the choice to pay more for the robot because I don't trust the humans. Yeah. See, that's just that right there. Yeah. Yeah. And I see, if you rate headline only, you're thinking salary swap, right? I thought budgets, yeah.
Starting point is 01:15:33 Yeah. Yeah, that's where I was torn. I was like, what is the true context here? Again, context. Yeah, I'm torn on that one. I think, you know, I think I would, in the case of a Waymo, let's just use this as an example. I'm in the vehicle, right?
Starting point is 01:15:48 My life is literally on the line. Do I pay 12 bucks more for the assumed? And I suppose if data shows the reliability, and safety is higher over a trend of time, then I would pay more every single time because my life is literally priceless. And if I can't be here, then I can't even care about spending more money.
Starting point is 01:16:09 So in that case, I'd probably spend more every single time on a waymo if the data and it wasn't smeared or tainted in any sort of way, if the data was true and over time Uber was less safe while Waymo was more safe, every single time. I'd pay, I'd probably pay double, honestly,
Starting point is 01:16:26 if I had to, if I knew that. I was going to be more safe and point A to point B would truly happen. And I have determinism in my trip every time. Yeah, for sure. I'd pay double if it flew. If it flew. Yeah.
Starting point is 01:16:40 I don't want to pay double, Waymo. But I'm just saying, like, if knowing the data and the choice, then I'm going to pay more for the thing that gives me more safety and security in a time where my life is literally on the line. You know, in the case of a real business task, maybe not so much. You take it to, like, surgery, you know. Over time, robot surgeons will operate more precisely and correctly than humans will, because they don't have the margin of error. They didn't have a bad night. They aren't tired. Etc. Etc. We've all seen Prometheus, not the Prometheus that you all wrangle over there at Grafana, but the Prometheus, the movie. What was her name, Shaw? I believe her last name, or her name, was something Shaw.
Starting point is 01:17:22 512, the last name, wasn't it? Maybe so. She had to hop in this thing at the end when she was giving birth of this alien and that had to patch her up. She happily got in there and was just like pushing all the buttons. Every single button. Go back to the it's all about the button episode from the beginning of 2025. That was an amazing show. Yeah. I got stuck in that moment there for a second. But she happily pushed every single button possible to have the machine help her deliver this alien baby and patch herself back up.
Starting point is 01:17:55 And then she went and conquered the mission, right? Like, science fiction is kind of predictive in a way. If that's a version of our truth and our future, I mean, she's kind of already trusted the system, right? I mean, that's something I personally have said for almost 15 years now. Trust the system, but verify, right? But verify. I think in the future, if we, if the data shows an AI-assisted or an AI agent surgeon is better,
Starting point is 01:18:24 I mean, I don't want to choose a machine over a human, but in those cases, if the data shows it, then it just makes sense. It just makes sense. It does. I think that is where we're going. We are increasingly going to just be doing that more and more and more, for sure. So we do have to figure out how we deal with that change, because that is an enormous change.
Starting point is 01:18:42 But I trusted the Waymo, when I got in it in San Francisco, I trusted it immediately. And that's probably because I've been brought up on sci-fi films, Johnny Cab from Total Recall, where you get in, the guy spins around. And, yeah, he's a little robot boy, takes you on a little journey. Right. And the key thing about Waymo is you don't have to talk to anybody. You can just not talk to anyone. And that's worth $12 at least.
Starting point is 01:19:12 True. Uber actually do give you an option in the UK. I don't know if you have this, but there's an Uber comfort option. And this basically allows you to choose the music and decide whether they talk to you or not. So you do pay extra for them to not talk to you. Yeah, I saw that option in the Uber app. I didn't know that it costs extra. Maybe I just paid extra and didn't realize it.
Starting point is 01:19:32 I definitely said like, I think it's nice because sometimes you feel like talking to somebody and you're like, yeah, I'll have a conversation with a stranger. Another times you're like, no, I just gotten off an airplane. I want to sit in quiet and get to my place. And so it's cool that you could just get to pick. Like, yeah, don't talk to me. Well, the other factors you're not thinking about it. Maybe it's just not mentioned is like the smell, right?
Starting point is 01:19:49 Humans have habits and they also have odors, and you know, you could be a smoker, you could be a non-smoker, you could prefer a certain scent in your car, right? Uh, S-C-E-N-T scent, uh, not just cent, you know, you're not supposed... scent, the other scent. Forget it. You all, you're smart people out there listening to this podcast. I'm done trying to spell on a podcast. But yeah, I mean, how often do you get into that, or you get the music, right? You've got all these human nature things where you're like, you know what, I'd just kind of rather avoid a human in this moment.
Starting point is 01:20:27 That's a scary. There's comfort in just saying that. And I'm smiling very big if you're listening on the actual audio pod. I'm kind of blushing in a way because I can't believe I'm even thinking like that. Like I'd rather have a ride without a human if I had the choice. Just. Yeah. I'll say.
Starting point is 01:20:44 Because. But there's sometimes too, Matt, like you said, there's times I'm like, I'd love to, you know, I'm down with the humans, right? But I'm also down with the non-humans, because humans smell and have, just, things. Yeah, all the things, you know: opinions, habits, smells, music choices. Yeah. But you see, the other thing about the self-driving taxis is you can have the Knight Rider experience, I call it. It's my new startup, Knight Rider. You go to bed, you get, you know, it's like a little hotel room that's on wheels. You get in, you get in, yeah, you sleep, you wake up
Starting point is 01:21:18 in a different city you then spend the day in that city you sleep the next day you're in another city you're traveling while you sleep. It's the closest to teleportation will get probably because the EU keep ignoring all my letters about I know how to do it. This reminds me of Matt World. Isn't this how Matt World works?
Starting point is 01:21:34 Yeah, I think that was Matt World. One of your inventions was basically the Knight Rider car. But they're building it. It would work. For sure. You'd feel bad asking a human to just drive for 12 hours to take you to Edinburgh. Yeah, you have to have empathy, right?
Starting point is 01:21:48 You, by nature. That's why I like Claude. Claude code better than junior dev, honestly, because it's like, I just don't have to even give you any empathy or anything. Like, I don't have to afford. There's no affordances. Well, there's some true psychology in that, too. It's not just personal preference. So don't feel bad for yourself.
Starting point is 01:22:07 So if you're listening to this, here's your escape hatch, right? It's mirror neurons. So you see this a lot in married couples. As they age, they tend to dress similar, look similar. They don't literally look similar down to the wire, you know, so to speak. But there's mannerisms that sort of merge; it's mirror neurons. Or when you're around somebody,
Starting point is 01:22:29 the reason you have that empathy factor, or you begin to cry because somebody else is crying, is because your brain is literally wired to mirror neurons. It's called mirror neurons. It's a psychological fact. It is brilliant. People do end up looking similar because they'll pull the same facial expressions. Yeah.
Starting point is 01:22:46 I got this frown. I keep doing it. I hate it. Gosh. Yeah. And then so you're working, I'm always doing this face. Because my wife's always doing it. She looks better than that.
Starting point is 01:22:57 Yeah. But then my face changes. The goatee as well? She does, yeah. That's a mirror neuron for you. It's a woman. Yeah, that's when your neurons are mirroring too much. Yeah, too far.
Starting point is 01:23:09 You need to get that checked. It's too far. Yeah. Well, if you're around somebody, even, and think about this too, the next time around somebody, and you're just standing there having a conversation, if they cross their arms and moments later you cross your arms, guess what? mirror neurons. Okay, that's how it works.
Starting point is 01:23:24 You start mirroring somebody else just because that's just what we do as humans. I don't know. I describe it. That's why I keep saying Dan Tan. I mean, I never used to say Dan Tan. Dan Tan. All right, let's go Dan Tan to my next place here.
Starting point is 01:23:37 I'm going to change the subject. Let's see. There was two that stood out to me. One was the database access patterns, breaking things, and then vectors. These two stand out to me. You know,
Starting point is 01:23:49 why does this stand out to you, Matt, since you're concurring. The data, we've already seen this. We are hammering our databases now. The agents can do queries a lot faster than humans can. And they can do more complex queries and stuff. Now we're lucky, this is just the example at Grafana Labs. We're lucky because the teams, you know, that's where I work.
Starting point is 01:24:10 We are hiring. There are teams. The teams are very good. The Loki team, like, these are, like, I've never worked with engineers with that particular kind of speciality, because they, you know, they care about storage. Obviously all the data formats are bespoke that they have to invent, you know, all the indexing, all the kind of complexity that they build to make these systems work really quickly.
Starting point is 01:24:30 So they're up for the challenge, but we are hammering them. You know, we had the Drilldown apps last year that we did, where the UI is basically DDoSing them, and we've made that even worse now with Assistant. And they have to adapt and change. And they're up for the challenge, but is everyone? Are some old data techniques, or data techniques even, going to change? Are we going to start changing how we store data so that it's ready for agents, indexed differently? Vector databases, I think, play into that as well, big time.
Starting point is 01:25:02 This is where you create a vector from some kind of content, a chunk of content. You put it into some multidimensional space, and then you can query that very quickly. And it's kind of semantic, as long as it semantically means, like things that mean the same end up in the same area in this multidimensional space. So you know that this is roughly what you mean the nearest thing to it. Yeah, unbelievable. And that powers, that makes, cursor does this very well. The others have the same kind of thing. Indexing your code base like that, cursor makes it basically very quick. If you ask a question about the code base, it can answer it extremely quickly by just consulting its index that is built. It used to grep all the time,
Starting point is 01:25:43 and it would just take longer to go and grep everything to learn. And then it would end up filling its context window too much. Now it will use the vector database and get the answers right there very quickly. And you really feel that difference. And I think vector databases are going to be a massive new concern for 2026.
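To make the mechanics Matt is describing a bit more concrete, here is a toy sketch of vector search. The "embeddings" are tiny hard-coded vectors and the lookup is plain cosine similarity over a linear scan; a real setup would get embeddings from a model and use an approximate-nearest-neighbor index. The point is the one Matt makes: the expensive work happens when embeddings are generated, and the read path is just geometry.

```go
// A toy sketch of the vector-search idea described above: content becomes a
// vector, and "querying" is just geometry (here, cosine similarity) against
// stored vectors. The hard-coded vectors are fake, purely for illustration.
package main

import (
	"fmt"
	"math"
	"sort"
)

type doc struct {
	ID  string
	Vec []float64
}

// cosine returns the cosine similarity between two equal-length vectors.
func cosine(a, b []float64) float64 {
	var dot, na, nb float64
	for i := range a {
		dot += a[i] * b[i]
		na += a[i] * a[i]
		nb += b[i] * b[i]
	}
	return dot / (math.Sqrt(na) * math.Sqrt(nb))
}

// nearest ranks docs by similarity to the query vector, best first.
func nearest(query []float64, docs []doc) []doc {
	sort.Slice(docs, func(i, j int) bool {
		return cosine(query, docs[i].Vec) > cosine(query, docs[j].Vec)
	})
	return docs
}

func main() {
	docs := []doc{
		{ID: "auth middleware", Vec: []float64{0.9, 0.1, 0.0}},
		{ID: "login handler", Vec: []float64{0.8, 0.2, 0.1}},
		{ID: "chart renderer", Vec: []float64{0.1, 0.1, 0.9}},
	}
	query := []float64{0.85, 0.15, 0.05} // pretend: embedding of "how do we log users in?"
	for _, d := range nearest(query, docs) {
		fmt.Printf("%-16s %.3f\n", d.ID, cosine(query, d.Vec))
	}
}
```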
Starting point is 01:26:09 But what do you do whenever, like you said, the vector space that you operate in. Like those embeddings are created by, let's just say an algorithm, maybe even an LLM or a model or something like that. And that model gets superseded, it becomes part of your architecture to kind of keep your original embeddings, maybe, or maybe the original data sets you can re-embed quickly to get maybe even better embeddings as that vector database gets used
Starting point is 01:26:38 and there's performance enhancements, there's new technologies. What do you think about that? Yeah, I think, so the vector, embeddings are different. Yeah, so we use like, there's some open source ones that you can use. There are some other, there's also like other models that do it. But they, yeah, so I could imagine there being new vector technology, which means you then want to reindex things potentially. Of course, I think it's better, sure.
Starting point is 01:27:04 But then, but the ultimate, the in and out of it to the LLM is just his, here's a sort of search query and it returns just results that match semantically. So that interface is probably quite safe to keep. But who knows, different innovations could happen. One of the problems is like clustering to... It's like caching, right? Is it like caching? Yeah. It's like a cache.
Starting point is 01:27:23 It's an index. Yeah. It's an index that... But it's an index that you can... To look it up, you're just doing simple geometry. Like, they're quite simple functions to actually find the answers. Because you do all the work at the time you generate in the embeddings and things. takes a lot of time and process to get. But reading it is very quick. Yeah. But if you've,
Starting point is 01:27:48 if you have lots of content, the more content you put in, then this, this vector space can get crowded. That's where you end up with problems where it's just, uh, it picks things that aren't relevant. You wouldn't consider it as relevant as other things, but it's all packed too tightly. Which is when you would do a re-index or re-embed, right? You would re-vectorize. I'm not sure what the terminology is for these because I'm just touching it a little bit here. But,
Starting point is 01:28:17 Because the whole point is to create meaning and create similarity in the vector space, but not have to stay there forever. Yeah. So it might be like you would keep a vector of recent stuff. So you keep an index of the recent stuff that's what you're going to search because that's in this domain, that makes most sense. Yeah. But maybe you also have an older one
Starting point is 01:28:40 that contains past things, and you actually do multiple requests into this. What about the database? What do we do there? Just, just better indexes, more vCPUs, you know, what do you do? More RAM, dedicated machines. I think we're going to end up storing data differently, so that we store it in a format that the LLM needs, which is going to be just natural language in a lot of cases. There'll be some cases, but it's that. For what we do with the Assistant, just as another example, just because it is on my mind
Starting point is 01:29:19 a lot, is we have a small microphone. Assistant, yes, so hot right now. So hot. So hot. So hot. Well, according to a big AI company, it's one of the best implementations they've ever seen. Which model are you guys using? We use Claude Sonnet 4.5 at the moment, but we're excited about others when they get more affordable. But what we use, we use a smaller model to look at the data. So, um, we use a, so you know, it'll make a query, but we can easily fill up the context window too much. So we take the data and give it to a different model and say, we taught it how to describe a graph, basically. It's got a spike here and then it dips, or it's flat generally, and it describes it in natural language. And it's that that then gets fed back to the main agent. So the main agent, it's a bit like saying, look at that
Starting point is 01:30:04 graph for me and tell me what you see. And it might, and the agent, there's one, will say... It describes it, yeah. Describe the color green for me. Very spiky. Yeah, it'll just, yeah, it's all right, it's trees. That is interesting, how natural language has become the language of choice across these things. Like, even context is simple, just like, it describes it in words. Yeah, I like it because I can read it. Thank you. Yeah. Yeah, till I can't read it, right? That's what I mean. So that the APIs actually, like, being
Starting point is 01:30:47 We tend to have JSON API or something. But it's open to interpretation. So, right. How we store data is going to change. And did you finish your thought? Or did I interrupt you and you didn't finish? No, no. I think that's it.
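To make the pattern Matt describes a bit more concrete, here is an illustrative sketch, and only that: it is not Grafana's actual implementation. A cheap summarization step turns a raw series into a short natural-language description, and that description, not the raw points, is what goes back into the main agent's context. In the real pattern the summarize step would be a call to a smaller model; here it is plain Go.

```go
// An illustrative sketch of the shape described above (not Grafana's code):
// reduce a raw series to a short, readable description before it reaches the
// main agent, so the context window holds words instead of every data point.
package main

import "fmt"

// summarize stands in for the "small model that describes a graph" step:
// it reduces a series to a one-line, human- and agent-readable description.
func summarize(name string, points []float64) string {
	min, max, maxIdx := points[0], points[0], 0
	for i, p := range points {
		if p < min {
			min = p
		}
		if p > max {
			max, maxIdx = p, i
		}
	}
	if min > 0 && max > 3*min {
		return fmt.Sprintf("%s: mostly flat around %.0f with a spike to %.0f near point %d", name, min, max, maxIdx)
	}
	return fmt.Sprintf("%s: roughly flat between %.0f and %.0f", name, min, max)
}

func main() {
	series := []float64{101, 99, 103, 100, 512, 104, 98} // pretend query result
	desc := summarize("http_request_latency_ms", series)
	// This short string, not the raw points, is what gets fed back to the main agent.
	fmt.Println(desc)
}
```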
Starting point is 01:31:03 Yeah. I think vector databases are going to shine this year and we'll have to see innovation there. Is there any particular vector database or, package or module that you're using in like maybe the Go world that you want to give a shout out to? What's on your,
Starting point is 01:31:18 what's got your fancy? Well, we've, we tend to, so in the, we're basically building our own. Postgres does have the ability to store embeddings. Pgvector.
Starting point is 01:31:27 Yeah. But you still need to decide how to generate the vectors. And that's a separate piece that you need to figure out. And that's domain, very domain specific, right?
Starting point is 01:31:36 The better you do that, the better your search results will be. Yeah. So that's, I don't know. Yeah, we tend to, I think when we did machine box, we did the machine box startup. We had the same thing because it would do it with face detection. So it would look at the face and it had a big model that was trained on loads of faces. We trimmed off the last layer.
Starting point is 01:31:56 This is kind of a spoiler alert of how we did it. So instead of it giving you the answer of a person of the face, it gives you basically the vector. And then we have the spatial index where we go and look up who the person is, which allowed us to do one-shot learning and also you could delete and forget things because it's just editing it in an index. So we tended to do that stuff ourselves. It's not that complicated,
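For the Postgres route mentioned above, here is a rough sketch of what pgvector usage can look like from Go. The connection string, table name, and the three-dimensional vectors are placeholders, and, as Matt says, generating good embeddings is the separate, domain-specific piece this sketch does not solve.

```go
// A rough sketch of the Postgres + pgvector route mentioned above. The DSN,
// table, and hard-coded 3-dimensional vectors are placeholders only.
package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/lib/pq" // Postgres driver
)

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost/demo?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// pgvector: enable the extension and store embeddings in a vector column.
	if _, err := db.Exec(`CREATE EXTENSION IF NOT EXISTS vector`); err != nil {
		log.Fatal(err)
	}
	if _, err := db.Exec(`CREATE TABLE IF NOT EXISTS docs (id serial PRIMARY KEY, body text, embedding vector(3))`); err != nil {
		log.Fatal(err)
	}
	if _, err := db.Exec(`INSERT INTO docs (body, embedding) VALUES ($1, $2::vector)`, "hello vectors", "[0.9,0.1,0.0]"); err != nil {
		log.Fatal(err)
	}

	// Nearest-neighbor query: <-> is pgvector's L2 distance operator.
	rows, err := db.Query(`SELECT body FROM docs ORDER BY embedding <-> $1::vector LIMIT 3`, "[0.85,0.15,0.05]")
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()
	for rows.Next() {
		var body string
		if err := rows.Scan(&body); err != nil {
			log.Fatal(err)
		}
		fmt.Println(body)
	}
}
```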
Starting point is 01:32:20 but like doing it at scale and doing it, you know, nicely, redundantly and horizontal scale, all that stuff. You know, that's where we want services. But yeah, I think we'll see more of them coming out. I don't know loads of them. But yeah, pretty good. Interesting. So build your own is where you're at. At the moment, but also don't. I mean, yeah, we, we are doing, we tended to do that in the past.
Starting point is 01:32:50 But that's because we weren't sure exactly the use cases and we wanted flexibility to be able to innovate. So it's kind of worth us having our own thing. But I think once there's a, once, there should be a service, an open source thing. There probably is. I don't know. You're playing with Parquet, Parquet. How do you say that? How do you personally say that? I say parquet. Parquet, okay, cool. Do you play with Parquet at all? Like, do you maybe do, like you said.
Starting point is 01:33:15 Like running around on the streets and that? Jumping over bins and that. Yeah, run up the stairs backwards. Hardcore parquet. That's right. As you would say, Jerry, right. That's right. They do use that in Grafana Labs.
Starting point is 01:33:26 I haven't used it myself, but that is used. And I don't know. I don't think that's a spoiler or anything. I think that's known. Yeah, but no, I haven't, no real experience with that. Well, I have a prediction for 2026. Oh, yeah? It's short term, short term prediction.
Starting point is 01:33:44 I predict that Matt's going to get his guitar. Oh, my gosh. Oh, my gosh. How did you do that? Oh, my goodness. Oh, my goodness. I'm intuitive. Get your parquet on.
Starting point is 01:33:56 Okay. Okay. Heartcore. Happy with the key? I love this key. What is that? C minor? Dettman.
Starting point is 01:34:05 Where is, what's the name? Kul. The famous guy who's Donald. Donald. No, it's not Donald Canooth. Singer, geez. Singer.
Starting point is 01:34:23 Singer songwriter? Toby, Toby Keith. Charles. God, also way off. Charlie Puth. Oh, my bet. Okay. It's that Margaritaville guy.
Starting point is 01:34:36 It's that Margaritaville guy. I butchered your name. Oh, yeah? Charlie Puth, yeah. Like Neil Diamond. Pitch Perfect. Knuth, according to Google, likely refers to singer Charlie Puth.
Starting point is 01:34:48 They vectorized that because a lot of people just jacked that one up. They're like, listen, we're going to speed up this search, okay? This is pretty close in vector space. I'm going to parquet of that and then vectorize that and then boom. There you go. All right, Matt. Go ahead, Matt. Tell us what you got here.
Starting point is 01:35:01 All right. I'm cheaper than an AI now. You're going to hire me back. It turns out ChatGPT knows how to negotiate. It learned from the best lawyers and everyone on Reddit with opinions. Turns out Opus 4.5 ate your lawyers for lunch. And it was just a simple prompt, and the response that they gave to the actual judge contained emoji.
Starting point is 01:35:43 Hire me back Don't give all that money to the robots Humans need money too to give some to me, please. Yeah, we've touched the whole range of AI subjects and different musical subjects. They really bring it home, man. You brought it home.
Starting point is 01:36:15 Oh, shit, I don't know what that means. Is that good? The second line was a little iffy, but gee, you've rounded it off pretty good. I appreciate the immediate review. I like to fail fast, line two. I like it. Like a linter. It's like error on line
Starting point is 01:36:32 too. Adam the linter, musical linter. I'll give you three stars, but I'll let you decide if it's between five or ten. Well, I'll give you in your podcast two thumbs up. Oh, thank you. Out of ten. Oh. Oh.
Starting point is 01:36:46 I do want to give a shout to Tom, Tom, Tomaz, tongue. I'm sorry, dude. Thomas. Theoryvc.com. I really do appreciate this post. I didn't think we would have such a great, I guess, somewhat great,
Starting point is 01:37:05 mostly great in a song, a conversation from, not against venture capitalists, but like this is good. These are good predictions. Very well thought through. And so maybe theoryvc.com can be your friend, at least for some information.
Starting point is 01:37:20 Follow them on LinkedIn, maybe, who knows? Maybe Tom Ash would like to invest in Changelog. Jean-Jet. Jean-J. All day. Gerard. I started a new company doing the distributed VCS.
Starting point is 01:37:36 Put your source code in it or don't. I don't care. It's that kind of thing. I asked you not to do that. I'm sorry. I can't help it, but I can't even do your voice really. I need to meet you properly.
Starting point is 01:37:47 You need to meet us properly. Well, I need to meet you properly. It's never too late, man. As long as we're still both breathing, which so far. Yeah. So good.
Starting point is 01:37:56 So if you're not breathing, you don't want me to visit. It's never too late. I have to hold my breath when you get here. Well, that's because it's been a long flight. You wouldn't have to if you were visited by self-driving, Matt. You could sleep on the way over. You'd be refreshed when you get here.
Starting point is 01:38:12 That's true. Oh, it should have a shower in it as well. She'd press a button and it showers you. Listen, I like your idea of how we'll travel in the future. I'm down for some version of that, honestly. I love it. I think I would love to teleport via a trip. I don't mind the time because if I can use the time,
Starting point is 01:38:30 well, then I'm cool with the time, because I'm going to use time anyways. Oh, yeah. It's just, I got no choice, right? I got no choice. This guy uses his time. We don't just pause. He uses it. Yeah.
Starting point is 01:38:41 When given a choice, he uses it. But sleeping, sleeping and traveling, I do think is good. Like, have you ever fallen asleep on a long flight? Yeah, because what else you're doing? Exactly. Fall asleep on a long flight. Beautiful. I've done it.
Starting point is 01:38:51 I did it last time I went to the United States. We're working, man. I can slay some work over three hours and just be like, what? Three hours. Yeah. You could brought up. broadcast from it. Yeah.
Starting point is 01:39:02 You can broadcast from your car. One day. Just fake background. Yeah. Don't need to. So if we go agent first, Jared, agent first web leads us to this world. I like agent first travel. You know, let the agent travel for me and then let me know how I went.
Starting point is 01:39:16 Yeah. You can just email it. Just email the agent. It's an attachment. I just want to stay home. Matt, thanks so much for hanging out with us again, kicking off 26 the right way with you and your guitar and your accent. even though you kept dissing me. I still enjoyed it somehow.
Starting point is 01:39:35 It's like I'm a masochist. Yeah. Or I wasn't dissing you. Yeah. But I've had a great time. I've had a fantastic time. Welcome to 2026, everybody. What are we going to do?
Starting point is 01:39:47 Let's do something good. There's a lot of kind of trouble going on. And, you know, but there's a lot of us still, we've got time on our hands. Let's do some good stuff. Yeah. Yeah. Yeah. That's how remedy to it.
Starting point is 01:39:59 Use your time. Use your time. Use your time. Thanks Matt. It's awesome seeing you. Bye friends. Bye friends. And Matt.
Starting point is 01:40:11 All right. That's Changelog & Friends for you. And that's Mat Ryer for you. He's truly one of a kind. Oh, and we do have a little Changelog++ bonus at the end of this one in which Mat tries and fails to give us a Grafana Assistant demo. No, 100%. How do I share? I'm clicking the share button and nothing is happening.
Starting point is 01:40:29 Thanks again to our partners at Fly.io, to our sponsors of this episode, namespace.so, notion.com slash changelog, and tigerdata.com. Thanks also to BMC for the continuous stream of dope beats and to you for listening. We appreciate you.
Starting point is 01:40:44 That's all for now, but let's get back together and talk again real soon. Can you share more about the Grafana assistant architecture? He would love to. Yeah.
Starting point is 01:41:16 How much can you share and what can you share? I'm really curious. I'll tell you anything. Yeah. Ask him anything. I'll get fired for this show. ChangeLog Plus Plus. It's better.
