The Infra Pod - Infra Pod 2025: Our Favorite Moments, Hottest Takes, and What’s Next

Episode Date: December 29, 2025

Join Tim from Essence VC and Ian Livingston from Keycard for the year-end 2025 recap of Infra Pod! In this special episode, Tim and Ian reflect on their favorite moments, hottest takes, and biggest lessons from a year of rapid change in infrastructure, AI, and agent technology. They revisit standout episodes—like deep dives into browser automation, the evolving role of memory in LLMs, and the disruptive potential of agent sandboxes. The hosts discuss how companies are pivoting in the AI era, the importance of adapting quickly, and the surprising ways hardware choices are shaping the future of compute. Looking ahead, Tim and Ian share bold predictions for 2026, debate the next big abstractions in infrastructure, and invite listeners to share their own hot takes and favorite episodes. Whether you’re an engineer, founder, or just passionate about the future of tech, this episode is packed with insights, energy, and a look at what’s next for the Infra Pod community.

Transcript
Starting point is 00:00:00 Welcome to Infra Pod. This is Tim from Essence VC and Ian, let's go. Hey, this is Ian Livingston, CEO, co-founder, and lover of agentic identity, trying to make it easy for you to adopt and build safe MCP apps and tools with Keycard. Couldn't be more excited for this episode, Tim. It's just the two of us. There are no guests to introduce this time. Yes.
Starting point is 00:00:28 It's the end-of-year recap. You know, our tradition. I know. I know. Yeah. This has been fun. And as you know, we've always been more casual-style chatters. And we just want to go through with all of you what happened this year. I'm sure most of you listen to our episodes, and we'd actually love to get your takes, anybody listening.
Starting point is 00:00:50 What are your favorites? We're going to share ours in this pod. But, you know, afterwards, I'd actually love to hear all of yours. So, Ian and I will go over some of the interesting things we took away, right, and share our learnings and roll with it. Awesome. Couldn't be more excited. You know, also, the one thing I'll say, this is our third recap episode.
Starting point is 00:01:11 So that means we've been doing this for three years, Tim, which is pretty incredible. We're that old. They say when you get past 20 episodes, you're a thing, and I don't know what we're at. We must be at, like, 50 or something, 60. I don't know. Anyways, we're quite far into this, but yeah. I mean, this year alone is 23 episodes, right?
Starting point is 00:01:28 So, yeah, we're rolling like nothing else. Wow. But that's why you're having fun. You don't even remember all the things that happened in the middle, you know? Exactly. So, cool. I think the first thing we want to go through is: what is our favorite episode of the year? So, Ian, I'll let you go first, man.
Starting point is 00:01:45 Hands down. Actually, it's tied. It's a tie. I don't have one favorite. I have two. And one of them was Paul from Browserbase, and really trying to understand, like, why browsers were important for this sort of next agentic wave. I mean, Paul's very, you know, first and foremost, a great talker in our circles. We might say a great yapper, but also just, like, understanding the why, and why that was, like, you know, old browser automation, old problem, but new technology, new opportunity, new need, completely changes the game end to end.
Starting point is 00:02:18 And then I'd say the other one that it's tied with, for me, is with Charles Packer, when he talked about memory. Ah, from Letta. Yeah, exactly. And that was really interesting. And I've been thinking a lot about the consequences of memory ever since, right? Which is, holy crap. We had this, like, model with data. And we've always had this sort of property around data, which is, like, the moment
Starting point is 00:02:37 the data leaves the server or goes to the client, it's kind of, like, you know, no longer under your control. But now we have this thing that can aggregate and search and summarize and conclude, and these facts and these pieces of information get intertwined directly into the app and the experience. They directly impact the experience people have. And it's this very interesting problem that is at the core, going to be at the core, of, like, all agentic interactions.
Starting point is 00:03:00 And so I think those two are, like, my most favorite episodes of the year. I mean, what were yours? Yeah. And what's amazing for those is they kind of really, like, change your thinking a little bit. And for me, probably, like I told you earlier, the one I can remember, the most recent one that kind of, like, changed my mind, not completely, but kind of reminded me, is Ivan from Daytona, you know, I think that one, where we talk about, like, agents will take over, like, Amazon, right?
Starting point is 00:03:28 And it made me really realize and think, like, obviously, we know agents are going to run a lot more of them. And we're so early in the agent space and adoption, so it didn't feel like it's going to take over. But with the pace it's happening at, actually, it starts to make you realize more. Like, it can actually take over more than humans quickly, and our space is going to completely change.
Starting point is 00:03:49 And when I thought of a sandbox, it can definitely feel like more of, like, a feature. But when you really, really, like, zoom out, you start to realize, like, the universe is so large with agents. There's going to be a really big shift in how software needs to be delivered. And so I really like the way he started off, like, basically, Amazon and the hyperscalers are not going to be here. He has to believe that. But just putting it out there, it definitely makes it very interesting as a discussion, because you start to actually believe it could happen, right?
Starting point is 00:04:19 I never thought Amazon would be, like, eaten alive, because they've done so many things at the bottom layer. But I do think, you know, being in our positions, we're seeing how, no matter how much work you've done in a software infrastructure way, there is a way to disrupt this. And it doesn't really take decades. You know, it can actually be pretty quick to actually disrupt the layers beneath you. So I like that episode because it made me think much bigger about what a sandbox is, right? And why will people just kind of run these and use them everywhere besides just coding, right? And so that kind of, like, made me really, like, have such a fun discussion. And he's a great guy to talk to, you know, he's got all
Starting point is 00:05:03 this energy and everything, so, like, it just makes it really more fun to do it. Yeah, yeah, I love that. I love that episode too. You know, I think across the board, a takeaway from that is just, like, we have all these interesting new agentic infra things that are going on that are, like, old infra problems, but, like, now are really interesting again, right? Like, cloud sandboxes to, like, sandboxes for agents, that's a pivot. I mean, that's where Daytona started, and where they ended up is, like, way more interesting, relevant, and important as a result. But it was, like, best practice, requirement, even realized the value.
Starting point is 00:05:33 Yeah, and just maybe one tidbit. I feel like there's a lot of companies that started in the pre-AI days, right? Started with their products and kind of started with things for just pure developers. But Daytona was, like, such an interesting example, because they not only pivoted, they also kind of had to change their way of thinking,
Starting point is 00:05:49 because when you're actually building this for agents, even though developers are building the agents, the primitives and isolation and everything needed for agents are actually drastically different in cloud sandboxes. And so pivoting is never easy, because a lot of people are just like, just change our website, change our product, and just say agents.
Starting point is 00:06:06 But you can't just do that, right? You have to really be able to adapt quickly. And adapting quickly in the AI world is, like, the necessary skill right now. So it's really amazing. And I think, like, talking in depth about the details of
Starting point is 00:06:21 what is required to build an agent sandbox and all that stuff is really interesting. All right. I think favorites are a little hard when it comes to episodes, but we want to go even further. Ian, what's your favorite hot take? Because, you know, remember, everybody, we ask everybody that's on our pod, what's your hot take? You know, and we've seen some good ones, some okay ones, but what is your favorite?
Starting point is 00:06:47 I mean, I think my most favorite of them all is that memory is going to be more valuable than the LLM. And I think it's true, right? And if you think about it, like, as memory gets embedded into agents and agentic apps, that memory is going to contain so many organizational facts and history; it's going to basically be the memory of the organization. And that is, at the end of the day, your operating system, your decision-making framework, your historical legacy. And I think about that when we were talking to Charles at Letta, and I think about it, I was like, you know, that is actually incredibly spicy. And I think it's important because it demonstrates what is actually going to be important five to ten years from now as a result of what's happening with agents.
Starting point is 00:07:30 Yeah. When he said that, it was actually really interesting, because, you know, even though I'm an investor in Letta and we believe in it, it wasn't really even worded that way, that memory is going to be more valuable than the LLMs themselves. But I think we're seeing that be more and more true now. Memory is actually one of the biggest unlocks when it comes to, like, the quality of LLMs in general. And we don't even have a good abstraction yet. Like, these markdown files, I think, are nowhere close to being, like, the de facto thing for memory. So that's, really, one of my favorites as well.
Starting point is 00:08:02 My favorite hot take this year, you know, just to be a little bit different: I actually thought the whole gaming-PCs-used-for-CI thing from Blacksmith was really interesting. I actually didn't realize, before talking to them, that they're using gaming hardware and Hetzner and just running this in a more efficient way. It made sense in the end, but that choice alone was so interesting because it's not common. And what I like about it is, I think when everybody's so focused on the AI side of the world, they went even deeper into the hardware side. Like, we've got to have AI abstractions, we've got to make sure, like, we can run your sandbox or, like, your CI in a much more efficient way for
Starting point is 00:08:51 AI and agents. But instead of just trying to make the AI interfaces, or have, like, MCP or any LLM type of integrations, they went down deeper and looked at the hardware they're using and all the abstractions down there. Like, we have to rebuild Amazon ourselves at some level, because they don't run your infrastructure in a very convenient way, but that gives us the tradeoff of actually much more efficient compute. Great. And that whole thing that he did was fascinating, because I don't think that's a normal thing to do.
Starting point is 00:09:24 And it also reminds me of Railway a little bit, coming back to it. We chatted with them this year, because Jake also made a conscious choice of running their own hardware as well, and trying to bundle that design with a small team that's able to actually look at their own hardware servers and design it from scratch.
Starting point is 00:09:40 And as an engineer, I always appreciate people doing that, you know? Like, people are actually willing to go down to the guts: I'm going to run my own hardware, I'm going to optimize everything and find a way to make it much more efficient than ever, and take on that complexity. And so that's what I like about it.
Starting point is 00:09:57 I think, like, that was something I wasn't even thinking about: gaming PCs run CI and suddenly become, like, a thing that's able to save a huge cost. So that was probably my favorite hot take. I think it's just true of, like, a trend, right? I think one of the things we said is, like, maybe AWS is over.
Starting point is 00:10:14 That'd be a crazy hot take, right? Like, modernizing AWS or GCP or Azure to be, like, agent-native can be really hard. So you have all these, like, neoclouds, a good example, I don't know if you can even call it a neocloud anymore, but, I mean, a very large company at this point. But they are building and deploying their own infrastructure, and they're employing a capital flywheel to, like, fund that infrastructure creation. And they're driving a net new buildout. And over time, as they get the peering and as they get the infrastructure built out, they're going to be able to compete with these hyperscalers. That's something, you know, I think five years ago we all would have thought was, like, impossible, right? And so, I think, you know, your margin is my opportunity is basically what these folks are saying to AWS, along with your bad DX is our opportunity. So if you think of all of these neoclouds, or Vercel or Cloudflare, not all of them are building their own infrastructure, but most are, and they're basically looking for margin opportunities and experience opportunities to gain leverage on basically what we would all call,
Starting point is 00:11:13 in economic circles, pent-up or latent demand. And the best thing that could happen to the cloud world is a broadening out of a lot of these concepts. Yeah, it's so cool. It's so cool to see that everybody's actually making different choices, even if we're all in the AI bubble, right? Let's talk about our last thing here. What's your prediction for next year?
Starting point is 00:11:33 You know, this is the hypest of hype we've ever seen, I think, probably in our careers. You know, we've worked through some other hypes before, but I don't know. I feel like with big data, cloud, or even crypto, it's like, there's always, like, a group of people who believe and a bunch of people who speculate, but at this point, like, AI is so insanely crazy. So what is your prediction of what will happen next year? Yeah, it's a really good question. It's a really good question. And so, I mean, I think macro-economically we'll probably have, near the back half of next year, maybe a little bit of a slump. Like, at some point, you know, with all this experimental budget, we're going to say, here's what
Starting point is 00:12:07 works and here's what doesn't. And we're still very much in that, like, phase, which is, you have the hype cycle, which is, like, what works and on what timeline can we make it work, right? We don't really know yet. Agentic coding is a good example, right? It went from, you know, GitHub Copilot, which is fancier autocomplete, to the point now where we're at agents that, you know, can actually be very standalone, take on certain sizes and types of tasks with the proper prep, and actually autonomously do it. And so, like, agentic coding is clearly a use case where we're seeing huge amounts of ROI and huge amounts of investment. We're also seeing, like,
Starting point is 00:12:45 lots of interesting things, like ChatGPT, both, like, consumer but also in the enterprise. So, like, what we need to see is a broadening out of use cases that work, and we're starting to see them. I mean, certainly, like, audio and video, it's pretty incredible what's going on with all of the image gen and all the video interpretation, the video generation. You know, some of it ends up on YouTube and gets called AI slop or whatever. But at the same time, it's driving huge amounts of productive energy and productive ROI for companies of all sizes across the board. So, like, yeah, it's hypey, but, like, if you look at it, it's real at the same time. And so it's a question of, you know, this isn't a bubble built on nothing. It is certainly a
Starting point is 00:13:28 hype train, and, you know, valuations are certainly elevated. But the tech is real, right? And the promise is real, too; it's just a question of on what time horizon. And I just don't think we have a full answer to that question yet. Yeah, yeah, we don't. I wish I did, but yeah, we don't have any answer for that. Yeah, my prediction for next year: I feel like the market doesn't truly know what we all want yet, right? And so, you know, I live in a world where, like, I'm talking to engineers and product people, and at some level I also talk to the capital and VC world, and this world is all crazy right now.
Starting point is 00:14:06 So my prediction, I think, at least on the infra side: I do feel like we need to actually understand how to really figure out what the right abstractions are that can actually stay, which is very difficult. Because right now, I think AI is, like I said, so much experimental budget and experimental usage. But the impact of it is so real, which means we've got to, like, standardize a lot of operations here. Like, we actually do need to know what we should invest in. What should we do?
Starting point is 00:14:40 Should we run our own LLM? Should we just trust all these things? Should we build our own agentic interfaces? Should we invest in the frameworks? Should we use this, should we not do that? I don't think infra in today's world is something people can kind of figure out over time, like just take your time and figure it out,
Starting point is 00:14:52 because you need the AI advantage quickly, and we don't know what AI is even going to do, and all the infrastructure layers are completely in flux right now. It's not even clear what things to do or not do.
Starting point is 00:15:08 I like how Anurag from Render kind of talked about the taste side of things. Because I don't think there's ever been, like, really a true scientific way to find, you know, where the industry really goes. I feel like every trend line that everybody ends up following is somebody setting up a standard of what best practice kind of looks like: how Google set up SRE back in the day, or Scrum and Agile, or, you know, even Docker back in
Starting point is 00:15:48 the day, right? These tools are all existing primitives, but they bundled an abstraction that really took over the world. So I feel like, you know, next year, my prediction is we might find a completely new abstraction that we've never even seen or heard of yet that just takes over. Because of how many companies grow so fast so quickly. You know, I don't know how many vibe coding platforms just spun up last quarter and suddenly are growing to, like, number one on OpenRouter or something like that. That chart makes no difference these days. Like, it's almost like the top 30 Billboard chart in music, and the music is, like, churning, like, every quarter. So I think the LLM qualities are kind of, like, all over the place.
Starting point is 00:16:20 There's going to be new players, a new dethroning open-source model. There's going to be a new dethroning: you know, Gemini gets a little better, this one gets a little better. So I think at this point, with infrastructure, I feel like we still have a lot of room to experiment, even next year. Where, like, can coding change? Can programming, building products, change over time? Do we actually run a lot of agents? And does an agent just do one thing? Or how do humans interact with agents moving forward? Somebody can set that standard here. Because I think, like, legacy companies can't really change their way of doing
Starting point is 00:16:57 things. The current new startups have a bit more of a chance to do it. But since the space is moving so fast, I can probably argue, like, the newer set of players that have been starting up even more recently probably have an even better chance to experiment with what is a new way
Starting point is 00:17:13 to actually use an agentic way to run everything, because whoever figures that out much more efficiently, as a whole company, will have a massive advantage like we've never seen before. But we're still in this
Starting point is 00:17:24 like, trial-and-error era, where, like, you know, let's just make our coding a little faster, right? Everything else is the same. And newer startups today are like,
Starting point is 00:17:32 okay, coding is a little faster, but everything else is also a little faster. But I think where we're going with this world is, it has to be a much more AI-native way of doing business, and we don't know what that looks like. And so I think infrastructure will kind of change with it: that, hey, eventually, how everybody designs, works with customers, you know, finds ways to sell, everything is completely done in a different way. I think infrastructure evolves with it. Because infrastructure is really hard to build. We don't know what the user's needs are.
Starting point is 00:18:00 You can't just go invent your applications on top, right? So we need that to be bundled. And so I do think, like, next year, maybe we'll find another crazy startup that's, like, 10 people going to another trillion-dollar revenue or whatever, or something crazy again. But it's not just done, not just, you know, king-made in a VC world. It's actually able to grow because we're fundamentally not running the same stuff as before. And I think that's kind of the exciting part: infrastructure to me is no longer just compute and storage. It's actually how you realize that whole thing
Starting point is 00:18:38 on top for your whole company and whole product line to actually work. And I feel like we don't know what the answer is. You know, we might find some big surprise next year is my guess. Yeah, I think that's a really good take. I think about it a lot. You know, the people who start a company a year from now are going to have no existing foundation, so they can start with the best of breed. And you even see that with companies that started a year ago or two years ago, at the very beginning. As all these things have come out and you have all these new affordances like MCP or whatever, they weren't able to take advantage of it. And then they were skeptical, and it's hard to prioritize on the roadmap. And on the flip side, you have, you know, their internal infrastructure staying the same, right?
Starting point is 00:19:12 So it's just the most insane acceleration curve I've ever witnessed. And we are truly living, like, life on an exponential here, driven by the fundamental, foundational capability of these models. And the other component of it is, like, the cost curve is coming down, right? We used to spend a lot of time talking about the cost, or the runtime, or the delay, and the improvements we've had in just foundational infrastructure to make running inference cheaper. And that's going to continue to happen. And then, like, reducing the delay to time-to-result, and all the different techniques, have really driven, like, a huge, huge impact on sort of, like, the fan-out of what these things can do. And then I think the other thing is, like, we're not even there yet.
Starting point is 00:19:57 But, like, we are starting to see model specialization, right? Like, Cursor has a specialized model for parts of the Cursor app. That is pretty amazing. You know, that is not something that we talked about. Maybe last year, a lot of people would be like, oh, it's just going to be these foundation models, nothing else. You don't need fine-tuning. You don't need transfer learning. But what are we learning?
Starting point is 00:20:14 Well, actually, all of that stuff does matter. But you have to get a use case that works first before it's worth, like, investing in fine-tuning and figuring out where fine-tuning or these transfer-learning techniques or any other things can be applied, and where you can start to do specialization. And if you think about any economic or industrial wave, this is exactly what happened when we went to the factory. You know, we had an assembly line of people at a conveyor belt. And over time, you replaced those people with specialized machines and gave people specialized jobs. And over time, things became more automated, more abstracted. And we had higher and higher output. The same thing in farming, right?
Starting point is 00:20:45 And so we're very early into what this all looks like. But I would say, like, if you aren't bullish now and you're skeptical, I think you have to look over the last two or three years of progress, and really, it's hard to come up with and say, oh, we can't, like, these problems are unsolvable. There are some categories of problems here that are very hard to solve. So, like, prompt injection and security: that is an incredibly difficult problem to solve, right? You have a non-deterministic thing, and you have unvalidated inputs and outputs, and you have this, like, system that's combining all these different unvalidated inputs and outputs and non-determinism with a reasoning engine that is actually
Starting point is 00:21:23 capable of making some level of decision. That's a hard problem to solve. So things like that will probably take longer to solve than actually making the models good enough at that scale. Yeah, yeah. And I think this is where infrastructure, like, depending on, like, if you're listening here and you're working as an engineer, working on products, using code or Cursor or anything like that every day, versus maybe a founder just building an infrastructure-layer business, I think we're in the craziest time ever. So we'd love to hear your hot take, you know. Maybe you've listened to our episodes. What's your favorite episode, Mr. or Mrs. Listener here? And whatever your hot take is, and your prediction, we'd love to hear it. So please go on whatever social media you're on, LinkedIn, X, whatever, and tag us.
Starting point is 00:22:09 We'd love to hear yours, and whenever we see a hot take of yours, we'll definitely share it on our socials. But yeah, I think next year is going to be wild. We're here to interview a lot more guests. We actually have a probably pretty fun group of people we're already in touch with. So definitely, if you have any suggestions, anything like that, let us know. We're stoked, man. This is a fun thing to do.
Starting point is 00:22:33 These are fun conversations we've had this year. And I really feel like it's all pretty worthwhile. We've had so many good conversations, including this one. We've had so many great people. And I think Tim and I do this because we love to do it. We love to learn. We love to talk about it. And we're so happy to have all of you listen to us yap.
Starting point is 00:22:50 And we will continue to yap into 2026 and beyond. Yeah. Well, I guess that's a wrap for our year-end recap episode here. You know, let's peace out 2025, let AGI get a little closer next year, and I hope everybody stays safe in the meantime.
