The Changelog: Software Development, Open Source - Clawdbot triggers a run on Mac Minis (News)

Episode Date: January 26, 2026

Clawdbot drives Mac Mini sales, Swizec Teller on the future of software engineering being SRE, Daniel Stenberg decided to end curl's bug bounty program, zerobrew takes some of the best ideas from uv and applies them to Homebrew, and Phil Eaton on LLMs and your career.

Transcript
Starting point is 00:00:00 What up, nerds? I'm Jared, and this is Changelog News for the week of Monday, January 26th, 2026. This year's Northern Lights show went on tour and has even been visible as far south as California. We got a peek at it from our back deck somewhere in middle America, but our experience paled in comparison to airline pilot Matt Milnick, who captured some astounding shots from 37,000 feet on his route from Calgary to London. Links to Matt's photos are in the newsletter. Okay, let's get into this week's news. Clawdbot triggers a run on Mac minis. Clawdbot, an open source personal AI assistant that runs on your own hardware, has many developers excited. Quote,
Starting point is 00:00:52 developers aren't just impressed. They're calling it an iPhone moment, comparing it to early AGI and in some cases, letting it run their entire companies. End quote. What's all the excitement about? Quote, given the right permissions, Clawdbot can browse the web, execute terminal commands, write and run scripts, manage your email, check your calendar, and interact with any software on your machine. Perhaps the most compelling feature is that Clawdbot is self-improving. Tell it you want a new
Starting point is 00:01:19 capability and it can often write its own skill to make it happen. End quote. Yes, but can it blend? Okay, okay, where do all the Mac minis enter the story? Quote, while Clawdbot can run on any computer, Mac minis have emerged as the preferred choice and for good reasons that go beyond Apple fandom. Apple Silicon's unified memory architecture is exceptionally efficient for AI workloads.
Starting point is 00:01:41 Instead of the CPU and GPU communicating over a slower connection, the memory sits directly on the chip package. This means the full memory bandwidth is instantly available to AI models, making local inference significantly faster than on traditional x86 systems with equivalent specs, end quote. Click through for more hardware options, accounts of wild things people are doing with this, and a word of caution to those going all in.
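If you want a concrete picture of local inference riding that unified memory, here's a minimal sketch, assuming PyTorch with its MPS backend is installed; the model and sizes are made up, and it's not Clawdbot's code, just the general idea:

```python
# Illustrative only: a toy workload on Apple Silicon's unified memory via
# PyTorch's MPS backend. Sizes and names are made up; this is not Clawdbot code.
import torch

# Prefer the Apple GPU when available; otherwise fall back to the CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

# One big matrix multiply, the kind of op that dominates LLM inference.
weights = torch.randn(4096, 4096, device=device)  # allocated in unified memory
tokens = torch.randn(1, 4096, device=device)

with torch.no_grad():
    logits = tokens @ weights  # no separate host<->GPU copy on Apple Silicon

print(device, logits.shape)
```

On a machine without Apple Silicon it simply falls back to the CPU, which is exactly the comparison the quote is drawing.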
Starting point is 00:02:10 The future of software engineering is SRE. Here's Swizec Teller. Quote, when code gets cheap, operational excellence wins. Anyone can build a greenfield demo, but it takes engineering to run a service, end quote. Hard to disagree with that, but I'd augment it by adding that running services is also getting easier at the same time. Still, he has a great point. One compounding factor in the agentic AI hype is that it opens the fun part of software development to a whole new set of people. Exciting stuff? Yes. Sustainable software? Hardly. Quote, good software is invisible. And that takes work, a lot of work, because the first 90% to get a working demo is easy. It's the other 190% that matters.
Starting point is 00:02:51 The end of the curl bug bounty. Here's curl creator Daniel Stenberg. Quote, there is no longer a curl bug bounty program, it officially stops on January 31st, 2026, end quote. Despite some successes along the way, with 87 confirmed vulns and over 100,000 USD paid out, curl's bug bounty program unfortunately fell prey to AI slopsters trying to make an easy buck. That's not the only factor, though. Daniel says three bad trends combined to make them take this step. Quote, the mind-numbing AI slop, humans doing worse than ever,
Starting point is 00:03:24 and the apparent will to poke holes rather than to help. End quote. In other sad news, last night, curl ended its historic NFL playoff run too, as the Seattle Seahawks beat the LA Rams. It's now time for sponsored news. The top nine Postgres extensions that Tiger Data customers use. You're running Postgres, Pinecone, and TimescaleDB. What if I told you that's two databases too many?
Starting point is 00:03:50 Tiger Data analyzed tens of thousands of databases to find the nine extensions their customers actually use, and the results are a love letter to Postgres extensibility. The top extensions their customers actually use are: TimescaleDB for time series, pgvector and pgvectorscale for embeddings, PostGIS for geo, and pgai for calling LLMs directly from SQL. You want some data? Well, pgvectorscale benchmarks at 28x lower latency and 16x higher throughput than Pinecone, at 75% less cost, and with TimescaleDB, your 1 terabyte of time series data compresses down
Starting point is 00:04:24 to 100 gigabytes. Instead of duct-taping three specialized databases together, you can extend the one you already know. No new query language, no data sync nightmares, no vendor lock-in. This opens the door for teams to stop context switching between databases and start shipping features. Learn more about the top nine Postgres extensions and what TimescaleDB has to offer at tigerdata.com/blog.
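To make the no-new-query-language point concrete, here's a small sketch, assuming psycopg (v3) and a Postgres instance where the extensions are available; the connection string, table, and data are placeholders, not Tiger Data's schema:

```python
# Illustrative only: pgvector and TimescaleDB living in one Postgres database.
# Assumes psycopg (v3) and a local instance with the extensions available;
# the connection string, table, and data below are placeholders.
import psycopg

with psycopg.connect("dbname=app") as conn:
    conn.execute("CREATE EXTENSION IF NOT EXISTS vector")       # pgvector
    conn.execute("CREATE EXTENSION IF NOT EXISTS timescaledb")  # TimescaleDB

    conn.execute(
        """
        CREATE TABLE IF NOT EXISTS docs (
            id bigserial PRIMARY KEY,
            body text,
            embedding vector(3)
        )
        """
    )

    # Nearest-neighbor search is still just SQL, using pgvector's distance operator.
    rows = conn.execute(
        "SELECT id, body FROM docs ORDER BY embedding <-> %s::vector LIMIT 5",
        ("[0.1, 0.2, 0.3]",),
    ).fetchall()
    print(rows)
```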
Starting point is 00:05:00 Zerobrew applies uv's model to Mac packages. Zerobrew takes some of the best ideas from uv and applies them to Homebrew. Quote, packages live in a content-addressable store, so reinstalls are instant. Downloads, extraction, and linking run in parallel with aggressive HTTP caching. It pulls from the Homebrew CDN, so you can swap brew for zb with your existing commands. This leads to dramatic speedups, up to 5x cold and 20x warm, end quote. This is all quite experimental at the moment, but it appears to be picking up steam.
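If content-addressable stores are new to you, here's a rough Python sketch of the idea, with hypothetical paths and helper names rather than zerobrew's actual code: a blob is stored once under its content hash, so installing the same thing twice is just re-linking.

```python
# Rough sketch of a content-addressable store, in the spirit of uv/zerobrew.
# Paths and helper names are hypothetical; this is not zerobrew's implementation.
import hashlib
import os
from pathlib import Path

STORE = Path.home() / ".demo-store"


def store_blob(data: bytes) -> Path:
    """Save data under its content hash; a no-op if that hash is already stored."""
    digest = hashlib.sha256(data).hexdigest()
    dest = STORE / digest
    if not dest.exists():  # same content -> same path -> nothing to re-fetch
        STORE.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(data)
    return dest


def install(data: bytes, link_name: Path) -> None:
    """'Install' by pointing a symlink at the stored blob, so reinstalls are instant."""
    blob = store_blob(data)
    if link_name.is_symlink() or link_name.exists():
        link_name.unlink()
    os.symlink(blob, link_name)


if __name__ == "__main__":
    install(b"fake package contents", Path("demo-pkg"))
    install(b"fake package contents", Path("demo-pkg"))  # second run: no copy, just relink
```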
Starting point is 00:05:18 I also appreciate the author's approach to LLMs, quote, I spent a lot of time thinking through this architecture, testing, and debugging. I also used Claude Opus 4.5 to write much of the code here. I'm a big believer in language models for coding, especially when they are given a precise spec and work with human input. End quote. LLMs and your career. Here's Phil Eaton.
Starting point is 00:05:39 Quote, the jobs that were dependent on fundamentals of software aren't going to stop being dependent on fundamentals of software. And if more non-developers are using LLMs, it's going to mean all the more stress on tools and applications and systems that rely on fundamentals of software. All of this is to say that if you like doing software development, I don't think interesting software development jobs are going to go away.
Starting point is 00:06:00 So keep learning and keep building compilers and databases and operating systems and keep looking for companies that have compiler and database and operating system products or companies with other sorts of interesting problems where fundamentals matter due to their scale. End quote. That's the news for now. But go and subscribe to the Changelog newsletter for the full scoop of links worth clicking on, such as running Claude Code dangerously, safely, things I've learned in my 10 years as an engineering manager,
Starting point is 00:06:28 and after two years of vibe coding, I'm back to writing by hand. Get in on the newsletter at changelog.news. Have yourself a great week. Like, subscribe, and leave us a five-star review if you like the show. And I'll talk to you again real soon.
