The Changelog: Software Development, Open Source - The smell of vibe coding (News)

Episode Date: August 4, 2025

Alex Kondov knows when you've been vibe coding (he can smell it), our friends at Charm release a Go-based AI coding agent as a TUI, Jan Kammerath disassembled the "hacked" Tea app's Android binary, Alex Elman made a website that provides up-to-date pricing info for major LLM APIs, and Steph Ango suggests remote teams have "ramblings" channels.

Transcript
Starting point is 00:00:00 What's up, nerds? I'm Jared and this is Changelog News for the week of Monday, August 4th, 2025. Can I ask you a personal question? Oops, I just did. When was the last time you had a good cry? You know, there's evidence that crying can relieve stress for a whole week. That's why Johnny Moroni made a website that makes you cry by showing tear-inducing videos.
Starting point is 00:00:32 Go ahead, click to feel something. All dried out? Okay, let's get into the news. The smell of vibe coding. Alex Kondov knows when you've been vibe coding; he can smell it. Quote, no one would write an HTTP fetching implementation covering all edge cases when we have a data fetching
Starting point is 00:00:52 library in the project that already does that. No one would implement a bunch of utility functions that we already have in a different module. No one would change a global configuration when there's a mechanism to do it on a module level. No one would write a class when we're using a functional approach everywhere." See, he's on to you.
Starting point is 00:01:11 Ultimately, Alex doesn't care about how the code got into your IDE. He just wants you to care, to care about quality, to care about consistency, to care about the long-term effects of your work. And if you care, don't leave a code base's maintainability to the weights of a model.
Starting point is 00:01:27 A glam AI coding agent for your terminal. Our friends at Charm are at it again. This time they've built a Go-based AI coding agent as a TUI. It's called Crush and it's multi-model so you can choose from a wide range of LLMs or add your own. It's flexible so you can switch LLMs mid-session while preserving context.
Starting point is 00:01:48 It's session-based so you can maintain multiple work sessions and contexts per project, and it's extensible so you can add capabilities via MCPs. Disassembling the Tea App Hack. There's been a lot of speculation and a whole lot of jokes that Tea, an app used by women to dish about men on dating apps, was vibe coded because of just how epically and easily
Starting point is 00:02:12 its data got leaked. Jan Kammerath disassembled the Android app's binary and concludes: not so. Quote, my assumption after this initial forensic analysis is that this app was built by a single inexperienced developer or by a team dictated by a single inexperienced developer. The app was likely not vibe-coded as none of the models of the past months would have made such obvious mistakes."
Starting point is 00:02:37 Ouch. In other words, vibe coding would have produced better results. This hack should have never happened. The Tea app is AI slop. It's gross negligence from a likely single developer with very little experience who should not have been allowed to publish such an application without supervision.
Starting point is 00:02:57 The app didn't get hacked. It willingly published sensitive, personally identifiable information to the world. It's now time for sponsored news. Observability for your GitHub Actions. If you've ever stared at a failing GitHub Action wondering what on earth just happened, you are not alone. CI logs are often a black box, and not the fun kind.
Starting point is 00:03:19 That's why Depot just launched GitHub Job Details, a new observability layer for your CI/CD pipeline. It lets you zoom in on what each GitHub Actions job is actually doing: real build times, dependency fetches, compute performance and more. No more guessing, no more click-fests through the raw logs. Your team gets better visibility into bottlenecks, misconfigured jobs or flaky performance
Starting point is 00:03:42 without rerunning things a dozen times. It's observability built for CI, not just another dashboard. Check it out at depot.dev, and thank you to our friends at Depot for sponsoring Changelog News. Price per token. Is your LLM spending budget on the rise? If so, Alex Elman has just the website for you.
Starting point is 00:04:03 Price per token provides up-to-date pricing info for major LLM APIs, including OpenAI, Anthropic, Google, and more. It pulls data from openrouter.ai and even lets you estimate the cost of executing the same prompt across different models. Plus, they just added image generation comparisons. In the future, I assume every AI tool
Starting point is 00:04:25 will have a smart routing layer up front that dissects our prompts, determines the sweet-spot provider of just enough quality at the cheapest possible price, and dynamically routes the request on our behalf. In the meantime, tools like this one from Alex could save you a bundle. If you're remote, ramble.
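The per-prompt estimate a tool like Price per token performs is simple arithmetic over published rates. Here's a minimal sketch, assuming hypothetical model names and placeholder prices (the real site pulls live rates from openrouter.ai); the `cheapest` helper hints at the kind of naive price-based routing imagined above:

```python
# Rough sketch of per-prompt cost estimation across models.
# Model names and prices below are illustrative placeholders,
# NOT real rates; real tools pull live pricing data.

# USD per 1M tokens: (input rate, output rate) -- hypothetical figures
PRICES = {
    "model-a": (3.00, 15.00),
    "model-b": (0.15, 0.60),
    "model-c": (1.25, 10.00),
}

def estimate_cost(model, input_tokens, output_tokens):
    """Estimate the USD cost of one prompt/response pair."""
    in_rate, out_rate = PRICES[model]
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

def cheapest(input_tokens, output_tokens):
    """Naive price-only routing: pick the cheapest model for this prompt."""
    return min(PRICES, key=lambda m: estimate_cost(m, input_tokens, output_tokens))

if __name__ == "__main__":
    # Compare the same 2,000-token prompt / 500-token response everywhere.
    for model in PRICES:
        print(f"{model}: ${estimate_cost(model, 2_000, 500):.6f}")
    print("route to:", cheapest(2_000, 500))
```

A real router would of course weigh quality and latency alongside price; this only captures the cost axis.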
Starting point is 00:04:43 Steph Ango of Obsidian fame makes a suggestion for remote teams of two to ten people like his. Quote, create a personal ramblings channel for each teammate in your team's chat app of choice. Ramblings channels let everyone share what's on their mind without cluttering group channels. Think of them as personal journals or microblogs inside your team's chat app. A lightweight way to add ambient social cohesion." These channels should be at the
Starting point is 00:05:11 bottom of the list, muted by default, with no expectation that anybody ever reads them. The Obsidian team has found them surprisingly sticky. Because they are so free and loose, some of our best ideas emerge from ramblings. They are often the source of feature ideas, small prototypes, and creative solutions to long-standing problems. That is the news for now, but go and subscribe to the Changelog newsletter for the full scoop of links worth clicking on. Such as: most language migrations are hype-driven, modern Node.js patterns, and typed languages are a better fit for vibe coding.
Starting point is 00:05:50 Get in on that newsletter at changelog.news. In case you missed it, last week on the pod we hosted Greg Osuri on solving the AI energy crisis. That episode has already generated a lot of discussion in Zulip. And on Changelog & Friends, Adam and I sat down with the 2025 Stack Overflow Developer Survey results to glean what we could glean. Give those a listen; scroll up in your feed, or down, I guess, depending on your sort order. We have some great ones coming up this week too. Both of our Denver live shows are shipping. Our interview with Nora Jones comes out on Wednesday, and on Friday, Kaizen 20 with Gerhard Lazu.
Starting point is 00:06:28 Have yourself a great week. Like, subscribe, and five-star review us if you dig the show, and I'll talk to you again real soon.
