The Changelog: Software Development, Open Source - The smell of vibe coding (News)
Episode Date: August 4, 2025
Alex Kondov knows when you've been vibe coding. (He can smell it.) Our friends at Charm release a Go-based AI coding agent as a TUI, Jan Kammerath disassembled the "hacked" Tea service's Android app, Alex Ellman made a website that provides up-to-date pricing info for major LLM APIs, and Steph Ango suggests remote teams have "ramblings" channels.
Transcript
What's up nerds?
I'm Jared and this is ChangeLog News for the week of Monday, August 4th, 2025.
Can I ask you a personal question?
Oops, I just did.
When was the last time you had a good cry?
You know there's evidence that crying can relieve stress for a whole week.
That's why Johnny Moroni made a website that makes you cry
by showing tear-inducing videos.
Go ahead, click to feel something.
All dried out?
Okay, let's get into the news.
The smell of vibe coding.
Alex Kondov knows when you've been vibe coding.
He can smell it.
Quote, no one would write an HTTP fetching implementation
covering all edge cases when we have a data fetching
library in the project that already does that.
No one would implement a bunch of utility functions
that we already have in a different module.
No one would change a global configuration
when there's a mechanism to do it on a module level.
No one would write a class
when we're using a functional approach everywhere."
See, he's on to you.
Ultimately, Alex doesn't care
about how the code got into your IDE.
He just wants you to care, to care about quality,
to care about consistency,
to care about the long-term effects of your work.
And if you care,
don't leave a code base's maintainability
to the weights of a model.
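To put a face on the smell, here's a hypothetical illustration (the names and code are invented, not taken from Alex's post): a codebase that already ships a shared fetch_json helper with retries, and a vibe-coded change that drops a second, one-off fetching loop right beside it.

```python
import json
import time
import urllib.request


def fetch_json(url: str, retries: int = 3) -> dict:
    """The project's existing shared helper: retries, backoff, and decoding live here."""
    for attempt in range(retries):
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return json.load(resp)
        except OSError:
            if attempt == retries - 1:
                raise
            time.sleep(2 ** attempt)  # simple exponential backoff


def get_user_profile(url: str) -> dict:
    """The smell: a freshly generated duplicate of fetch_json instead of a call to it."""
    for _ in range(5):  # different retry count, no backoff, errors silently swallowed
        try:
            with urllib.request.urlopen(url) as resp:
                return json.loads(resp.read().decode("utf-8"))
        except OSError:
            continue
    return {}
```

The second function isn't broken so much as redundant and inconsistent, which is exactly the kind of thing Alex says he can smell.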
A glam AI coding agent for your terminal.
Our friends at Charm are at it again.
This time they've built a Go-based AI coding agent
as a TUI.
It's called Crush and it's multi-model
so you can choose from a wide range of LLMs or add your own.
It's flexible so you can switch LLMs mid-session
while preserving context.
It's session-based so you can maintain
multiple work sessions and contexts per project
and it's extensible so you can add capabilities via MCPs.
Disassembling the Tea App Hack.
There's been a lot of speculation, and a whole lot of jokes,
that Tea, an app used by women to dish about men
on dating apps, was vibe coded
because of just how epically and easily
its data got leaked.
Jan Kammerath disassembled the Android app's binary
and concludes: not so.
Quote, my assumption after this initial forensic analysis
is that this app was built by a single
inexperienced developer or by a team dictated by a single inexperienced developer.
The app was likely not vibe-coded as none of the models of the past months would have
made such obvious mistakes."
Ouch.
In other words, vibe-coding would have produced better results.
This hack should have never happened.
The Tea app is AI slop.
It's gross negligence from a likely single developer
with very little experience
that should not have been allowed
to publish such an application without supervision.
The app didn't get hacked.
It willingly published sensitive,
personally identifiable information to the world.
It's now time for sponsored news.
Observability for your GitHub actions.
If you ever stared at a failing GitHub action
wondering what on earth just happened, you are not alone.
CI logs are often a black box and not the fun kind.
That's why Depot just launched GitHub Job Details,
a new observability layer for your CI/CD pipeline.
It lets you zoom in on what each GitHub Actions job
is actually doing: real build times, dependency fetches,
compute performance, and more.
No more guessing, no more click-fests through the raw logs.
Your team gets better visibility into bottlenecks,
misconfigured jobs or flaky performance
without rerunning things a dozen times.
It's observability built for CI,
not just another dashboard.
Check it out at depot.dev and thank you to our friends
at Depot for sponsoring Changelog News.
Price per token.
Is your LLM spending budget on the rise?
If so, Alex Ellman has just the website for you.
Price per token provides up-to-date pricing info
for major LLM APIs, including OpenAI, Anthropic,
Google, and more.
It pulls data from openrouter.ai
and even lets you estimate the cost
of executing the same prompt across different models.
Plus, they just added image generation comparisons.
In the future, I assume every AI tool
will have a smart routing layer upfront
that dissects our prompts,
determines the sweet spot provider
of just enough quality at the cheapest possible price,
and dynamically routes it on our behalf.
In the meantime, tools like this one from Alex
could save you a bundle.
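For a sense of the arithmetic a site like this relies on: providers quote prices per million input tokens and per million output tokens, so a prompt's cost is just two multiplications and a sum. Here's a minimal sketch in that spirit, with made-up model names and prices standing in for whatever the site currently lists:

```python
# Hypothetical per-million-token prices in USD; real numbers come from the providers
# (or an aggregator like openrouter.ai) and change often.
PRICES = {
    "model-a": {"input": 3.00, "output": 15.00},
    "model-b": {"input": 0.25, "output": 1.25},
}


def prompt_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request: tokens / 1M times the per-million rate, in and out."""
    p = PRICES[model]
    return (input_tokens / 1_000_000) * p["input"] + (output_tokens / 1_000_000) * p["output"]


def cheapest(input_tokens: int, output_tokens: int) -> str:
    """The 'smart routing layer' idea in miniature: pick the lowest-cost model for this prompt."""
    return min(PRICES, key=lambda m: prompt_cost(m, input_tokens, output_tokens))


if __name__ == "__main__":
    for model in PRICES:
        print(f"{model}: ${prompt_cost(model, 2_000, 800):.6f}")
    print("route to:", cheapest(2_000, 800))
```

A real router would also have to weigh output quality, as mentioned above, not just price, but the cost side really is this simple.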
If you're remote, ramble.
Steph Ango of Obsidian fame makes a suggestion
for remote teams of two to ten people, like his.
Quote, create a personal ramblings channel
for each teammate in your team's chat app of choice.
Ramblings channels let everyone share what's on their mind
without cluttering group channels.
Think of them as personal journals or microblogs
inside your team's chat app. A lightweight way to add ambient social cohesion." These channels should be at the
bottom of the list, muted by default, with no expectation that anybody ever reads them.
The Obsidian team has found them surprisingly sticky. Because they are so free and loose,
some of our best ideas emerge from ramblings.
They are often the source of feature ideas, small prototypes, and creative solutions to long-standing
problems. That is the news for now, but go and subscribe to the Changelog newsletter for the
full scoop of links worth clicking on, such as: most language migrations are hype-driven,
modern Node.js patterns,
and typed languages are a better fit for vibe coding.
Get in on that newsletter at changelog.news.
In case you missed it, last week on the pod we hosted Greg Osuri on solving the AI energy crisis.
That episode has already generated a lot of discussion in Zulip. And on Changelog & Friends, Adam and I sat down with the 2025 Stack Overflow Developer
Survey results to glean what we could glean.
Give those a listen, scroll up in your feed or down, I guess, depending on your sort order.
We have some great ones coming up this week too.
Both of our Denver live shows are shipping.
Our interview with Nora Jones comes out on Wednesday and on Friday, Kaizen 20 with Gerhard Lazu.
Have yourself a great week. Like, subscribe, and 5 star review us if you dig the show,
and I'll talk to you again real soon.