The Changelog: Software Development, Open Source - Tech is supposed to make our lives easier (News)
Episode Date: February 10, 2025. Bill Maher excoriates the software industry for making our lives more difficult, two professors from the University of Washington put together a curriculum to help us manage life in the ChatGPT world, Daniel Delaney thinks deeply about chat as a dev tool UI, Benedict Evans explores our assumption that computers should be 'correct' & the Thoughtbot team writes up six cases when not to refactor.
Transcript
What's up nerds?
I'm Jared and this is changelog news for the week of Monday, February 10th, 2025.
We are gearing up to play some Friendly Feud.
That's the new game show on changelog and friends that smells an awful lot like the
old game show
on JS Party, and we need your help.
Please take our brand new survey featuring all kinds
of geeky questions.
Find it at changelog.fm slash feud.
Once again, that's changelog.fm slash feud, F-E-U-D.
It'll be fun.
Plus, you might win a Changelog t-shirt for the effort.
Okay, let's get into this week's news.
Tech is supposed to make our lives easier.
In the run-up to Super Bowl 59,
Bill Maher dropped a segment excoriating the software industry
and our relentless pursuit of change for change's sake.
What does that have to do with the Super Bowl?
You'll enjoy the Super Bowl this weekend while you can
because it's probably one of the last ones
to be shown on broadcast TV,
which is a shame because streaming is ruining football.
And that's Taylor Swift's job.
Maher launches into a curse-laden tirade on what he calls reverse improvement.
R.I.
The term is self-documenting, but he defines it anyway.
R.I. is making an upgrade to a popular product that nobody wants, needs, or likes.
Examples follow.
When I get a notice that my phone needs an update, it's like getting a jury duty summons.
And now when I'm in an album, all the photos in that album float by
in a slideshow at the top.
I didn't ask for that, I don't want it,
and I don't want to have to go on an expedition
to find out how to turn it off.
The subject of his ire?
Silicon Valley nerds, of course.
We're all different.
You like over-engineering stuff, but don't tell yourselves you're making anyone's life better.
He goes on and on.
It's funny and contains a lot of truth.
The entire 8:44 is worth your time.
Hopefully this segment serves as a good reminder that our focus as software developers
should always be on improvement, not merely change.
Because if the technologies we invent
don't actually make people's lives easier,
what are we even doing?
How to thrive in a chat GPT world.
Two professors from the University of Washington
put together a curriculum
to help us manage the already here,
but not evenly distributed,
new world where AI systems
saturate our information environment
with BS at a scale we've never
before encountered.
For better or worse, LLMs are here to
stay. We all read content that they
produce online.
Most of us interact with LLM chatbots and many of us use them to produce content of our own.
In a series of 5-10 minute lessons, we will explain what these machines are, how they
work, and how to thrive in a world where they are everywhere.
Chat is a bad UI pattern for development tools.
Daniel Delaney thinks deeply on a subject I've been pondering of late.
What's the ideal language for specifying
software requirements that meets in the middle
between humans and computers?
Is it English?
Is it Golang?
Is it somewhere in between?
Quote, AI was supposed to change everything.
Finally, plain English could be a programming language.
One everyone already knows.
No syntax, no rules. Just say what you want.
The first wave of AI coding tools squandered this opportunity. They make flashy demos but
produce garbage software. People call them great for prototyping, which means don't
use this for anything real." I don't have a solution to this problem yet, but Daniel
and I both agree on one thing. Chat ain't it. Quote. Current AI tools pretend
writing software is like having a conversation. It's not. It's like writing laws. You're using
English, but you're defining terms, establishing rules, and managing complex interactions between
everything you've said. This is the core problem. You can't build real software without being
precise about what you want. Every successful programming tool in history reflects this truth.
AI briefly fooled us into thinking we could just chat our way to working software.
We can't.
You don't program by chatting.
You program by writing documents.
It's now time for sponsored news.
Six Predictions for AI in 2025.
The rise of specialty models is coming according to Augment Code CEO Scott Dietzen.
He says quote, with DeepSeek and open source AIs closing the gap, concerns raised about
LLM commoditization, advances in LLM reasoning, and questions about the future of the scaling
laws, 2025 is shaping up to be a tumultuous year in AI, and it's only February."
End quote.
Scott has some interesting takes from the front lines of AI tooling.
His third prediction in the list of six is one you're likely to appreciate.
It's coding AI's increased demand for software engineers.
Check out the entire list of predictions and what the Augment Code team is doing about it
by following the link
in the newsletter
or by heading to
augmentcode.com.
Are better models better?
Here's Benedict Evans.
Quote.
Every week there's a better AI model
that gives better answers.
But a lot of questions don't have
better answers, only right answers.
And these models can't do that.
So what does better mean?
How do we manage these things?
And should we change what we expect from computers?
End quote.
Benedict's exploration of this topic is insightful and enlightening,
comparing our plight with GenAI's inability to always be correct
to the original iPod's inability to withstand being dropped on the ground.
The common thread?
Shifting expectations.
Quote. After 50 years of consumer computing, we have been trained to expect computers to be right,
to be predictable, deterministic systems. But if you can flip that expectation,
what do you get in return?
Reasons not to refactor.
I am a big fan of refactoring. So much so that in many aspects of my life, I stop to ask myself,
how can I refactor this?
But refactoring is not always a good idea.
And our friends at Thoughtbot took some time to write down six cases
when you actually shouldn't refactor a thing.
I'll let you click through for the full list,
but the first one is super important and easy to fall prey to.
So I'll include it here for all of us to note.
Sometimes you only think you're refactoring.
Quote, many people use the word refactoring incorrectly.
If we're embarking on a change
that is not really refactoring, for example,
looking at a bug or an adjustment
after a third-party change,
we can't fix it with refactoring.
What to do instead?
We need to think and talk about it differently
from refactoring.
We can stop and consider how the system will change from what to what and raise it with
teammates to discuss why it matters, what action to take, and when.
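The distinction Thoughtbot is drawing can be made concrete in code. This is a minimal, hypothetical sketch (the function names and numbers are my own, not from the episode or the Thoughtbot post): a true refactor restructures code while leaving its observable behavior identical, which is exactly what a bug fix or a behavior adjustment does not do.

```python
# Refactoring = changing structure while preserving behavior.
# Hypothetical example: the same price calculation before and after
# an extract-variable refactor. A bug fix, by contrast, would change
# what the function returns, so it wouldn't count as refactoring.

def total_before(prices, tax_rate):
    # Original: one dense expression.
    return round(sum(prices) + sum(prices) * tax_rate, 2)

def total_after(prices, tax_rate):
    # Refactored: same behavior, clearer structure.
    subtotal = sum(prices)
    tax = subtotal * tax_rate
    return round(subtotal + tax, 2)

# The defining property of a true refactor: outputs are identical.
assert total_before([9.99, 5.00], 0.08) == total_after([9.99, 5.00], 0.08)
```

If a change would make that assertion fail, it's not a refactor, and, per the quote above, it deserves to be discussed with teammates as a behavior change.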
That's the news for now, but also scan the companion newsletter for even more news worth
your attention.
Such as Oracle defending their trademark by citing Node.js, Zach Holman on non-traditional
red teams, and Redis creator antirez saying we are destroying software.
Get in on the newsletter at changelog.com slash news.
We have some awesome episodes coming up this week.
On Wednesday, we're joined by Arun Gupta to talk about his new book, Fostering Open
Source Culture.
And on Friday, Jimmy Miller returns to discuss Discovery Coding.
Have a great week, leave us a 5 star review if you liked the show, and I'll talk to
you again real soon.