The Changelog: Software Development, Open Source - Wisdom from 50+ years in software (Interview)
Episode Date: March 30, 2022
Today we have a special treat. A conversation with Brian Kernighan! Brian's been in the software game since the beginning of Unix. Yes, he was there at Bell Labs when it all began. And he is still at it today, writing books and teaching the next generation at Princeton. This is an epic and wide-ranging conversation. You'll hear about the birth of Unix, Ken Thompson's unique skillset, why Brian thinks C has stood the test of time, his thoughts on modern languages like Go and Rust, what's changed in 50 years of software, what makes platforms like Unix and the web so powerful, his take as a professor on the trend of programmers skipping the university track, and so much more. Seriously, this is a _must-listen_.
Transcript
Welcome back. This is The Changelog. Thank you for tuning in. Today we have a special treat,
a conversation with Brian Kernighan. Brian's been in the software game since the beginning of Unix.
Yes, he was there at Bell Labs when it all began, and he is still at it today,
writing books and teaching the next generation of technologists
at Princeton. This is an epic and wide-ranging conversation. You'll hear about the birth of Unix,
Ken Thompson's unique skill set, and how he bootstrapped it in three weeks,
why Brian thinks C has stood the test of time. We get his take on modern languages like Go and Rust.
Here's a teaser. He's bullish on one and the other one, not so much.
We ask him what's changed in 50 years of software and what's remained the same. What makes platforms
like Unix and the web so powerful? His take as a professor on the trend of programmers skipping
the university track and so much more. Seriously, this is a must listen. Oh, and to top it all off,
Changelog++ members, stick around to the end.
You get a bonus five minutes of Adam and Brian
nerding out about the Raspberry Pi.
Changelog++, it's better.
Special thanks to our longtime partners at Fastly.
They've been shipping all of our shows for years,
ensuring they reach you super fast wherever you listen.
Check them out at fastly.com.
Okay, Brian Kernighan on The Changelog.
Let's go.
This episode is brought to you by our friends at Square.
Millions of Square sellers use the Square app marketplace
to discover and install apps they rely on daily
to run their businesses. And the way you get your app there is by becoming a Square App Partner.
Let me tell you how this works. As a Square App Partner, you can offer and monetize your apps
directly to Square sellers in the App Marketplace, reaching millions of sellers. You can leverage the
Square platform to build robust e-commerce websites,
smart payment integrations, and custom solutions for millions of businesses.
And here's the best part. You get to keep 100% of revenue while you grow. Square collects a 0% cut from your sales for the first year or your first 100 Square-referred sellers. That way,
you can focus on building and growing your Square customer base and you get to set your own pricing models. You also get a ton of support from Square.
You get access to Square's technical team using Slack. You get insights into the performance of
your app on the app marketplace. And of course, you get direct access to new product launches.
And all this begins at changelog.com slash square. Again, changelog.com slash square.
Well, Brian, first of all, we really appreciate you joining us on The Changelog. Welcome.
Thank you. It's a pleasure to be here.
It's a pleasure to have you. You've been in the industry a very long time.
In fact, you have written UNIX: A History and a Memoir, because you were there.
You were there when Unix began. Take us back.
You don't have to tell the whole story. Of course, people can read the book. You've put the work in to write it all down. But take us back to that time period and just
paint a picture of what it was like when Unix was born. Right. Yeah. Well, I guess the proper way to
describe it is present at the creation, but not responsible for it. But I was a grad student at
Princeton in the mid-to-late '60s. And I got a summer internship one year at MIT in Project MAC, which was basically building Multics, a very large information utility, so-called.
And then the following year, 1967, I guess, I got an internship at Bell Labs in Murray Hill, and people there were still working on Multics.
I had nothing to do with that.
But I spent two summers of internship at Bell Labs, had a great time.
And so I went there permanently in 1969, very early 1969, and found myself in the same group as Ken Thompson and Dennis Ritchie.
And at that point, I guess the right way to say it is they were suffering withdrawal symptoms, because they had been working on Multics, which was a very big, very interesting computing system. And they had really, really good
computing facilities. But then very early in 1969, Bell Labs withdrew from the Multics project
because it was clear it wasn't going to produce what it was supposed to produce, at least on a timescale that the labs cared about.
And so this left Ken, Dennis, and a variety of other people fundamentally with a taste for a nice computing environment, but no way to satisfy it, really.
And so they spent a fair amount of time basically doing what you would call paper designs, sketching things on blackboards and so on.
But as part of that, very early in that year,
I think Ken Thompson found what is traditionally described as a little-used PDP-7, a machine that was pretty obsolete already in 1969. Nobody was really using it. And he used that as a vehicle for experimenting with file systems.
with file systems. And at some point, this famous comment that he made that he had three weeks when his wife took their son off to California to visit the relatives.
And in three weeks, he put together what was basically the first Unix system, a proto version of Unix, because he already had the file system.
He needed an exec call, a shell, an assembler, things like that.
So three weeks, and we have the first version of Unix.
And that was in 1969. So you could argue then that 2019 would be the 50th anniversary of Unix.
And the Unix history book that I wrote that you mentioned, basically, I put that together as an
attempt to capture some of what I remembered and what lots of other people remembered of what was really kind of a golden era in computing.
Certainly the early days of Unix are very interesting.
The evolution of it over the next 20 or 30 years
is very interesting, at least to me.
And so that was the excuse for trying to write a book.
I think that's great because preserving this knowledge
is really so important.
And obviously, having someone who was there at the born-on date, maybe not so much a contributor to actually making Unix a thing those first three weeks, as you mentioned with Ken Thompson, having that memory is super important.
I've listened to other interviews you've done with Lex Fridman and others just describing some of this history.
I think it's really important to draw those lines from past world computing to today's
world computing.
You know, I think about this PDP-7, maybe it was obsolete, but you could still make Unix on it.
And I draw the conclusion of like, say, a Raspberry Pi today.
You know, like the difference in terms of size and power is just profound.
I think for anyone listening to kind of go back and think like, wow, this is how it began.
This is what came out of it. This is the foundation that's been laid because Unix
is the foundation that we all build upon still to this day. You know, there was the lack of freedom of Unix, which turned to universities, which turned to this Linux system that was open source, and this whole
movement that we're still sort of in. So I'm personally going to go back and read this book. I haven't yet, but I plan to because that knowledge is so important to preserve,
one, but then two, to reflect on, to know what the future might be because of what the past was.
Right. It's interesting. I don't know whether you can extrapolate, but the question of what
has changed over the last 50 years or so is just astonishing when you think about it.
It's sort of pretty much everything has changed.
You know, when I started programming, let's call it in 1965 or something like that, I
used Fortran and we used punch cards.
Remember punch cards?
You guys have never seen punch cards.
Sorry.
I've seen a picture of a punch card.
Yeah.
Yeah.
And that summer I spent at MIT in 1966 was a revelation because they had timesharing.
That is, there was a central computer and you accessed it from various kinds of remote facilities, some of which even used a phone system.
So in effect, it was kind of like cloud computing.
The early cloud.
Yeah, exactly. But when I went back and finished my degree at Princeton, I was still, unfortunately, on punch cards in 1969.
And computers were expensive.
I mean, literally millions of dollars.
The computer I used there was an IBM 7094,
and it cost multiple millions of dollars.
It lived in a huge air-conditioned room.
And it was exceptionally slow,
probably a million times slower than computers today.
It had tiny memory, 64K, no, 32K, 36-bit words or something like that, really,
but physically huge. But Moore's Law came to the rescue, right? Things sort of get better,
let's call it every 18 months, give or take, things get twice as good. And so in 15 years,
that's a factor of 1,000. In 30 years, it's a factor of a million.
45 years, it's a billion.
And we're nearing 60 years from when Gordon Moore made his comments about doubling back in 1965.
So smaller, cheaper, faster, better, exponential.
And that's what makes a lot of this stuff go.
That's why you and I can have a conversation even though we're scattered all over the place.
And we all have these powerful computers and
everybody is carrying an exceptionally powerful computer with them all
the time. And it's connected to everybody else's powerful computer. So
an enormous amount has changed. It's very different.
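Those factors check out if you take a doubling every 18 months as the assumption; the cadence is the loose part, since it has varied over the decades. A quick back-of-the-envelope sketch in Go:

```go
package main

import (
	"fmt"
	"math"
)

func main() {
	// Assumption: capability doubles every 18 months (1.5 years).
	const doublingYears = 1.5
	for _, years := range []float64{15, 30, 45} {
		doublings := years / doublingYears // 15 / 1.5 = 10 doublings, etc.
		factor := math.Pow(2, doublings)   // 2^10 ~ 1e3, 2^20 ~ 1e6, 2^30 ~ 1e9
		fmt.Printf("%2.0f years: %.0f doublings, roughly %.0e times better\n",
			years, doublings, factor)
	}
}
```

Ten doublings is 2^10, about 1,000, which is where the factor of 1,000 per 15 years comes from.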
Is there anything that's the same? Is there anything that remains? You have maybe some fundamentals
of coding or
practices, anything that just has a through line from then to today? I think there are a lot of
things that remain the same. One thing is probably just that our aspirations always exceed our
abilities. At some point, we need more memory or more speed than we have. It's hard to scale up. And so we still
run out of resources or need that exponential growth to keep going, to keep up with what
people want. The other thing that really is the same is that people are the same. We're complete
screw-ups in a variety of ways. We make mistakes. We're slow. We're unpredictable in lots of
different ways. So programmers, even very talented programmers working hard, they make bugs in
programs. The programs are often hard to change. They have clunky interfaces. And then, you know,
most people are good, but there's always a collection of bad guys. And they're still very
much with us at this point. But interestingly, something that's different: the bad guys are far away, but can still reach out to touch us through things like the internet. So, you know, a lot's the same, as well as a lot of difference.
What's interesting about the birth of Unix is that you all were co-located in a building, or a few buildings, or a room at times. Whereas things that would be invented today, that may become the underpinnings for the future, you know, the technologists are unlikely to be in that same room. Now maybe they will be, but there's so much innovation that's happening in remote collaboration that it's very possible that people who are inventing things today are halfway around the world from each other.
You know, that's such an interesting point.
I honestly don't know.
You were absolutely right that Unix work was done by a handful of people in a very small space.
Typically, they were sitting in the same room.
And that meant that the bandwidth for communication among the people was extremely high.
And I think that that is hard to replace over remote connections like the one we're using.
I think that people working together in the same space is often very productive.
On the other hand, if you look at these sort of cube farms, or not even cube farms, just, you know,
tabletop farms that lots of big companies provide now, I think that's counterproductive because you've got too many people in the same space and it's very distracting. And I've experienced that to some
extent and I hate it. I find it just very, very hard to work in that kind of environment where,
you know, you're literally three feet away from the next person. So there's some kind of trade
off there. And I don't know, perhaps the pandemic and all this experimentation with remote work, and then hybrid modes and so on, maybe that will lead to something which gives you some of the combination of the good stuff without so much of the bad stuff. We shall see.
Yeah, there's some sort of spark and creativity, and there's like something in the room sometimes. I think even teams that go remote, or are remote, they still have their summits. They still have their times where it's like, we need to get six people in a room and just hash things out. And when you're in that room, it feels different. And then you're like, okay, this three-day period we got more done than we got in the previous three weeks, because we did do that. Yeah. That being said, the ability to bring
together different minds who are geographically distributed around the world
and not have them have to sacrifice their life or their lifestyles in order to collaborate is also really powerful.
This kind of reminds me, Jared, of when we talked to Jessica about mob programming.
I don't know if you heard of this, Brian, before, but there's this idea that you can mob program.
You can get together four or five people.
It could be a designer, it could be a developer sharing the same terminal, all focused on the same problem set.
But they're probably in most cases remote.
I think in many ways that kind of is trying to recreate what you all had back in Bell Labs.
And obviously in that day, you had no choice.
That machine was not mobile, right?
So you had to be co-located with it.
And in many cases, while it was a massive machine, it was underpowered in comparison to today's machines. You even mentioned the PDP-7 was kind of obsolete, which is what Unix was born on. I kind of wonder if the reason why Unix can run on
sort of like commodity hardware is because it was designed
to work on a machine that had limits. So those limits would
always be constrained. So a piece of commodity hardware would be
limited, but eventually cheaper. I'm wondering if that might be
a similarity there. I think there's a real
element of truth in that the original PDP-7 was extremely limited. It had, if I remember correctly,
8K 18-bit words, so you couldn't get carried away there. And the first PDP-11/20 that we got was
not a heck of a lot bigger. I've forgotten, but maybe call it 24k
bytes or something like that. So it's really very small and if you don't have
much memory and the machine is intrinsically not very fast then it
enforces a sort of discipline and it also I think encourages you strongly to
think about what's the simplest mechanism that will do the job for as
many different things as possible. So find simple but general mechanisms. And certainly an awful lot of what went on in the early Unix was exactly
that. I mean, think about the file system. The idea of the hierarchical file system came from
Multics, probably others as well. But the Unix implementation of it was extremely straightforward
and simple, very clean. The system calls to access it, what, there were half a dozen at most, and you could do anything you wanted. And all files were the same.
And then it's kind of this freebie idea that went along with it, that devices were files in the file
system as well. So that was an example of a very clean, simple idea, hierarchy. And then, gee, a
generalization, we could put more than just disk files into this mechanism,
and it will work the same way.
And so I think a lot of that was encouraged because there were not a lot of resources.
And so contrasting today where, you know, for most people,
the memory on their computers is infinite, and their bandwidth is infinite,
and the computers are so fast it doesn't matter.
And so you can trade off.
And for most purposes, a fine tradeoff, just waste more of the computer to make the people more productive in some way.
But there are times when you can't do that.
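That uniformity is still visible today: the same open, read, and close operations work on a regular file and on a device node alike. A minimal sketch in Go, which just wraps those same system calls; the two paths are stand-ins and assume a Unix-like system:

```go
package main

import (
	"fmt"
	"os"
)

// readSome opens any path, regular file or device, with the same
// open/read/close calls and returns up to n bytes.
func readSome(path string, n int) ([]byte, error) {
	f, err := os.Open(path) // open(2) underneath
	if err != nil {
		return nil, err
	}
	defer f.Close() // close(2)

	buf := make([]byte, n)
	k, err := f.Read(buf) // read(2)
	if err != nil {
		return nil, err
	}
	return buf[:k], nil
}

func main() {
	// Exactly the same code path for a regular file and a device file.
	for _, path := range []string{"/etc/hostname", "/dev/urandom"} {
		b, err := readSome(path, 8)
		if err != nil {
			fmt.Println(path, "error:", err)
			continue
		}
		fmt.Printf("%s: % x\n", path, b)
	}
}
```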
So I want to ask a question about Ken Thompson.
There's a lot of people in that room.
You've named a few yourself as well. So he's not like singularly to credit for these things, but it seems like he was an amazing software developer
and you've taught probably hundreds, probably thousands of software developers down through
your time teaching, not to mention through your books. I'll mention your C programming language
book was not my first book in college. It was my second book.
I actually started out, my intro to programming was a C++ course.
Didn't do so hot.
And I took C.
And so I got your book in that one.
And I actually thought C was easier than C++
because your book made it very easy for me to understand.
So thank you for that.
You've taught a lot of software people over the years.
You've seen a lot of software developers yourself.
You know, co-authored AWK.
You have your own bona fides when it comes to writing code. Was Ken Thompson, do
you think, a unique, amazing coder? Was he in the right place at the right time? Is he
a standout in terms of just like once in a generation kind of a software developer? Or
are there a lot of people that you've seen that have been just as good as he was, but
he happened to have that nugget? He happened to be in the right place at the right time with the right idea and the right people?
I think he's a singularity. I have never seen anybody else who's in the same league as him. You know, I've certainly met a lot of programmers who are very good. And, you know, some of my students, sure. The people I worked with at Bell Labs, very good. But Ken is in a different
universe entirely as far as I can tell. And it's a
combination of a bunch of things. I mean, just being able to write code
very quickly, that works. Very, very well done
code. But also this insight into solving the right
problem in the right way and just doing
that repeatedly over all kinds of different domains. I've never seen anybody remotely like
that in any setting at all. One night, he and Joe Condon and I, we had gotten a new typesetter at Bell Labs.
It was basically a device controlled by a very small computer inside, a Computer Automation Naked Mini, if you wish to know.
You know, just a generic kind of mediocre 16-bit computer.
And the typesetter came with really awful software.
And so you couldn't figure out what was going on, and of course you didn't get source code; you just got, all right, something that ran. And so Ken and Joe and I were puzzling over what to do with this thing. And late afternoon, I said, I'm going home for dinner, I'll be back in a while. And I came back at sort of seven or eight o'clock at night, and Ken had written a disassembler for this thing, so that he could see what the assembly language was, so that he could then start to write, well, of course, now you write the assembler, and, you know, that kind of thing. In a couple of hours, he had built a fundamental tool that was then our first toehold into understanding the machine. Now, you know, writing a disassembler is not rocket science, but on the other hand, to put it together that quickly and accurately on the basis of very little information. Now, this is before the internet, when you couldn't just sort of go and Google for what's the opcode set of this machine. You had to find manuals and all this kind of thing. So, you know, off scale. And he just kept doing that over such a wide domain of things.
I mean, we think of Unix, but he did all this work on the chess machine where he had the
first master level chess computer.
That was his software.
And he wrote a lot of CAD tools that made it go as well.
And, you know, he built a thing that was like the Sony Walkman with an MP3-like encoding
before anybody else did because he talked to the people who knew how to do speech
coding down the hall. It's on and on and on and on.
You've said before that programming is not just a science, but also an art, which leads me to believe that for some reason Ken was blessed with this art side of the science. So you can know how to program, and you can know how to program well, with less bugs. But to be able to apply the thinking to a problem set in the ways you described Ken, what do you think helped him have that mindset, without describing his, for lack of better terms, genius? Like, how did he begin to solve a problem, do you think?
You know, I actually don't know. I suspect part of it is that he had just been interested in all kinds of things.
And, you know, I didn't meet him until he and I arrived.
He arrived to the labs a couple of years before I did.
And then we were in the same group for many years.
But his background, I think, originally was electrical engineering.
He was much more of a hardware person, in fact, than a software person originally, and perhaps that gave him a different perspective on how things work, or at least a broader perspective. I don't know about, let's say, his mathematical background. But, for example, you mentioned this art and science: he built a regular expression recognizer, which is one of these things that dynamically adapts to what's going on, so that it can process things in linear time, which, if you did it dumbly, would be exponential in either space or time. Basically a lazy evaluation mechanism for regular expression evaluation. And it just goes on. How do you get there? I don't know. I'm not that person.
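What he's describing is the trick behind that recognizer: instead of backtracking, you keep the set of all states the machine could be in and advance the whole set one input character at a time, so the cost is bounded by pattern length times text length. A toy sketch of the state-set idea in Go, for a deliberately tiny pattern language of literals, '.', and 'x*'; this illustrates the principle, it is not Ken's code:

```go
package main

import "fmt"

// item is one element of the tiny pattern language: a literal byte
// or '.', optionally starred.
type item struct {
	ch   byte
	star bool
}

func compile(pat string) []item {
	var items []item
	for i := 0; i < len(pat); i++ {
		it := item{ch: pat[i]}
		if i+1 < len(pat) && pat[i+1] == '*' {
			it.star = true
			i++
		}
		items = append(items, it)
	}
	return items
}

// addState adds state i plus, via epsilon moves, every state a
// starred item lets us skip past (a '*' may match zero times).
func addState(set map[int]bool, items []item, i int) {
	for !set[i] {
		set[i] = true
		if i < len(items) && items[i].star {
			i++
		} else {
			return
		}
	}
}

// match reports whether pat matches all of text, advancing every
// possible state in lockstep: O(len(text) * len(pat)), never exponential.
func match(pat, text string) bool {
	items := compile(pat)
	cur := map[int]bool{}
	addState(cur, items, 0)
	for i := 0; i < len(text); i++ {
		next := map[int]bool{}
		for s := range cur {
			if s == len(items) {
				continue // already at the accepting state
			}
			it := items[s]
			if it.ch == text[i] || it.ch == '.' {
				if it.star {
					addState(next, items, s) // '*' may match again
				} else {
					addState(next, items, s+1)
				}
			}
		}
		cur = next
	}
	return cur[len(items)]
}

func main() {
	fmt.Println(match("a*b.d", "aaabcd")) // true
	fmt.Println(match("a*a*a*", "aaaa"))  // true, and still linear time
	fmt.Println(match("ab", "ac"))        // false
}
```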
So if we go from Ken Thompson to Brendan Eich now.
So Ken Thompson, this famous three-week stint,
probably wasn't Mountain Dews in the room,
but I just imagine him at a terminal, just pizza delivered,
not leaving the room.
Who knows what actually happened, but there's your stereotype.
Brendan Eich, infamously perhaps
designed JavaScript in 10 days
and that was a circumstance where I think it was a pressure cooker.
They had 10 days to do this, I think.
Whereas Ken, his pressure cooker was like my wife and kids
out of town for a few weeks and I can do this.
But it's just interesting, these two platforms,
so to speak, one the programming language,
one an operating system, of course,
both, at least the core of them,
designed or implemented or both
in such a short amount of time.
Do you think this is just a coincidence?
Do you think there's something to this?
What are your thoughts on that?
It's an interesting parallel.
I don't know Brendan Eich at all.
I've never met him or anything like that.
Yeah.
I think JavaScript, I mean, you can dump on it, but, you know, it has an enormous effect on the world.
And I think it was an excellent piece of work.
And 10 days, sure.
More credit to him for being able to pull that off.
I think, is there anything to be learned by saying here are two
examples, therefore it could be done more broadly than, you know, everything we do can be done in a
couple of weeks? Probably not. But in some respects, the core pieces of these things are
relatively simple. So suppose I was going to create a Lisp interpreter, right? So I could,
not me personally, but lots of people could probably put together a Lisp interpreter in a day or two because it's fundamentally simple and core.
You can get off the ground very quickly.
And then you can spend a lot more time making it more efficient or more expressive or whatever.
But fundamentally, it's pretty straightforward.
And I think that the same kind of thing would be true of an interpreter like JavaScript.
My personal experience, and not to be compared with Brendan Eich at all, but for example,
AWK, which is a programming language that Al Aho and Peter Weinberger and I did, we thought
about the design for a few weeks.
And then Peter Weinberger went off and built the first implementation over a weekend. And so that's an interpreter at, very loosely, the same kind of level as JavaScript. The reason that Peter was able to do that over a weekend is twofold. I mean, one, he's a very, very smart, experienced programmer. The other, he had good tools to build it with.
So he was able to use YACC, the compiler-compiler,
to define the grammar and hang semantics on it.
He was able to use Lex, the lexical analyzer generator,
to do lexical processing.
And then the rest of it is, well, you just build the tree
and then you build the thing that walks the tree.
And if you've done that before, the next one is easier.
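That "build the tree, then build the thing that walks the tree" recipe is small enough to sketch. A hypothetical miniature in Go: an expression tree built by hand stands in for what yacc and lex would produce, and the evaluator walks it:

```go
package main

import "fmt"

// node is one node of an expression tree: a number at a leaf,
// or an operator with two children.
type node struct {
	op          byte // 0 for a leaf, else '+', '-', '*', '/'
	val         float64
	left, right *node
}

// eval walks the tree; this is the whole "interpreter" once the
// parser (yacc/lex territory) has built the tree for you.
func eval(n *node) float64 {
	if n.op == 0 {
		return n.val
	}
	l, r := eval(n.left), eval(n.right)
	switch n.op {
	case '+':
		return l + r
	case '-':
		return l - r
	case '*':
		return l * r
	default:
		return l / r
	}
}

func num(v float64) *node { return &node{val: v} }

func bin(op byte, l, r *node) *node { return &node{op: op, left: l, right: r} }

func main() {
	// (1 + 2) * 4, built by hand in place of a generated parser.
	tree := bin('*', bin('+', num(1), num(2)), num(4))
	fmt.Println(eval(tree)) // 12
}
```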
And so I don't know, for example, whether Brendan Eich had done some other kind of language work before that gave him a leg up.
I believe so.
I wouldn't be surprised.
And so experience like that does help you to build something quickly, but also to see which parts matter, which parts are the ones that get you off the ground
and which you can just kind of ignore for a while.
So my guess is that lots of things can be built.
Call it the MVP.
It's a badly overused acronym, but, you know,
what's the minimum thing that would actually kind of prove your point,
be useful, and tell you whether you want to go further or not
And so my experience is small tools, small languages, things like that. But you can see other things where people get off the ground quickly. I mean, what about Wordle? I've never played Wordle, but, you know, gee, I look at it and think, wow, that's kind of neat. What an idea. And how hard could it be to build that? Probably not terrible. I don't want to denigrate the guy who did it, because it's a really nice piece of work.
Yeah, I mean, the game mechanics are, I think, where the genius is in Wordle. And I think, to your point, it has been recreated over and over again. It has inspired clones and ports, and Wordle in this language and Wordle in that language and Wordle with this word set. I think there's even like a POSIX version of Wordle.
Because the problem set is well-scoped and well-defined
and it's not that complicated. But there are some interesting aspects of it which
make it a fun thing to build. So I think that's a good point there.
It's the constraints, really.
The constraints seem to be the thing.
Like one, you hear this again and again in history where it's like someone created a logo for a brand and they did it in five minutes.
Well, it actually didn't take them five minutes; it actually took them maybe 15 or 10 or five years of experience to then be so efficient.
And as you had said, Brian, with the right tooling at the time of creation.
So it's kind of like constraints plus experience, you know, plus tooling that really enable
this creation to be so condensed.
Yeah.
Yeah, I think so.
You have to have all of the preparation, your own expertise and experience, and an environment or infrastructure or whatever
that supports what it is you want to do. I mean, look at Napster. Napster's at this point over 20
years old, but very, very neat idea. Not too hard to get off the ground, given that you've got the
internet and you've got tools like Visual Basic or whatever for building Windows interfaces. You've
got a fairly homogeneous environment for people to play with it.
So again, not to denigrate Shawn Fanning for what he did there,
but given all of that stuff and given that he's probably thought about it very hard for a long period of time,
then putting the thing together is not that bad.
Say I, never having done it, of course, but that's a different story.
It reminds me of this story, this old, you know, this Bible preacher preaches a message and then
afterwards one of the hearers goes up and asks him like, well, thank you so much. How long did
it take you to put that thing together? You know, how long did you work on this? And he said, I
worked on it my whole life. And so that's kind of what it is. Like your life of preparation and
experience actually puts you in a place
and a time with a skill set and a perspective that makes things that are amazing,
even though in and of themselves,
they may just be like this small scoped, constrained thing.
But it's that combination of it that really brings it all together.
Precisely. Yeah.
This episode is brought to you by our friends at Influx Data.
Act in time.
Build on InfluxDB.
This is the platform developers use to build time series applications.
And today I'm joined by Barbara Nelson, VP of Application Engineering.
Barbara, we're working together to share some behind the scenes there at Influx Data.
And one of the things you've shared time and time again is this idea of meeting developers where they are.
What do you mean by that?
This is really important to us that we're not expecting developers to make wholesale
changes to their product or application to try and leverage the
power of our platform. So it's a mindset both in terms of the capabilities of what we deliver and
how we deliver them. So why do we have the client API in 12 different languages? Because we're
meeting developers where they are in 12 different languages. We're not going to tell them, if you
want to use our platform, you have to use Python. If you're using C Sharp, you use our platform in C Sharp.
That mindset of meet the developers where they are means we sometimes end up building
multiple versions of the same thing, but for different target audiences.
So a lot of the capabilities that we expose in our web UI, we also expose in our VS Code
plugin. Some developers are spending
all their time in VS Code, so they want a solution that works where they are today. And so that is a
really important focus that we're not trying to tell the developers, you know, you need to change
to use our platform. It's what are the ways that we can make our platform accessible to you the way you work today,
the way you develop your application today?
And so that mindset has been really important.
It means that we often develop capabilities at different levels.
So we'll have the same capability.
You can access it through our web UI.
You can also access it through a set of command line scripts.
You can also access it directly via API calls. And all of that gives the developers the flexibility to use
the platform the way it works best for them. Okay, you heard it here first. Meet developers
where they are. That's the mindset of Influx Data, how they build, how they ship, how they think about
DX in terms of you using this platform to build your next time series application.
Bring your preferred languages.
InfluxDB integrates with 13 client libraries, C Sharp, Go, Ruby, of course, JavaScript, and so many more.
Learn more and get started today at influxdata.com slash changelog.
Again, influxdata.com slash changelog.
So, C. Let's talk about the C programming language. I mentioned how your book has taught probably multiple generations at this point how to code in C. You co-authored that with Dennis Ritchie, the creator of C. And C has been extant in huge numbers for many years, and continues to be today a very viable
and powerful programming language that people probably are picking up right now and writing
something new in C as we speak years and years and years after its inception and creation.
What do you think it is about C that has accounted for the longevity of its success?
I think probably it hit sort of a sweet spot among a bunch of competing or important areas. It's efficient, and it was really important that it be efficient at the time. This was, Dennis did it originally in the very early 1970s, because, as we mentioned earlier, machines were not very powerful and didn't have much memory.
So efficiency, expressiveness, it really let people say fairly clearly and easily what they wanted to say in a form that was a good match to what was going on in the hardware underneath it.
You could see a mapping between what you wanted to say and what the computer would actually do underfoot.
It was complete in the sense that you didn't need anything else. You could write useful stuff
with nothing beyond that. And I think it was completely comprehensible to programmers. So
you could pick it up and you could learn how to use it fairly quickly and fairly well. And I don't
think any language has, other language has done that quite so well.
I mean, obviously every language has things that it does very well, things that it's perfectly adequate for, and other places where people complain. And C is like that; it has lots of flaws.
A lot of those are historical necessity because of limited resources. But I think it's outweighed by that combination of efficiency and expressiveness
and suitability for the job. The other thing about C and the reason why it's still there,
I would say, or at least one of the reasons is that it has benefited over and over again by
waves of hardware evolution. So it started with minicomputers like the PDP-11. It was there for
the workstation market like Sun Microsystems and lots of others.
In fact, the existence of C and Unix enabled that workstation marketplace in the late 70s,
early 80s. It was there for the IBM PC and all of the follow-on machines of that. So that's a
third wave. And we see embedded systems at this point, little tiny computers for which C remains suitable and probably best
because you need that efficiency, speed and memory use, and often no runtime support.
Right.
So all of those things, I think, keep giving C like another burst of life.
And we'll probably keep it going for a while.
Yeah, it seems like mobile, the advent of mobile and IoT, has really added to the longevity of those kinds of languages. Whereas we used to go higher and higher up the stack, more abstractions, memory management, et cetera, et cetera, scripting languages, because those constraints were lifted in many situations. But all of a sudden, there was a reset back to highly constrained devices when mobile took off.
And of course, the mobile phones now are very powerful
compared to what they were 10 years ago.
But your refrigerator probably doesn't have a very powerful chip in it, or your dishwasher, or these things that people are coding for.
Are there any problems today, like a specific domain, like a text editor or something, where somebody said, I'm going to write a brand new thing, and you would say, Brian, you should pick up C and write it in C? Or would you never advise C today?
I think probably, unless you are in one of these resource-constrained environments,
like clearly right up front
that you're going to be resource constrained and, you know, the improvement of hardware isn't going
to rescue you in the next couple of years. I would not start with C. I really wouldn't.
And then it depends, you know, what is your application? So for example,
some random kid at school wants to know what's the first programming language to learn.
Python, probably, because you can do all kinds of neat things with it.
It is very expressive.
It is adequately efficient for most purposes.
And it has an enormous library of stuff that just all gets, it's really easy to use.
So that generic question of what the first programming language might be, not C.
That'd be nice if people did, but I think Python, for many purposes, would be a better choice.
And of course, the reason that Python works so well in many cases is that very often what you think of as a Python function or module is, in fact, just a bunch of C code through a foreign function interface.
True. Now there's been a concerted effort of late to replace
many of our core infrastructure projects that are written in C
our routers, our web servers, our proxies, our you-name-it, with memory-safe languages like Rust.
What do you think of that effort? Are you for it? Do you think it would succeed?
Is there just too much C code out there that it'll always exist and be executable on our servers? What do you think about that?
Well, there's software, like operating systems, that really does have to access memory in an unconstrained way. And then unfortunately, that translates into
the programs that ordinary mortals like me write, where you access the wrong memory in an
unconstrained way, and things go bad in a variety of ways. And so C has lots of that problem.
And so replacing critical pieces of software with something where that memory corruption or
access out of range or all these other kinds of things where that is in effect legislated out of existence.
That sounds like a great idea.
Is Rust the right language for that?
I don't know.
I've never gotten into Rust.
My one foray into it foundered on the fact that the language and its documentation were changing at high speed, and differently.
And so I couldn't get something that worked.
Well, you're an early adopter. You came in early.
Well, adopted, unfortunately, is the wrong word.
It was an early abandoned ship.
An early abandoner.
I mean, Rust has clearly many positive properties,
but I just don't have anything to say about it.
But the basic idea, I think, is perfectly sound.
The problem is if you go through and try to improve the infrastructure or any program, what you're doing is changing
things. And so what you want to do is do it in a way where the external properties, all external
properties remain the same, but the internal properties are better. And it's hard to do that.
And so the question is whether the improvements that you're making will improve it, or will you just change behavior in invisible ways? Will you head off
bugs or will you create new bugs? And so the short answer is, I don't know. And I don't think our
ability to test and verify programs is at the state where you can be really sure. Even just making simple changes is hard work to make sure that they're correct.
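To make the contrast he drew concrete: in C, an out-of-range write scribbles on whatever memory happens to be adjacent, and the program may limp along corrupted; in a memory-safe language, the same mistake is stopped at the faulty access. A small Go illustration of that "legislated out of existence" behavior:

```go
package main

import "fmt"

func main() {
	buf := make([]byte, 4)

	// In C, writing buf[7] on a 4-byte array would silently clobber
	// neighboring memory. Here the runtime checks the bound and panics
	// at the exact faulty access instead.
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("caught:", r) // index out of range [7] with length 4
		}
	}()
	i := 7 // a computed index, so the compiler can't reject it statically
	buf[i] = 1
}
```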
I was reading something the other day about how the Linux kernel,
they're proposing to update the version of C there from whatever it is.
It's probably C88 or something.
Improve that or upgrade it to a much more modern version of C.
And I suspect that's going to be hard work because you've got, what, 20 million lines of C code there.
And can you do that without breaking something?
Hard to say.
It's a lot of lines of code.
Given that, then, let's maybe hypothesize a bit then or maybe share some ideas here on not throwing the baby out with the bathwater.
So if the baby in this case is C and there's lots of lines of code out there, that is the baby, right?
And the bathwater is the insecurity and the memory safe concerns for particular software that lives network connected.
So web servers, things like that, routers and whatnot.
And this is where the attack surface lives. If the baby is C and there
was no alternative called Rust or a future language or more modern language that, you know,
sort of diminishes these concerns, how would you propose or suggest or whatever we not throw the
baby out with the bathwater, and modernize C in a way that it becomes memory safe? What can we bolt on to C to make it safe? Is there a possibility to just augment C to address the memory safety concern?
Or I don't know enough about the language to sort of go deep with you on that.
So I'm just curious if there's a way to keep C, but memory safe it.
There's a thread in at least academic settings and probably others as well,
which says, let's take C, but then do something that makes it safe. And so there are languages
like safe C. And there are people who make subsets of C. Here's the safe subset or the
verifiable subset or the trusted subset or whatever. And so these have been an active area of research for decades at this point. I don't
think any of them have had a measurable effect on practice. Yeah, the only one that I had any
real experience with, not very deep, is with automotive software. So a lot of the software
that runs in your cars is written in C for good and sufficient
reasons. And I worked for a while with Gerard Holtzman, who at the time was, he was a colleague
at Bell Labs. He was then at JPL. And he was interested in how do you make reliable software
for basically space missions, like the Mars rover and that sort of thing. The automotive industry uses C, and they have a standard,
MISRA, the Motor Industry Software Reliability Association,
or something like that.
It's a standard for how do you write C so that it will be safer.
And some parts of that standard are machine enforceable.
You know, you shall not do this, you may not do that,
and we can check it mechanically.
And some of them are more like statements of good intentions, which are not checkable. And people try to stick to that
standard in the field, but it's imperfect. And so your cars still have potentially software problems.
And I suspect the same is going to be true across the board, that you can improve the situation with C code. Some combination of tools and checkers, some combination of limitations.
For example, one of the standards for the spacecraft is that
you do not do dynamic memory allocation.
Okay, all memory is allocated at the beginning.
And so you don't have these, you know, multiple frees of the same block, or all these other things that go down in flames.
That sounds like fun.
So some combination of good behavior, legislated good behavior, checks, careful testing, and so on.
All of these will improve the situation, but I am a little dubious that it will completely solve the problem. And then if you come along with a language like, well, I don't know, Rust, which I don't know
enough about, it, I believe, certainly solves some of those problems of memory allocation,
but it probably has other problems as well.
In my experience, it came with an enormous collection of library stuff.
How do I know that works?
And that's going to be true of all languages, no matter what.
There's always going to be ways in which you can screw up.
So are we just hosed then, or is there hope anywhere?
I mean, how do we secure our systems?
I think you're hosed.
Okay.
Okay.
Since you mentioned cars, there's two people in particular that are pretty bullish on Rust. And obviously you've mentioned that you don't know Rust deeply enough to know the concerns, or lack thereof, if there aren't any.
But in particular, Elon Musk is known to be bullish on Rust.
And then a counterpoint is Jack Dorsey, famously created Twitter, Square, a forward thinker on Web3, which we'll probably talk about to some degree in terms of decentralization of the computer, and just obviously cryptocurrency.
Those two people tend to be, in quotes, thought leaders or influencers or mega serial entrepreneurs that have widespread, almost cult-like followings.
And therefore, they're –
Well, and investors, right?
They can actually put their money into advancements, yeah.
But those two in particular are known to be bullish on Rust.
So I just thought I'd throw it out there since you mentioned cars
and that sort of standard.
I don't know if Tesla, I know they use Python.
I know they compiled down to C++ just based on a simple Google search.
I'm not sure if that's fact-checked or not,
but they use Python in a lot of ways, and that compiles down to C++.
I'm sure they do others, but those are two folks that are in those spaces that tend to be bullish on Rust.
I will defer to their expertise in this.
I don't know.
I mean, it's clear one of their core competencies is making money.
Another is actually getting things done.
I'm full credit to both for that.
But after that, I don't know.
Software still depends a lot on detail.
Yeah.
So a language that you know more about, which is more modern than C, is Go.
In fact, you wrote a book on Go.
And so that one, while Rust was a quick abandonment for documentation and other reasons,
Go seemed to have caught your interest and kept it for a little while, at least long enough to
write a book. Do you want to tell us about Go, what it impressed upon you or why you liked it
or like it still or your thoughts on Go? Yeah. So my experience with Go is kind of weird in a way. I often spent summers at Google in New York,
and one summer I was sitting adjacent to Peter Weinberger, an old friend from AWK days, of course,
and one of the other people right there was Alan Donovan, because they were all working on Go
in New York. And so, in effect, I was an intern for Alan that summer, and I wrote some Go.
And we got to talking about the state of the art of Go books.
And his contention, and I think he was absolutely right, is that there were no really good books on Go at that point.
And so I said the obvious thing, well, then you ought to write your own Go book. And so we did it together.
But truth be told, approximately 90% to 95% of it is him.
I mean, he's an astonishingly good programmer, and he is also a very, very good writer, and he knew Go inside out.
So whatever is good in the book is Alan's work.
I am not much of a Go programmer.
I could sort of cope at the time, and I haven't done a lot with it recently. The place where I found it particularly good was that I used it for basically crawling kinds of things, where you would start up a process to go and look for
something somewhere else, and then you'd want to have a bunch of those running concurrently,
and then just grab their results as they came back. So think of it as a crawler is the simplest
example of that sort of thing. And expressing that
in Go was just so much easier than expressing it in Python threads. And it seemed to run faster in
at least the specific case that I was playing with. And so that's a part of the language that
I liked. It was culturally compatible with C. You know, it sort of looked like a modern version of
C, although there were some weirdnesses that took a while for me to get used to. And so in that sense, it all seemed pretty good. And of course, two of the three creators
were good friends, Rob Pike and Ken Thompson. And both their good judgment and implementation
skills are pretty remarkable. So it all sort of hung together in that way. But I just don't
write enough Go to have an informed opinion about,
you know, should you use that or should you use Rust or would those solve the problems of mankind
for you or not?
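What he's describing maps directly onto goroutines and channels. A minimal sketch, with hypothetical example URLs standing in for whatever is being crawled: start one goroutine per fetch, then grab results off a channel in whatever order they come back.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

type result struct {
	url    string
	status string
	err    error
}

// fetch does one lookup and reports back on the channel.
func fetch(url string, out chan<- result) {
	client := http.Client{Timeout: 5 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		out <- result{url: url, err: err}
		return
	}
	resp.Body.Close()
	out <- result{url: url, status: resp.Status}
}

func main() {
	urls := []string{ // hypothetical targets
		"https://example.com",
		"https://example.org",
		"https://example.net",
	}
	out := make(chan result)
	for _, u := range urls {
		go fetch(u, out) // one concurrent fetch per URL
	}
	for range urls { // collect results as they arrive, in any order
		if r := <-out; r.err != nil {
			fmt.Println(r.url, "failed:", r.err)
		} else {
			fmt.Println(r.url, r.status)
		}
	}
}
```

The Python-threads version of this means a thread pool, a work queue, and locks you manage yourself, which is roughly the contrast he's drawing.
Well, one of the things about Go, which it shares with C, is the simplicity,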
right? So I think Go has something like 25 keywords, maybe less, I'm not sure the exact
number, but not very many keywords. You can learn probably the entire breadth of the language pretty quickly. And it has famously resisted change for as long as it's been alive as a thing, which is over 10 years now.
Until now, with Go 1.18,
they are finally landing this new generics feature.
It's highly controversial.
Some people think, and these are gophers.
Some gophers say, we don't need generics,
we don't want generics.
We have CodeGen, that's good enough.
Others say this is going to bring
an entire new group of people into Go. It's going to make it much more expressive and useful.
Curious, your thoughts on that big feature, which has not created a Go 2.0. It's still
backwards compatible, but it is complex. It took a couple of years to get in. Lots of iteration on
the design of the feature, and now it's landing. Do you think this is good for Go as a language?
Do you think it's perhaps departing from its simplicity? What are your thoughts? The same disclaimer, I guess,
is that I'm not writing enough Go to have a really informed opinion. I think in some settings,
generics are actually helpful because that way you write the code once and then instantiate it
with different types. And that way you don't have to think about it as much.
And certainly I've used languages with generics, C++ a bit, Java more, and they're very helpful
for certain kinds of things, no question at all. So does that then follow in Go? I'm not sure.
As I say, I don't know enough. I think the answer is probably yes, but the reasoning is more based on the people and the process by which Go changes than on the technical content.
The Go evolution process has been exceptionally careful and cautious.
You know, Go remains backward compatible right back to the beginning.
If you wrote a Go program 10 years ago, it'll still work, no problem. And that is something that, well, we mentioned Rust,
which seemed to be changing very rapidly, at least for a while. And Python certainly had
been bitten by Python changes going from two to three and et cetera. So if they make a change
of substance, like the addition of generics, it's been exceptionally carefully considered
by people who actually know what they're doing. Now, the fact that the people are still debating it says, well, people can differ.
Difference of opinion is what makes horse racing.
And so I don't know enough about it to have anything more than, well, I'll put my faith in people who have actually studied it hard and decided in the end that it's a worthwhile
thing to do.
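For the concrete flavor of what landed in Go 1.18: a generic function is written once and instantiated with different types by the compiler. A minimal sketch; the Number constraint here is illustrative, not from the standard library:

```go
package main

import "fmt"

// Number constrains the type parameter to ordinary numeric types.
type Number interface {
	~int | ~int64 | ~float64
}

// Sum is written once and works for any Number element type; before
// Go 1.18 you would have written one copy per type, or generated code.
func Sum[T Number](xs []T) T {
	var total T
	for _, x := range xs {
		total += x
	}
	return total
}

func main() {
	fmt.Println(Sum([]int{1, 2, 3}))      // 6
	fmt.Println(Sum([]float64{1.5, 2.5})) // 4
}
```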
Those generics, do you know, Jared, if generics
means that you have to do it a certain way? Is it by force or is it just the availability of it that's
the controversial aspect of generics being added?
Yeah, it's new surface area, so you can just completely ignore it if you want to.
I think a lot of the concern at this point is not the feature
as implemented. It is not the feature as implemented.
It is how the community at large will use and potentially abuse the feature because of the excitement and the ramifications of that,
maybe not in the standard library, but in packages that people use
and popular things.
It might make it to where people abuse generics because they're so excited
that it exists, which I think is also the consensus around goroutines,
was that because Go made that whole deal so easy and nice to use,
people were using it everywhere.
And it ended up making Go programs more complicated and hard to maintain because of that.
So I think at this point, that's most of the reservations, I think.
The design of the feature and I think the performance implications,
which was also a concern, is like,
is this going to slow Go down quite a bit?
Because one of the things Go's famous for
is being extremely fast, even to compile.
And like, is this going to really reduce compile times
was a question.
I think now it's like, hey, are people going to abuse this to the point where all Go example code and libraries just have generics flung around everywhere? And we don't want that to be the case.
I think it's more of a cultural thing at this phase then, yeah. And time will tell, I guess, on that front.
Yeah. Well, we're talking about big changes to things. I wanted to loop back around to Unix a little bit, even though we're far afield
from it, because I have a question about the web. Now, you pre-exist the web, the World Wide Web.
First of all, I want to compare the web with Unix. But before we do that, when the web became a thing
back in the 90s, Tim Berners-Lee and the WWW, that whole deal, where did you stand on it? What
did you think? Were you an early adopter
or an early abandoner of the World Wide Web? Did you think there were other things that were better? Gopher? I don't know. What were your thoughts when it first came around? Was it going to be a passing fancy? Because a lot of people panned it, like, this thing is not going to take off. And it clearly has.
You know, it's sort of embarrassing to admit, I guess, but my first encounter with the web was, I was visiting Cornell. You know, I went, gave a talk in the computer science department there. And I was visiting somebody who I think was
actually a physicist. And he showed me this weird system that they had where you could kind of,
you know, type numbers and it would give you access to various physics literature. You could,
you know, get a copy of a paper or something like that. And this would have been
probably roughly the fall of 1992 or something like that. And I looked at it and said, basically,
so what? I'm not sure I phrased it that way for him, but it was like, and in hindsight,
if I had been smarter, I would own you guys. I would own everything.
No, I said, don't take my advice on what the future of anything is going to be
I just, I blew that one completely, sadly.
Well, it happens. I think the iPod was famously panned by, who was it, the creator of Slashdot, when the iPod was first announced. He had a now-famous quote.
Steve Ballmer also threw it down,
but they also had their competing product eventually.
Well, Steve Ballmer was laughing at the iPhone when it was announced.
The iPod, he said something like,
one gig of storage, smaller than a Nomad, lame,
or something was his quote.
And the iPod, of course, was the beginning
of Apple's big run in innovation.
So it happens to the best of us. And clearly.
I recall Bill Gates being on Letterman.
And Bill Gates was trying to describe what the web would be.
And this was in the 90s, in this initial phase.
And David Letterman was like, well, he's also, Bill Gates tends to be a punching bag to journalists or pundits or folks, from David Letterman's standpoint.
Comedian.
Yeah, exactly.
Yeah.
Comedians, sure.
I guess that's, yeah, he's probably more comedian than he is a pundit, although both sometimes.
You know, he was, he's like, what is this at symbol?
And just like sort of like making fun of Bill Gates.
And Bill Gates is like trying to describe, you know, if you watch it now, you're like, he was describing the future, and David Letterman was totally laughing at him. And I think we often don't get a chance to talk to someone, Brian, that has predated the web. And I think, you know, I love that aspect of you, that you're like, I don't want to tell the future. But just knowing, you know, when past meets future,
and your response to that moment is priceless to me.
So I appreciate that.
So when did you finally come around?
Because here we are, it's 2022.
We're all using the web right now as we record this.
It's an amazing web application, right, in our browsers.
Surely you may have panned it or thought it wasn't going to be big,
but at a certain point
there was adoption, and you probably, yeah, hopped on board?
No, actually, my first web experience, I was still at Bell Labs, and it was very early days. I've forgotten the date, but let's call it '95. Netscape had just appeared, in the guise of Mosaic, so maybe that's more like '93 or something like that.
And one of my colleagues, Eric Gross, had the idea we could take AT&T's.
Remember, AT&T provided phone service for the world at that,
or for the country at that point, much of it.
And it had this 800 number directory.
So if you wanted to know the number for United Airlines,
their 800 number, you could look it up in this thing.
And it was a paper book,
which was published every six months or something, just like old-fashioned phone directories.
And Eric said, gee, you know, maybe we could put the 800 number directory on the internet,
on the web or something like that. And so he and I and a couple other folks basically cobbled
together something which was just straight HTML with links and so on so that you could go to
this website, and it would give you the 800 number directory for AT&T. You know, I've forgotten the number, but it was probably modest millions of records. And so that was AT&T's actual first web service. And of course, nobody in the company knew what to make of it, and so we had a lot of flap getting it out. I mean, we had a prototype running in an hour
or two, as you can imagine, because it was trivial. And then it took us months to get it out of AT&T,
to make it visible on the outside. And the only thing that pushed AT&T over the edge
was a rumor that MCI, another company no longer with us, was going to release a web service of their own.
And AT&T wanted to have the credit for having the first web app
from a communications company or something.
So we got approval to put it out.
But it was kind of silly.
But I thought the web was a great thing right from the beginning.
What was the stack for that?
Can you recall?
Like, was there a database?
Was it just simply HTML?
Was it, you know, what was some of the hierarchy?
The original 800 number directory was just flat text.
You know, it was basically, here's a number, here's a name,
and a scattering of other things related to it.
It was literally flat text, just one big file.
I still have it.
Did you at least use awk to generate it or anything?
I don't remember what we used.
I could probably do it with a text editor because it wasn't that huge.
I still have it floating around somewhere to see what it was.
And it was, the other thing that was interesting about it,
it was just riddled with errors.
It was indescribably messy.
I mean, how many ways can you spell
Cincinnati? And the answer is 13. 13. 13. Wow. So part of the job, what we offered AT&T is,
hey, we could clean this data up. And nobody seemed to be very interested in that either.
It was like these totally different universes: the sort of old-line, let's call it telephone service kind of thing,
and these new people doing things with this new technology, the web.
And so it's not the Letterman effect that Adam was describing, but it's the same sort of, gee, this is brand new and it probably isn't going to do any good.
And so let's not do anything much about it.
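(For illustration only: a minimal sketch, not their actual code, of the kind of flat-text-to-HTML conversion Brian describes here. The tab-separated name/number record layout is an assumption; the real file's format isn't specified in the conversation.)

    import html
    import sys

    # Hypothetical record layout: "name<TAB>number", one per line.
    print("<html><body><h1>800 Number Directory</h1><ul>")
    for line in sys.stdin:
        line = line.strip()
        if not line or "\t" not in line:
            continue  # the real file was messy, so skip blanks and malformed records
        name, number = line.rsplit("\t", 1)
        # "Straight HTML": one list item per record, special characters escaped.
        print("<li>%s: %s</li>" % (html.escape(name.strip()), html.escape(number.strip())))
    print("</ul></body></html>")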
Yeah.
When presented with the future, it's often so novel that you can't understand what the
future is going to be, so you just shrug it off.
Right.
Yep.
Or for a while, we try to shove the present into it.
You know, it takes a while to have, that's why we talk about, you know, cloud natives
or web natives, people that grew up with the web.
They think about it in a different way than those of us who predated it and come to it and say, how can I apply my current perspectives
into this new thing, which generally produces some value.
But then there's the next generation or maybe a change in your own mind
to say, no, I'm going to think about it truly natively as a starting point
versus as a thing I'm coming to.
And that's usually where the creativity and the innovation take off, because you just think about it in a different way. And it's hard to shove
the present into the future. You know, you got to kind of build the future.
This episode is brought to you by our friends at FireHydrant. FireHydrant is the reliability platform for every developer.
Incidents impact everyone, not just SREs.
FireHydrant gives teams the tools to maintain service catalogs,
respond to incidents, communicate through status pages,
and learn with retrospectives.
What would normally be manual, error-prone tasks across the entire spectrum of responding to an incident can all be automated with FireHydrant.
FireHydrant gives you incident tooling to manage incidents of any type with any severity
with consistency.
You can declare and mitigate incidents all inside Slack.
Service catalogs allow service owners to improve operational maturity and document all your
deploys in your service catalog.
Incident analytics let you extract meaningful insights about your reliability across any facet of your incidents or the people who respond to them.
And at the heart of it all, incident run books,
they let you create custom automation rules
to convert manual tasks into automated, reliable,
repeatable sequences that run when you want.
Create Slack channels, Jira tickets, Zoom bridges
instantly after declaring an incident.
Now your processes can be consistent and automatic.
Try FireHydrant free for 14 days.
Get access to every feature.
No credit card required.
Get started at firehydrant.io.
Again, firehydrant.io.
And also by our friends at MongoDB, the makers of MongoDB Atlas, the multi-cloud application
data platform. MongoDB Atlas provides an integrated suite of data services centered around a cloud database
designed to accelerate and simplify how you build with data.
Ditch the columns and rows once and for all, and switch to the database loved by millions
of developers for its intuitive document data model and query API that maps to how you think and code.
When you're ready to launch, Atlas automatically layers on production-grade resilience, performance, and security features so you can confidently scale your app from sandbox to customer-facing application.
As a truly multi-cloud database, Atlas enables you to deploy your data across multiple regions on AWS,
Azure, and Google Cloud
simultaneously. You heard that right.
You can distribute your data across multiple
cloud providers at the same time
with a click of a button.
And the next step is try it today for
free. They have a free forever tier, so
you can prove to yourself and to your team
that the platform has everything you need.
Head to mongodb.com slash changelog. Again, mongodb.com slash changelog.
So when I think about platforms, Unix and its derivatives,
and the web, for my money, are like two of the greatest platforms
ever created in terms of just opportunity and
uncaptured value, like people actually building things that change lives, etc.
And I think there's some common things between the two. Of course, one's built upon the other, right?
And it seems like Linux on the desktop never became a thing, but the web and web servers and server-side code really made Linux, or didn't make Linux become a thing, but Linux is entrenched because it was a great operating system for running the server side of the web.
And so they're related and one built upon the other.
But in terms of Unix, whether it's in the philosophy or even in the implementation,
and the World Wide Web and its design and its philosophy,
do you see parallels?
Are there commonalities that we can look at and say,
these make for great platforms?
Yeah, that's a really interesting question.
And I think you're right.
I see some parallels that might even be instructive. I mean, fundamentally, it's the core simplicity of the thing. These are not complicated. They are simple. As we talked about earlier, the essence of Unix is a handful of ideas that work really well together. I mean, the hierarchical file system, the programmable shell, redirection,
not too many system calls. And interestingly, text is kind of the universal medium of exchange of information. Now you look at the web: what are, if you want to call them system calls, the kinds of things? There's HTTP, HTML, the URL. That's it. There is nothing else. Okay.
Yeah. It's got the internet as an infrastructure.
Oh, and everything that goes across the web is text. So that commonality there, I think,
is quite real. And, you know, Berners-Lee created that stuff, you know, kind of out of nothing,
but building on what was already there. And so he had a very clean, simple idea of what to do. Some of it was like HTML is basically a
dialect of SGML, which derives from GML further back in the past, but he simplified it and cleaned it up in a way that made it very useful
for this kind of application. So I think
there's actually quite a bit of parallel there.
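(To make the point concrete that everything going across the web is text: a minimal sketch, assuming only the Python standard library and network access, that speaks HTTP by hand over a socket. The request is a few lines of plain text, and the status line, headers, and HTML come back as plain text too; example.com is just a stand-in host.)

    import socket

    # Open a TCP connection and write the request by hand: no HTTP library.
    with socket.create_connection(("example.com", 80)) as s:
        s.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
        response = b""
        while True:
            chunk = s.recv(4096)
            if not chunk:
                break
            response += chunk

    # Status line, headers, and body are all just text.
    print(response.decode("utf-8", errors="replace")[:300])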
This is why I think going back to your book, you wrote the memoir on Unix, I think is so important.
That's why I'm going to put it on my next list to read because sometimes when you look back, there's such fruit there.
In another interview with Lex, you'd mentioned in the early Unix days how there was so much low-hanging fruit.
That's why there was a lot of things happening.
And I feel like with the web, even though we're deep into it, it's still the beginning in so many ways. And so to look back to Unix and what it's become through Linux and others, and just the underpinnings it is for all of this, even the web itself is built on top of it, I think would make sense for somebody that's looking to the future, because there's so much you could draw about the future from the past, despite what we had just said about how we can sometimes take our experience into the future and it be baggage. But I'm putting it on my list. I'm excited to read this, because I'm obviously a fan of Unix, but to see how it might paint the picture for the future is pretty interesting. And obviously the preserving of the knowledge, and just going back into the past and looking at what has made what we are today foundational, I think it's pretty interesting.
I think that, I mean, there's a really interesting idea there, which I see from time to time.
People have gotten used over the last, call it 20 years or more, to graphical interfaces. The idea
that you, you know, look at something on a screen, you click a button with a
mouse or that sort of thing. And underneath that, there are an awful lot of things you can do with
a command line interface. And I think in various fields in various areas, people just rediscover
the idea of a command line that you can use to abbreviate common things, to automate processes,
to do things without you having to poke the buttons all the time.
And I see that in any number of areas where, you know, gee, I could write this little program based on text to process text that comes from the internet or whatever and do my job more efficiently
or more effectively. And so I think there's probably low-hanging fruit, for example, in that
sort of thing. Pick your area.
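(A minimal sketch of that kind of little program, one hypothetical example among many: a Python filter that reads text on standard input and prints word frequencies, so it can sit in a pipeline next to other command-line tools.)

    import sys
    from collections import Counter

    # Tally every whitespace-separated word on standard input.
    counts = Counter(word for line in sys.stdin for word in line.split())

    # Print "count word" pairs, most frequent first.
    for word, n in counts.most_common():
        print(n, word)

(Used in a pipeline, something like: cat report.txt | python3 wordfreq.py | head, where the filenames are hypothetical.)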
That does assume, though, that the computer becomes or continues to be a paramount point of, I suppose, creation.
I think we're in this unique space, and I don't know much about the future because I haven't been there, but there's a lot of creators that don't even touch a computer.
Right, creators, in quotes, and they tend to be, you know, visual creators and things like that. But the only machine they use is their smartphone or maybe an iPad, and less of, like, say, a Linux machine or a MacBook or something like that. Yeah, I know that you use a MacBook Air to program. I'm not sure if that's still true or not, but you know, you're on a Mac these days even, so it's got Unix underpinnings in there.
I'm just curious how that plays out
because if the computer shrinks in terms of its usage,
do we still have access to the command line?
Can we still appreciate those original principles
that sort of drive things forward?
I think programmers and people in the software space
gravitate towards the computer
because that's where we have the most power.
But you see more and more people moving to things
and the command line tends to take a backseat
when handing out a tool to those operating systems.
Yeah, the iPad is a nice example of that in a way.
I mean, I have an iPad.
I turn it on maybe once every six months or something like that
because it's an utterly useless device: it only lets me do what Apple thought I wanted to do. And I don't know how to make it do most of the things, because it requires funny artificial, you know, wiggle your fingers while rubbing your elbow to make something happen. Whereas with a command line interface, and this is the old fogey speaking, but with a command line interface, well, I could type two or three characters and
I'd be done. But I don't use it, primarily because it is not programmable. It is a totally useless device to me. And a phone, I think, falls into that same category. And, you know, I use a phone
occasionally. I mean, I haven't turned mine on for several days. But, you know, I can't program it either, so it's not as interesting or fun.
Yeah. So I think this kind of ties together a couple of threads that we've been hitting around. One is, the next generation of creators are growing up with that phone and they're growing up with that iPad, and that's what they know and that's what they grew up with. And so they are mobile natives, so to speak.
And they're not super exposed to the possibilities
outside of that pane of glass.
Now, one thing I've noticed is that your most recent work
and what you teach now, this book you have,
Understanding the Digital World, it's not a programmer book.
It's a book for a lot of people.
It's a broader audience.
It's a different audience.
And I'm curious, as you've gotten older and more experienced,
it seems like your focus has shifted,
or you've changed your audience to a certain degree.
Who you're targeting to teach and to instruct and to influence
is not necessarily the programmers like us.
And I'm wondering where that happened.
If it was a conscious moment where you're like,
I got to teach regular people things too,
or if it's kind of this,
because the next generation may be not programming,
they may be on an iPad,
but you can influence them to say,
hey, did you know there's a whole world of possibility
that you're not experiencing
because you don't have a PC or a MacBook
with a command line?
Yeah, the book that you described, I wrote it actually for a course that I've been teaching
at Princeton for the last 20 odd years off and on. It's a course for people who are very
non-technical. There's nobody in it who's probably ever going to do computer science.
Yeah.
But they're growing up in a world where computers and communications are obviously pervasive, and that's changing the world extremely rapidly.
It's accelerating.
And I think it's important that anybody who thinks they're an educated person ought to know some of that stuff about how do computers and communications work and how do they affect people
and what can they do about things like privacy and security and defending themselves in various ways?
And you could see the effects everywhere.
I mean, think of the social media, mostly bad, occasional good, the advertising industry, cyber, whatever, all of these things.
So the kids in my class, the non-technical ones, they're going to be in positions of power and authority and influence to a degree which, let's say, the kids in computer science are much less likely to be, actually. And so wouldn't it be nice if these folks knew something, so that when they're running the world 20 or 30 years from now, they don't make silly mistakes about technology, or at least they are better able to assess what's going on and make better decisions? So that's the hope of the book.
I got into the whole thing kind of by accident. I spent a sabbatical, sort of, you know, one semester. I was still working at Bell Labs, but in 1996 I spent the fall semester at Harvard teaching CS50, which was this big introductory course for pretty much everybody who wanted to learn anything about computing. And so I did it as a visitor one semester.
And what I discovered in that class is there were lots of kids who were very capable at programming.
They're the ones who'd started programming when they were basically five years old. But there were
lots and lots of other people who had no insight whatsoever into computing and would probably never be in computing, but had to learn something about it.
It in some way satisfied a requirement.
And it was hard to have one course that would satisfy that broad population.
And so when I some years later wound up at Princeton, I thought, why don't I try and teach a course for the non-technical end of that broad spectrum of the kids who are history majors or English majors, that sort of thing.
And that's the genesis of the course.
So it's been a lot of fun.
I mean, it really is fun to try and explain, you know, the kinds of things that you and I would think are interesting and fun and important and all that stuff to people who come from very different backgrounds and may not appreciate why we think it's important or fun or relevant.
How do you impart that?
How do you bridge that gap in your classes?
Any techniques?
I think the way you do it is that there's a sort of a framework of stuff that I think
they should understand.
You know, what is computer hardware?
How do computers actually do their thing?
What is software?
What does that mean?
What's happening when you put an app on your phone?
And then the communication stuff, the internet and the web and all those.
So there's that technical substrate underneath it.
And then there's the, okay, but how does this show up in the real world?
What are the things that the real world is doing to you that are dependent on the way that that technology works. And so,
you know, advertising and what is called surveillance capitalism, I guess, is a fine example of that. I get the kids to start up their computers in class and open up a browser and count
the cookies. And the standard response is basically, oh my God, I can't count them because they've got thousands of cookies.
So you explain to them what's going on, how they're being tracked, how Facebook knows more about them than they do.
And, you know, after a while, they start to maybe remove some of those cookies or disable the ones that they don't need or that sort of thing. You can talk about cryptography, explain why government attempts to put backdoors into cryptography
are a desperately bad idea.
And you don't need to know the mathematics of it to realize that,
you know, this is not going to end well if you allow that sort of thing.
So getting them to understand the idea behind Moore's Law
and whatever might replace it, the fact that
things will continue to get smaller, cheaper, faster, better in unpredictable ways, and
that will continue to have an effect on their lives.
The Internet of Things is an example of that today, that there's all these little devices
watching you and talking to each other and telling the world about you in a way that
you may not want.
So it's a combination of actual technical content made at a level that I hope is accessible to them.
But then how does that relate to the world that you live in?
As we're talking about this, you mentioned your iPad and how unused it is, every six months. And I'm assuming you have a smartphone. I'm not sure if it's an iPhone or not, but your phone, how you maybe check it every couple of days.
The conclusion I kind of draw from this is almost a world of obedience and the possibility to be
rogue. And let me explain that because on an iOS device, let's say, and I'm not trying to say that Apple is being malicious with this activity, but the functionality of the device is definitely limiting.
You can't program it in itself.
You have to leave the device, enter a computer world with a command line, with the Unix underpinnings and the Linux backgrounds and the packages and this whole world of possibilities, to make the other thing work. And so, Jerod, to your point where there's this sort of mobile-native aspect, if you only ever stay in that mobile native, it's almost a world of obedience.
Like obey us and use the device as we see fit.
And Brian, you mentioned the cookies.
Well, try to find your cookies or count them on an iOS device without developer tools.
You can't.
So you don't know what's happening beneath the system.
So you accept it and you just sort of obey and use.
But in the world of a computer, a full-fledged computer where you can actually make it work and program it, you have way more control. On your network, you run a Pi-hole that checks for cookies or blocks certain URLs or disables those ad-tracking abilities. And so you have more control over your digital presence on that device. That's just interesting. I just thought about that as we're having this conversation. What do you think about that, Brian? What do you think about that parallel between obedience and, I don't know if I would call it rogue, but just maybe freedom?
Right. No, I think that's spot on in many ways,
because certainly the devices you get, and especially your iPhone, I don't have an iPhone,
it's kind of a walled garden. The idea is that you get in there and it's a very, very nice
environment and it does all kinds of things very smoothly, but you can't get out.
And in particular, no, you can't write code for your iPhone.
Even if you're a programmer, you have to stand on your head
to get code to run on more than your own personal phone.
And even there, it's probably restricted.
I haven't done that for years.
So, yeah, you're supposed to live within the confines of whatever that system is, and then they can do, in effect, whatever they want to you. And most people are not aware of it, and they're not aware that there might be something outside. And so the notion of freedom is kind of unclear if you don't know that you're actually kind of locked up inside this nice walled garden.
Yeah. So you mentioned now how you're teaching all these students who aren't CS students. Maybe they're going into law, maybe they're going into business or medical or these other industries, or, like, politics, for example. These will be the leaders of the next age, and they're not software people. What's ironic is today, the most influential,
powerful people of our day are software people. I mean, it's your Mark Zuckerberg's and your Jack
Dorsey's and this whole group of Silicon Valley entrepreneurs and software folk who have,
I don't know, by pure chance and luck or by just the motions of capitalism and the web,
a free platform, a free and permissionless platform,
they've kind of sucked all the air out of the room
to a large extent.
And so they have this power,
which I don't think any of them necessarily asked for,
but it kind of came upon them.
Maybe they desired it.
Who knows?
I'm not going to psychoanalyze these guys.
But it's just an interesting fact of history that that's where we stand. And now we have a next generation of
programmers, many of whom you will never teach, not because they're not in your Understanding the Digital World class, but because they're opting out of universities altogether. So many programmers today are going the boot camp route. They're going completely self-taught online, and they don't have a four-year degree, and they're never going to get a four-year degree, because it's too expensive, or they want to move fast and break things, or whatever it happens to be. Curious your take on this trend away from computer science degrees and universities and towards online learning, coding boot camps, and this kind of short circuit into the workforce.
Yeah, I, again, it's one of these things where I don't think I know enough
about it to have a really informed opinion. But there's absolutely the trend you see. I don't know. Boot camps were very much on my radar three, four years ago; I don't see as much of it now. And that may be because my radar is aimed somewhere else, and so maybe they're just as active. There's certainly enormous opportunities for online learning, although, empirically, people who start online things often tail away very, very quickly. The enrollments in many online courses decay exponentially after the first couple of lectures. But that isn't to say that it isn't a viable way to do things.
I think there are several things you get from going to,
let's say, a four-year college or something like that.
If you do that with focus on computer science,
you learn more than just how to write code.
You actually learn a bunch of other things that might be germane,
like, gee, what's a better way to do something?
You learn something about algorithms. You may have a better understanding of what's underneath the various pieces of your
stack. You may start to see, you know, Python, that's really sort of an interpreter, and it runs fast because underneath it, it's... So there's a lot of things that you might not see if you haven't encountered them in courses, places where you're sort of forced to try and understand them.
So it's not to say that one is better than the other, but there's maybe a different level of experience.
And, of course, another thing is the other friends, acquaintances, business associates, whatever, significant others, for your life going forward. And one of the advantages of
university is that you meet people who are not the same as you. And so, I mean, one of the things
that I like about that course that I teach is that I meet these people who are history majors.
Okay, I'm not a history major. I find it interesting. And so it's really, I think, very valuable and important to deal with people
whose view of the world and how they do things and what they find interesting and what turns them on
and all that is just different. And I think that broadening experience is something that you
probably would miss if you went straight into a boot camp
and then straight into doing a startup with five or six other people
who are exactly like you.
But different strokes for different folks.
I'm not saying that I'm the right way to do it either.
Sure. It's interesting to think about a world where this changes
because it reminds me of the process of making, let's say, tea, for example.
Tea, you steep, you immerse, you fully immerse something in something for a duration of time.
And espresso is not that. Obviously they're two completely different drinks, but they're both consumable liquids, basically; that's the similarity at least. With tea, you may steep it for five minutes, 10 minutes. Some tea takes a good 10 minutes, and it may take a high degree of temperature or a lower degree of temperature. Espresso takes a high degree of temperature, but with compression, and the time is condensed, and still out the other end,
You get this liquid, both consumable, both with caffeine, both with similar attributes to the consumer.
And that's what it reminds me of: you can get to the same place, similarly, with different paths.
So back to your different strokes for different folks.
I do agree with that.
I just wonder how we preserve, you know,
wisdom like yours and others who can be in an environment, like, you're not ever going to not teach at Princeton. Like, would you ever eject yourself from that environment, and people go to Brian Kernighan's website and subscribe directly to him, and he will teach you directly? Will you ever eject from that and go into the, basically, direct-to-consumer model? You're in a packaged-good scenario, where you package Brian up and you put him into a class. You're a packaged good, Brian.
Right. I mean, if you think about the analogies, right, like that's, you're not a direct-to-consumer
teacher.
Yeah.
How do we preserve this non-direct-to-consumer teacher area where there's still wisdom and
reasoning for it, this idea of steeping?
I know I kind of went way in the weeds of explaining that, but like, how do we preserve
that?
How do we keep the need for that in this future? I know you don't tell the future or hypothesize about the future much, but I'm sure you've got some ideas there.
Yeah, I don't know. I mean, it's clear that the steeping analogy, let's say, for your university, is one of many ways to achieve education, and not just technical, any kind of education. Sure. And I
think probably what you want is something that makes whatever pathway
is going to work for somebody readily available,
that they don't get cut out of it for financial reasons
or discriminatory reasons of any sort, that all these things,
what you'd like is this ideal of kind of equal opportunity
and people go through whatever process makes the
most sense for them. You've seen from time to time discussions about, for example, whether it's better
to go to college and get a degree in something or better to go into a trade, you know, like plumbing
or something like that. So, for example, you can be something like a plumber, get out of high school,
go through an apprenticeship, learn a trade, become very good at that, actually make a fair amount of money.
And for many people, that might be a viable alternative and perhaps even more satisfying than going off to, let's say, a four-year college.
Two-year college is some kind of intermediate position in that.
I think the main thing is to make it so that anybody can understand what the options are and find a path through where the
options make the most sense for them. And I suspect there are a lot of artificial barriers
that the country would be better off if those barriers could be reduced. And I don't know what
they are for different people. Certainly going to a private school like Princeton, an expensive
place, that costs money. And so there's a financial barrier to that.
Some places solve that with student grants. Other places, you have to take out loans and then you're
stuck with debt for some period of time afterwards. So I don't have a good solution. I think it is
important to make it possible for people to go in whatever direction seems to work best for them
and to have a clear idea of what the trade-offs involved are.
Well, I mentioned our tech oligarchs. People talk about late-stage capitalism. Maybe this is like
late-stage World Wide Web because there has been a consolidation of power and value capture. I think
there's still a lot of opportunity on the web. That being said, there is a group of people
far and wide who are trying to rethink, reinvent, change the web. They've dubbed it Web3.
There's a lot of particulars on this topic that we don't necessarily have time for,
or the interest, but this idea of decentralization, do you think there's a nugget there
that could fix some of our problems? Do you think it's a red herring or a grift?
What are your thoughts on the decentralized web and kind of rethinking the web somewhat fundamentally?
Yeah, I mean, Web3 strikes me as being just another buzzword.
And so I have to look it up to know what people mean by it.
I think the idea of decentralization in some ways, I wonder whether that's going back to the way it was in the good old days when the web was decentralized. Yes. When we didn't have these concentrations of power, the tech oligarchs that we have,
like, you know, well, let's say the folks from Google or Facebook or whatever.
I would say decentralization in that form would probably be quite a good thing.
Whether it requires a technical mechanism like blockchains, that just strikes me as kind of adding trendy things together to get something that's purportedly even more trendy. I just don't see that at all, period, leaving aside the environmental impact of the computing to make blockchain stuff work. So color me pretty skeptical. But, you know, maybe I could be convinced. Maybe it's another one of these things where the future was before me and I didn't see it.
Well, we want to be respectful of your time. We're getting to the end here. We're going to let you go. I have a few
real quick, quick hitter listener questions. If you don't mind, I grabbed three of our listener
questions that people submitted that I think you should be able to answer pretty quickly. This one comes from Saul. He says, do you still enjoy programming? And as a follow-up, what's a
tool that has been created in the last 10 years that you like? Yeah, well, yeah, I do enjoy
programming. It continues to be fun. I think the problem I have is that most of the programs I now
write tend to be quite small. They are often, you know, awk one-liners to do something, or maybe a Python program to clean up data in one format and convert it into some other format. Not doing anything that I would call
big or anything like that. Are there tools from the last
10 years that I use in that respect? I would say
on average, no. I think, and that's probably
just because the stuff you learn when you're young
sticks with you better. And so I have a lot of things in my head and at my fingertips that let me get things done, and kids in my class look at me and say, my God, that's just dinosaur-like. And so I am not an early follower; I'm a late adopter in many respects.
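(A minimal sketch of the kind of small cleanup-and-convert program he's describing, illustrative only: it assumes a hypothetical "city, number" input format with inconsistent spacing and capitalization, the sort of mess behind 13 spellings of Cincinnati, and emits tidied tab-separated output.)

    import sys

    # Normalize messy "city, number" records read from standard input.
    for line in sys.stdin:
        line = line.strip()
        if not line or "," not in line:
            continue  # skip blank lines and malformed records
        city, number = (field.strip() for field in line.split(",", 1))
        # Collapse runs of whitespace and normalize capitalization.
        city = " ".join(city.split()).title()
        print(city + "\t" + number)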
Early abandoner.
Early abandoner, exactly. I like that. Coin a phrase. All right, Chris Shaver asks,
are there other languages besides C that you admire and why?
Yeah, that's a neat question. You know, I suspect like most programmers, there's sort of half a dozen languages that I'm comfortable in,
but it's enough now that when I switch from one to another, I have to kind of get myself back.
It's like driving on the left, driving on the right kind of thing.
And then there's another half dozen that I have touched but don't remember enough about,
and then there's another half dozen or a dozen where I think, gee, wouldn't it be nice if I knew those, but I never will.
The language, leaving aside C,
certainly the two that I use most are awk and Python
just because that's getting the job done.
I have from time to time written Java programs,
and Java has its merits,
but sometimes it's like walking through glue to get stuff done.
Good analogy.
It's just there's a lot of syntax in Java.
I have tried functional languages, and there's some mental barrier.
I can't get over it.
I can write some kinds of things in minutes in C or any imperative language,
and if I'm stuck in a language like Haskell, it'll take weeks.
So it's hopeless.
Sorry, folks.
Fair enough.
Last one for you.
Will Furness asks, what are your favorite tech books that are not written by you?
And as a follow-up to that, what makes for a particularly good tech book?
The one that I come back to from time to time, and I'll pick on just the one, is The Mythical Man-Month by Fred Brooks. Because, I mean, it's a very old book at this point. It probably dates from, what, the early 70s or something like that. And there's an awful lot of it that is very dated, in the sense, I mean, for example, all the programmers are male and all of the clerical people are female. And that's just, sorry, that's wrong.
But there's a lot of insight in it as well in what's involved in trying to get software that works,
especially if it's a large-scale thing.
And so I find that interesting, and it's well-written.
It's gracefully written.
And so that's one of the books that I go back to from time to time.
The other one is not so much a book,
but it's on my radar right at the moment
because I've recommended it to a couple of kids
in the last 24 hours.
Dick Hamming, who was a colleague at Bell Labs for a while,
he was in the office next to me
in my first intern summer there.
He gave a talk in, I think about 1986,
called You and Your Research.
And it is basically how to make the
most of your career. And so it's, you know, kind of an hour talk. It was transcribed; you can get the transcription from the first outing and then watch him on video in the later outings, I guess. But I found it a very, very insightful way to make the most of what you've got, to have, you know, a good life, basically, in technology, but not exclusively technical stuff. And so it's not a book.
He wrote a book, which I have as well, a lot of which is in the sort of thing that's in that,
but it's kind of a distillation of a very effective career by a guy who really optimized
his own way through life. So those
would be two that I think are particularly interesting. Any particular chapter that stands out to you? I know when I reference books that stand out to me, like this book stands out to you,
is there any particular chapter that you reflect on or go back to that you can point someone to?
If they were like, I just want to check out one chapter, maybe two, or maybe it's a section.
Is there a particular chapter
that sort of grabs you
that you go back to often?
In other people's books?
The Mythical Man-Month in particular,
the one that you mentioned.
No, it's not a long book.
And so I probably would skim the whole thing.
And you'll find things that are just so dated,
like how do you organize the software
and the documentation for a big project?
This was before stuff was stored on disk, roughly speaking. So it's that dated.
But no, I wouldn't focus on any particular one. I haven't looked at that one for a couple of years
at this point. So not in my head in the same way.
Gotcha. Okay. Anything left unsaid, Brian? I know that you interview here and there
frequently, but not too frequently.
Is there often a question that you're like, man, I really wish they would ask me that?
Or anything in particular you love to talk about, but people don't often get to ask you, and you're just bummed out? Don't leave the show bummed out like that.
So if there's something, say it now.
If not, then we'll call it a show.
We don't want you bummed out, Brian. No, no.
I think you guys have covered it pretty well.
And there's a lot of interesting stuff.
And we've touched on a big part of it.
So, no, I think that will do for the moment.
We appreciate your journey to get here.
We appreciate you sharing your wisdom through books and through your teaching.
We just appreciate you showing up today.
So thank you so much, Brian.
My pleasure.
It's been an honor.
All right. That is our show for this week. If you dig this podcast, remember, we have other awesome pods like Ship It with Gerhard Lazu.
He recently had Kelsey Hightower on the show, and it's a banger.
Here's a taste.
But you have to understand the fundamentals, the boundaries between these concepts.
So I think as an industry, we've been pushing automate, automate, automate.
And we haven't been saying understand, understand, understand.
Because if you understand what you're doing, you can automate if you want to.
And sometimes I've seen teams where maybe
you don't need to automate as much anymore.
Listen to the entire conversation
at changelog.com slash shipit slash 44.
And if you're FOMOing over our extended episodes
like this one, the ad-free feed
and higher bitrate MP3s,
head to changelog.com slash plus plus.
It is better, as they say.
Thanks again to Fastly for partnering with us all these years,
to Breakmaster Cylinder for keeping our beats farm fresh and glitched to the max,
and to you for listening.
We appreciate you spending time with us each week.
That's all I got.
Thanks again.
We'll talk to you next time.