CoRecursive: Coding Stories - Chat: Why still 80 columns?
Episode Date: June 1, 2022. On June 1st, 2014, the following question showed up on Hacker News: "Why is 80 characters the standard limit for code width?" You probably know what happens next. People started to post their opinions in the comments and other people started to disagree. The post spread around the internet. So that is going to be today's show: let's answer this question. It's a question about traditions and teamwork, and how preexisting idioms shape us and help us, but sometimes restrict us.
Transcript
Hello, this is CoRecursive and I'm Adam Gordon Bell.
Do you remember 2014? Germany won the World Cup and the ice bucket challenge was sweeping Facebook.
People pouring buckets of ice over themselves to raise money for ALS.
But also in 2014, in my favorite hangout, Hacker News, an argument broke out.
I know, arguments on the internet, big surprise.
But on this day, June 1st, 2014,
a question from user Fred Lee
on the software engineering stack exchange
became the number one post on Hacker News.
So you probably know what happens next.
People started to post their opinions in the comments
and other people started to disagree. The post spread around the internet. It got tweeted and
retweeted. Eventually the bloggers joined in linking to it and posting their own takes.
By the time the discussion had cooled off, 630 pages around the web linked to this question. Do you want to know what it was? It was not
tabs versus spaces. It wasn't Vim versus Emacs or Mac versus Linux versus Windows. It was this.
Why is 80 characters the standard limit for code width? Why 80? Why not 79 or 81 or even 100? So that is going to be today's show.
Let's answer this question.
Not just why is it 80, but should it be 80?
And to share my biases right now going into this,
in my VS Code, I have a line drawn at 100 characters.
And, you know, this line is lava.
Nothing can touch it or I need to break up the line.
But should you have that line at all? And should it be at 80 characters? It's a question about traditions and teamwork and how pre-existing idioms and the culture of software development,
how it shapes us and helps us, but sometimes restricts us. Yeah, it's all contained in this one question.
But I'm getting way ahead of myself here.
I should introduce my guests.
Joining me, I have my Apple-hating neighbor, Don McKay,
and my smarter-than-us-all,
don't-ask-her-about-elliptic-curves friend, Crystal Mon.
Say hello, you two.
Hey, how's it going?
Nice to be back. Yeah, I don't hate Apple. I think you've invited me onto your podcast because you missed the arguments we used to get into.
So before we discuss where this 80 character limit comes from, I'd like to just talk about whether it's good or bad, so you know what we're talking about, right? Some people don't want the lines of code that you write, or comments or whatever, to be wider than 80 characters. What do you guys think?
80 characters for me personally is fine,
but I prefer more characters just because widescreen monitors have become a lot more
common now. But what if you're one of those programmers who codes on their phones?
I do not do that.
There's an accessibility type argument you're saying, Crystal.
Yeah.
You can't just be like,
oh, of course every developer has a giant 4K.
But they should.
I've been called out on this before.
This person who joined the team was using Vim and they had a very specific setup. And we just didn't enforce these code widths, but then all of a sudden he would have trouble scrolling over to see these long lines.
How long were your lines there, Adam? Like, were they excessive, or did they just not fit his smaller format?
The thing about wide code is it's not constantly excessive, right? It's just like, oh, there's the occasional line, and I think this would just look better if this if, you know, went way, way over, right?
It will never look better if your if just goes way, way over.
So on Hacker News, this debate shows up and some of the people were less afraid
to tell us what they think. In my experience, there's a very strong correlation between good
developers and short lines. Bad devs don't mind if some lines are 150 characters long and require
horizontal scrolling. They also don't care much about consistent naming of symbols or having correctly indented code. There are very strong reasons for keeping your lines short, your naming consistent, and your indentation in check. But at the end of the day, you either get it or you don't.
In other words, he's like, 80 characters are correct and good devs know this innately. They were born knowing it.
And I guess he's also saying, like,
this is like a key signal to see whether you care, right?
If I look at a code base,
and they don't have 80 characters as a limit,
then I know they don't care.
Somebody else could say, like, oh, I work in Java Spring,
and some of the variables are 80 characters long, right?
Like, if I have an 80-character variable name, like, how do I make any changes to it?
So what do you do then? And then someone else mentioned if you're in C sharp or in Java, you open the package or a namespace and then you open the class and then you open the function
and you have an if inside of that. If each of those is four spaces, you're already
four times four, 16. You're already 16.
That's a good point.
Yeah.
You're already like way in, right?
So you're already losing all that indent,
which is why user Mantrax5,
awesome name, by the way, says,
I use 120, plenty of space to fit two code windows
side by side and one file browser.
With HD screens and the verbose languages we use these
days, it's a bit silly to try to stick to 80 characters. Another user, Raverbashing, has said
it's stronger than that. Just confirming that 80 characters is idiotic and an arbitrary standard
propagated by nothing much more than cargo culting. Oh, but it fits my screen. It fits two by two on
my screen. Well, change your font size then.
This argument just goes on and on.
Any place that developers are gathering
or talking about their code,
somebody says it's absolutely this.
Somebody says this is dumb.
We're going to go deep on this topic.
Have you ever heard of Chesterton's fence?
No, random.
Okay, perfect.
So there was this guy named
G.K. Chesterton.
He's like a long dead Englishman.
He was like a contemporary of
C.S. Lewis. You know, C.S. Lewis was like
the Narnia books.
So they were sort of like the
influencers of their time,
except like being England in the
1800s. Instead of posting hot takes
on Twitter or writing blogs on Substack,
they would like write letters back and forth.
You know, I disagree, da, da, da, da.
Like it's just the same as Twitter basically, right?
Just it took a lot longer, right?
So Chesterton's fence is this argument that he has
that was later called that.
Chesterton says in some essay,
imagine there's a fence blocking a path
and the fence is in people's way. The more modern type of reformer goes up to the fence and says,
I don't see the use of this fence. Let's clear it away. To which I reply, if you don't see the
use of it, I certainly won't let you clear it away. Go away and think. And then when you come
back, tell me that you do see the use of it, and I may allow you to destroy it.
It's kind of a strange way to word things, but it makes sense.
He's saying you've got to figure out why people did something a certain way before you stop doing it.
If you think the 80-character rule is stupid and you say, let's stop following it in our code base,
Chesterton would say, well, find out where the rule comes from, right?
And then you can decide on the basis of that,
whether that rule still makes sense.
I assume in the case of the fence,
you wouldn't want to remove it
and then find out like,
oh, it was blocking an area
where the ground is unstable.
Because there might be an important reason for this rule.
Maybe your compiler just stops reading things
at the 80th character.
I don't know.
And Chesterton says, go find out first.
And so, Don, you know, maybe you would blame Apple.
I don't know.
You're trying to get, like, the Apple fans to, like, come after me.
So what is the equivalent of, like, being blocked back in the day?
Like, what is the equivalent?
Oh, like, how do you get canceled in the world of, like,
writing essays and letters back and forth?
Oh, that's a good question.
I mean, truthfully, they were a lot more conservative back then.
Maybe you would get drummed out of the country.
They'd be like, this person's views are incorrect.
Lord Byron, who was like a poet and influencer.
Influencer.
Yeah, he was like whatever, English influencer.
And he would read poetry out loud and like women would like...
Swoon.
Yeah, they would swoon.
Anyways, he was kicked out of England for, I believe, I think he was gay.
Oh, wow.
Things were much less understanding back then.
Yeah, so 80 columns comes originally from IBM punch cards. And I'm sure at least one person
listening already knew this because yeah, the Stack Overflow question dates back to 2014. And
as I said, it's been shared a lot, but it's crazy to think about to me, if you can imagine computers
existing before computer monitors or keyboards existed, like on a factual level, I know that's
true, but it sounds
super strange, right? Like all my computer interactions have been mediated by keyboards
and monitors. I mean, sometimes large CRT monitors, but still, it's just a weird thing to think about.
So before they existed, you had to input all your data or code into computers using punch cards.
And punch cards, like IBM punch cards looked kind of like recipe stock cards,
but twice as wide,
like kind of the size of a page of a paperback book
if you turned it sideways.
And the way the programming with punch cards works
is that each punch card equals one line of code
in your program.
And a standard punch card is a grid
where you have 12 rows down and 80 columns
across. And so that's where the 80 character limit comes from. You would write your program out like
in longhand and paper, and then you would punch it into the cards. And so if you went over 80
characters, you were dead. There was no more room on the punch card. This is like a very hard rule. We've kind of appeased
Chesterton and his fence now, right?
We know where the 80 characters
comes from.
It clearly comes because
you couldn't fit anymore
on a punch card.
What does that mean?
So we can remove the fence,
I guess.
We're definitely not
using punch cards now.
So the purpose of the limit
seems to be gone, right?
I mean, it's pretty restrictive if you're working in an environment that anything over 80 kind of breaks your whole system, right? Maybe be more adaptive. I don't
know. Shots fired. It's like that one guy was saying, right? Like reduce the font size. We
have so many tools and options available now, right? That we don't have to be locked into like
one specific convention. But I feel like not everybody has those tools.
Yeah, like we do have standards.
Like I think our maximum line length is 240.
But you can't have crazy Atom lines.
I think 240 is a crazy Atom line.
This reminded me, one of my mentors, he was telling me that they actually had to use punch cards to do their homework.
And so they're kind of grouchy because they're like, oh, kids are spoiled with your Ctrl-Z and your IDEs and stuff. Yeah, I don't know, that seems to be like a common human thing, right? Like, I suffered, so you have to suffer. So the 80 character limit may in fact be a leftover form of hazing.
Yes. So I don't want to give up on 80 characters so soon just because we figured out
where it came from. But punch cards
is like a super interesting thing to like take a little bit of detour. A lot of stuff that we
do in computing now actually came from punch cards, but you just don't really notice it.
So the delete key is an example. Do you guys know how the delete key is sent to your computer?
No. If you imagine your punch card, that's like 80 columns by 12 rows, right?
So each column is going to be a single character of your Fortran or COBOL program or whatever.
If you want to type in the letter a, the character for lowercase a in ASCII is 97.
Punch cards work the same way.
That 97, you convert it to binary and it gives you 1100001. And so you would punch
the first hole, the second hole, and then a bunch of blanks and then another one. If these old main
frames spoke ASCII, which they didn't, but you can imagine this would take a long time. So they had
something called a key punch. Maybe you saw that, Crystal, at the Computer History Museum. I don't
know. The key punch is like a typewriter, but instead of paper, it took in these punch cards.
And when you typed the letter a,
it would punch out the holes in the current column
that were needed for that.
So you wouldn't have to do this kind of lookup, right?
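If you want to see that lookup for yourself, here is a rough sketch in TypeScript; the hole-punching part is only an analogy, since the real cards used EBCDIC and twelve punch positions rather than raw ASCII bits:

    // The lookup a keypunch saved you from doing by hand (illustrative only).
    const char = "a";
    const code = char.charCodeAt(0);   // 97, the ASCII code for lowercase "a"
    const bits = code.toString(2);     // "1100001", seven bits

    // Read the bits as hole positions: punch where there's a 1, leave a blank for a 0.
    const holes = bits.split("").map((b) => (b === "1" ? "punch" : "blank"));
    console.log(code, bits, holes);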
This relates to delete keys, I swear.
Like we're going to get somewhere.
So they didn't use ASCII, which is seven bits.
They used eight bits.
So it's basically like eight of the holes for each letter.
So this became the byte.
The byte is eight bits.
And this became the unit of computing
because before that time,
computers varied in a lot of different ways.
Like each new computer might be different.
Some might be eight bit, some might be 12 bit.
And four bits is a nibble.
This caused them to standardize on a byte
because that was how you could represent a character of text.
Each column on a punch card became a byte.
If you were inputting text, a punch card gave you 80 bytes.
But here's what's wild, right?
That's still how we do things today.
We went from a time when this was undecided.
There was 12-bit, 8-bit nibbles.
And now everybody uses bytes, right? Which, if you think about it, is a very strange way to describe all space for
everything. That's just weird, right? A gigabyte is a billion text characters. If you were to put
that onto punch cards, a gigabyte would be 12,500,000 punch cards. Wow. That's a lot of cards.
Do you think they would get one person
to punch all those out?
Like imagine it's just like a JPEG, like a meme.
It's just a meme.
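For the record, the math there checks out; a one-line sketch:

    // One punch card holds one 80-character line, so a gigabyte of text is:
    const bytesPerCard = 80;
    const gigabyte = 1_000_000_000;          // a billion text characters
    console.log(gigabyte / bytesPerCard);    // 12500000 cards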
So I promised this would all relate to the delete key.
The 8-bits that you used on a punch card
was called the Extended Binary Coded Decimal Interchange Code.
That's really easy to say.
EBCDIC?
If you make a mistake in it,
you typed a typo on your card, right?
You hit F when you meant A.
You can't fill back in holes.
Once you've punched holes...
Yeah, that's true.
The only thing that they could do
was punch out all the other remaining holes.
So if you accidentally do F instead of A,
you just punch out all the rest
of the holes. In this 8-bit mapping, all eight bits being ones was called eight ones, which I suppose is a good name for it. Or in ASCII, where it's only seven bits, flipping all seven bits on is the highest value in ASCII. And that's the delete key. The way delete keys work today, when I press the button on my keyboard that's delete, it sends the value 127, like it sends seven bits all flipped to one. And the reason for that was that the only way to erase something on a punch card was to punch out all the remaining holes. The ASCII value of delete is based on having a way to delete a column.
And it's even the same in like Unicode. So if you're sending people like whatever taco emojis
or I don't know what the right emojis people are sending. If you want to delete that, you're
sending the 127 value. I mean, it all dates back to punch cards. And it's all just because you
can't unpunch a hole in a card. Whoa, that's cool.
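You can check the delete-is-all-ones story in any language that exposes character codes; a quick TypeScript sketch:

    // DEL is ASCII 127: all seven bits set, the old "punch out every remaining hole" value.
    const del = String.fromCharCode(127);
    console.log(del.charCodeAt(0));                // 127
    console.log(del.charCodeAt(0).toString(2));    // "1111111", seven ones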
Here's how it relates to our original 80 character thing. We don't need
delete to be all the holes punched anymore. Like we didn't need that in Unicode with taco emojis
and hearts and troll faces. Like we don't actually care about punching holes. But there was a lot of
things that assume that delete is like seven ones. So the original reason, like Chesterton's fence, is long gone, but it led to other reasons. I don't know, do you guys buy it?
I went to the Computer History Museum. It's really cool. You know, we think about 80 characters and that being tradition, but even back then they were trying to think about different ways of representing that would be more like the way we as humans process information. And we kind of accept these things today, you know, this is how a computer is, right? Yeah. And so there's like more of this focus on incremental change these days, and kind of accepting a lot of things. And when you go into the Computer History Museum, what's really cool is that they were like radically redesigning things that we kind of take for granted today.
It's just like wild how people were just like, oh, I'm going to use this completely different thing and call it a computer. And yeah, I wish you would do like more of that because, you know, it's like pushing limits, you know?
I mean, I guess the big question here is, because something's a tradition, should you throw it out and rethink it, or should you persist?
I think it's situational. Like if you have one of those reasons where you need it to be 80 characters, then obviously you can't. But if you don't have those restrictions, why not?
Let's muddy the waters here. I've always wondered why the same people who say, like, oh, variable names should be long and descriptive, then when they want to count to five, they go and they write out a for loop and they write for i equals one, i less than five, i plus plus. I mean, that's... i is not descriptive, right? Like that should be like foo, counter, blah, blah, blah. So why do they choose i?
And if you push them, then they'll say like,
no, I is for incrementing.
You know, incrementing I is fine.
It's an adjacency matrix or something.
And then if you push them further,
they're like, well, J is fine too.
K is okay.
Because, well, the way they teach computing too
is kind of like, you're supposed to just follow the things, you know? There's so much about the way we think about computer science, right? There are all these kind of things that are traditional ways of coding that everybody goes through, and we don't really ask questions about why we're doing things a particular way, because you're just learning. You're not supposed to ask questions.
Just do the thing.
Learn how to write a for loop.
It doesn't make sense, but this is how we code.
I remember when I was in school learning how to program and doing loops.
And yeah, it was very much like you have to write it like this.
If you want to do a for loop, I is the thing that you use, the variable you use to maintain the count.
But at some point, I'm like, I found out that you don't have to name it i.
Blew your mind, right?
Like, I didn't even know, right? I went through most of my school days just doing the i, and then eventually one day I found out, I'm like, wait, it still works if you don't have it named i. Like, there's nothing in the actual language itself that says that you have to specifically call it i. And I'm like, well, why am I naming it i? Like I could have
been naming it something else this whole time. I was very like outraged at the time.
Yeah, exactly. Eventually I learned this rule that's much better than "use descriptive variable names," which somebody told me, which was that names should be equal to their scope,
which makes a little bit of sense. It implies if you have a little for loop,
you can call it I because the scope is very small.
But that doesn't explain why it should be I.
Why not X or A?
Yeah, foo or bar.
Yeah, right?
Foo, bar, and baz, those are more things
that we just carry forward without questioning.
But they're idioms, right?
These are programming idioms.
And as you said, you learn them.
People wrote for loops like this in C and then move forward from there and people just keep following in their footsteps. I always think that like people writing books should kind
of, I wish that they would include those kinds of things in there. It makes it so much more
interesting, right? If they can say, here's why, but so I know why it's an I now. Of course.
So my assumption, because I don't know the answer to why it's I,
but my assumption was always that it stood for index.
Yeah, me too.
And like J and K were just because, well, we already have index and it's I,
so I guess the next index will have to be J.
Like I always just assumed it was one of those kind of things.
Okay, so you're not far off, right?
A lot of punch card programming
and mainframe programming that followed,
maybe not using punch cards, but just on mainframes.
It was using Fortran and COBOL.
And if you have 80 characters or even less per line,
you want to be concise.
So Fortran came up with ways to kind of make each line concise
because you only had so much width.
One of these things was that they assumed that all variables would be real numbers.
So like 2.3 or 5.6.
So you didn't have to declare them as reals.
They would just be real.
Except for I, J, K, L, M, and N.
Those were always integers.
You didn't have to declare them.
I is just an integer.
Oh, for integer.
So if you needed to do a loop,
you would just grab an I
because that is the first integer that's available.
So they just set aside six values to be integers.
And in fact, I believe it's any variables
that start with those.
So if you create a variable that starts with I,
it's going to be an integer.
And then people just want to be short
because they only had 80 characters.
So they're like, well, why would I type more than just I?
And then once I use that,
the next shortest integer I can use is J and then K.
So that's why.
The reason for using I dates back to Fortran.
And it was just the first integer value available.
And this whole thing is the root of what I guess must be
one of the first
programming dad jokes, which Fortran people would say: GOD is REAL unless declared as an INTEGER.
Oh no.
Because the variable GOD would be considered a real number unless you said it was an integer.
So now you know that the i is like somewhat arbitrary. But just because you know it's
arbitrary and no longer makes sense
because I'm not programming in Fortran,
doesn't mean I'm going to start going for
for atom equals one, atom plus plus,
well, atom less than five, right?
You could name it whatever you're looping through.
You know, row or user or index or something, right?
You could actually name it whatever you're looping through
and then it makes sense.
Remember, you have to write things
for yourself three months from now
when you have no idea what this code is.
Let's say I'm doing a transpose
where the element at i, j
becomes the element at j, i.
He's making elaborate hand actions.
I can call, I can do like for row
and then for column.
I'm not sure it's better, right?
I know the I and J,
like that's just burnt into my head. It's not, I and J is only better because everybody knows
this idiom, right? Like it's burnt in. It's cross disciplinary too, right? If you were to talk to
like a quantum mechanics physicist or like graph theory person, like they'd probably use I and J
too. Like everybody thinks about like vectors as like i and j.
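For what it's worth, here is the transpose both ways as a small TypeScript sketch, so you can judge whether the named indices actually read better:

    // The classic i/j version: the element at (i, j) becomes the element at (j, i).
    function transpose(m: number[][]): number[][] {
      const out: number[][] = m[0].map(() => new Array(m.length));
      for (let i = 0; i < m.length; i++) {
        for (let j = 0; j < m[i].length; j++) {
          out[j][i] = m[i][j];
        }
      }
      return out;
    }

    // The "descriptive" version just renames i/j to row/col, which is arguably no clearer:
    // for (let row = 0; row < m.length; row++)
    //   for (let col = 0; col < m[row].length; col++)
    //     out[col][row] = m[row][col];

    console.log(transpose([[1, 2, 3], [4, 5, 6]])); // [[1, 4], [2, 5], [3, 6]]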
Yeah, it is arbitrary, but it's actually okay.
Like Don's not buying it, but.
No, I think in the context that you described,
it makes sense.
If I was doing it and I was working
with like an actual collection of something
that's meaningful, right?
Like then I would probably name it that thing
then rather than i or j.
And then also Don's like, I work in Scala, I don't loop, I just map over something.
Yes, exactly. And I'm not gonna name it i, right? So we started with 80 columns, right? But really what I'm trying to get at is the fact that arbitrary things become ingrained and then we forget why they're ingrained.
But we keep doing them because like there's like a value to uniformity and shared understanding or something like that.
So it brings me to counting.
OK, so I'm going to I'm going to point at things.
You guys, you guys count.
I don't count out loud.
Zero.
One.
Are we starting at zero or one?
Two.
Three.
Four.
Five.
All right.
Where did you get five pens from?
You have five pens floating around on your desk?
I do.
One of these is a highlighter.
I don't know if that's helpful.
So you did start counting at zero at first.
My professor says if you're not a psycho, you start counting at one.
Yeah, because you have one thing, right?
Like you're counting the number of things.
You're not counting like an array.
Yeah, but it's super weird, right?
Is it weird that like if you hold up a pen, I'm like, that's one pen.
No, that I think is normal.
What I think is strange is that when I start counting elements like in an array
that I start counting them at zero
right because the index starts at zero
but it doesn't have to
yeah I mean if you're in Lua or MATLAB
or some old version of Visual Basic
it might start at one
but like in almost every programming language right
you start counting at zero
so that's strange right
like why is that
because computers are evil.
If you're a professional
computer programmer,
you just know, yeah,
if you want to get the index
into an array,
you start at zero, right?
And you probably don't know why.
But if you're a C programmer
or like a C++ programmer
or something,
you probably know why
there's zero base indexes
because it makes sense there.
But it has like
strange repercussions in JavaScript. This is the weirdest example I could find, right? So in
JavaScript, like it's now June, which is the sixth month. But if I go into my developer console and
I do like new date dot get month, what value does it return? It returns five. They count the months of the year. January is
obviously zero. And then February is one. Oh no. It feels like they just copied because,
hey, the C guy is wearing polka dots. We should wear polka dots too.
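You can check it in any browser console; a quick sketch of the gotcha:

    // Months are zero-based in JavaScript's Date API, so June comes back as 5.
    const june = new Date(2022, 5, 1);      // year, monthIndex, day: this IS June 1, 2022
    console.log(june.getMonth());           // 5
    console.log(june.getDate());            // 1, because days of the month are one-based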
It's the same as the C people copying the Fortran people to use I for their for loop.
You expect as a programmer that things will start at zero.
That's just the way things work.
It probably has to do with memory, right?
Say that again.
So it would have to do with memory
because an array would be stored in memory.
So if you were going to be iterating through an array,
you were really just iterating through memory.
So you'd have to start at zero
because that would be the first block of memory. The short form way to communicate this is to say
that the index is an offset. So you have like a pointer to where your stuff is in memory.
And the index is actually how far you move from that. And the first element you don't want to
offset. So it's zero. But like, that's super weird that somebody in JavaScript,
where there's no pointers and no access to raw continuous memory,
that they have to know this.
Yeah, especially for like a FANG leet code interview.
Yeah.
It makes sense if you understand this history.
If you have this array like that,
and you go to write your for i for loop,
then you do for i equals zero.
You start at zero because you know that you want to get that first element with no offsets.
And then so once you've started counting at zero, you know, you start counting at zero everywhere.
And so people who worked in languages with pointers, it's obvious to them why this is the case, because they understand how memory works.
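To make that offset idea concrete, here is a little sketch, not real pointers, just modeling memory as a flat array the way a C programmer would picture it (the addresses and names here are made up):

    // Pretend this Uint8Array is RAM, and that our array of bytes starts at some base address.
    const memory = new Uint8Array(64);
    const base = 16;                         // hypothetical address where the array begins

    function elementAt(index: number): number {
      // The index is just an offset from the base: element 0 lives at the base, zero bytes past it.
      return memory[base + index];
    }

    memory[base + 0] = 42;                   // the first element
    memory[base + 1] = 43;                   // the second element, one byte past the base
    console.log(elementAt(0), elementAt(1)); // 42 43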
And then they move on to work in other programming languages. And of course, you know, they probably build the compilers and runtimes for these languages, and so they just carry this idea forward, and so then everybody does it this way. And because humans are pattern-based, if you made a language that looks vaguely like C with curly braces and stuff, the first thing an experienced programmer is going to do is type out for i equals zero, blah, blah, blah.
If that doesn't work, they're going to be like, this shit is broken.
What's wrong with your language, right?
Why don't you count it zero like normal people?
Meanwhile, we've all just kind of got Stockholm syndrome,
not realizing that that's not natural to somebody who doesn't understand pointers
to think that you would start counting at zero.
But people get used to new things.
I mean, like basically Don's not-favorite company in the whole world. They change things all the time, like the USB-C and the Lightning port or whatever stuff.
That's out of corporate greed.
They want to have a proprietary interface
so that it's proprietary
or else they would use an open standard.
Well, that's interesting. I was just thinking about, like, they use Ada a lot for spacecraft stuff, and especially for those sectors you can't really radically redesign things because they're kind of specced so many years ahead of time. So when you're in the JavaScript slash startup world or whatever, you can do these cutting edge experimental things. But there's so many industries that use software that we don't traditionally think of as tech that are tech adjacent, like, you know, oil, mining, all that kind of stuff. And they kind of rely on convention and being conservative, like they want things to stay the same. The track record is the thing, right?
Because it's like when you evolve,
you slightly change something
only with like very specific purposes.
It's a lot safer.
Like you're not going to find
some 10x better way to program
by slightly improving on what exists.
But there's a huge risk
in changing all the things.
Like you can go into your
PR on GitHub and explain, listen, 80 columns is based on IBM punch cards. They're gone. So I'm not using that, right? For i, the i comes from Fortran, gone. Everything starts with for atom.
I'm not going to count from zero. Counting from zero comes from pointers. We don't have any
pointers. So I'm going to start counting at seven. Nobody will understand what's going on. It would be a very revolutionary approach to throw out a lot
of the old, some of which is bad, but like a lot of the stuff is still around because it's been
validated and it works. So this whole thing makes me think of houses. There are people who complain
that they don't build houses as well as they used to. There's one house in Peterborough called like Huntington House.
And I guarantee like if I walk by it with my friend Sam,
whose house is old, by the way, so he's not unbiased in this,
but he'll say like, oh, I bet you your house won't last as long as this house,
which is like from the 1800s.
Well, it's probably true.
But it's also the only house of that age that's still
around. There's lots of like crappy build houses from the 1800s that aren't around.
And like, who knows how much modernization renovations have been done to it over the
years. Like you've had, I can't imagine that they've got all knob and tube wiring and
galvanized pipes and like they would have had to do a whole bunch of restoration work on the inside that you don't see. Yeah. That house in Peterborough, it still exists, but lots of other
houses didn't. And there's a million things about punch card programming that they threw out,
like almost everything, almost all of it is gone. The things that are left behind out of that,
they're the things that work. The reason that the people in C kept the 80 character
limit wasn't to do with punch cards. Like originally, yeah, they had small terminals
that only supported 80 characters, but also people aren't very good at reading super wide lines.
That's why magazines have columns. The 80 started arbitrary, but the fact that it stayed means
something. If all of these people before you saw that it wasn't a problem
and decided not to throw it out, that has weight.
We threw out almost everything about how Fortran declares variables,
unless you're working in Fortran.
But we left that I because like, whatever, it makes sense.
And mathematicians already use I and you probably took a math class and had vectors.
It was arbitrary.
But the fact that we're still using it
means that it might have been a useful standard.
But i is imaginary in math.
So you're always looping through imaginary numbers.
To bring it all the way back,
the fact that some communities have kept this limit of 80 characters,
it doesn't mean that they're dumb and stuck in the punch card era, right?
It means that probably there were other benefits, the chief of which being everybody on the team had the same width. If you set up some complicated thing, I have my monitor sideways and Don has three columns on his screen, like, we all know, as long as we have room for 80, whatever weird format we want works, right?
Well, I still think it's important to kind of push back. So my kind of worry generally
is that people get so stuck into thinking that this is the only way because nobody's kind of
pushing back against that. When people are learning programming, maybe they should have like the I
telling people like I is an integer just so that people don't get used to doing the thing where
they repeat what everybody says for its own sake, but understanding
the trade-offs and with an understanding of why certain choices are made. I would be interested
to know, like, anybody who ever made a programming language that, like, deliberately tried to push
the limits of the whole, like, 80 character limit, that would be interesting. I propose a new language in which in every file,
the limit starts at the top line.
You can be 240 characters long and it decreases by one every line.
So that the 240th line,
all you can put is a closing brace.
Before that you get two.
Anyway, but Crystal, you know,
being more of the academic bent is like, we need
to experiment more. Like, let's not be constrained here, folks. And Don and I, who probably spend a
lot more time just working on a team, we're like more concerned with like the team dynamics. And
I think that that breakdown is probably about right. You shouldn't at hand throw aside these
old traditions. You should think carefully about it. This is a weird position for me to take. I feel like I've changed in some way.
Like you're different, man. Like you guys are going to give me a talk. You've changed. We don't
like it. Mass unsubscribe. I feel like in this case, you can innovate within constraints, right?
Yeah, but you work at some organizations where they're like too square and it's just like not fun.
Like I feel like the whole point of people who became interested in software engineering is like, you know, maybe you're programming a game or like artwork or like architecture or something.
And when you kind of take all of that away and it just becomes a whole follow-the-rules thing, it can get pretty boring and pretty depressing pretty quickly.
There should still be that weirdness budget, a little bit of room for play.
So I used to do a lot of Scala development, but now I'm working in Go. One of Go's innovations
is Go format. Go has a command and it formats your code to a very specific standard.
And it's basically the standard for all Go code.
This is what Go code looks like.
There's no like arguments, right?
There's no tabs versus spaces.
Go format picked tabs.
Controversial choice, but that's what it picked, right?
So there's no arguments.
I bet you there's people complaining about it though. So this idea, I think, spread to other programming communities.
Stop fighting about code layout, whatever.
Just choose something.
There's value in the uniformity.
When I was on a Scala team, putting code formatters in was a big boost
because we could stop just arguing about it.
Go, they threw their hat down, I guess, on standardization.
But they refused to tackle line length.
Line length, we can't touch it.
I think they're wrong.
Like they took a stance on tabs versus spaces, right?
But they were afraid to touch line length,
but they should have.
So Rust took a stance on this.
So Rust has Rust format
and they enforce a line length of 99 characters.
They said 99 characters, it is.
Why 99?
Because they thought 80 was too small.
If you set your IDE so it has columns of code
that are 100 wide each, you can always display 99.
And then if you have to look at a diff
with like the extra pluses and minuses,
then it still fits.
So all Rust code, unless you put a special exception
in the rule, has that line length.
It's arbitrary, but it's an idiom.
And if you read Rust code,
like you just expect things to be that long.
And people learn.
Like I feel like, you know,
when you kind of sneak in new idioms,
people over time can learn to use them.
Yeah, totally.
If you're building a database in your spare time,
make it a graph database, nevermind tables, right?
If you're building your own programming language for fun,
don't allow variables named I at all.
Just get rid of it.
Try new things, right?
Try 0.5 indexing.
Don't feel constrained by history.
But like if you're working on a team or as a group
or you want humans, you know,
in this timeline of the multiverse
to be able to understand your code,
then like you need to understand the history
and embrace the constraints.
In the midst of all this 80 character argument on lobsters, I found this guy who had a good take on
it. When a city wants to set a speed limit on a new road, they first measure the speed at which
the cars are driving, and they set the speed to the 85th percentile. Perhaps instead of throwing out
numbers, one should measure the line lengths of source code in question and set the width
appropriately. Which I think makes sense. He's saying, you know, Don, if nobody goes over 120,
set your limit 120. And then somebody else also said, I found that something like wrap between
80 and 100 when feasible and maintain a hard limit of 120 characters is a
reasonable compromise and will work well for most people. I think this is one of those personal
issues where people will never agree. I think the most important issue is not so much what is better.
That's a very personal issue, which depends on your priorities, but rather what works best for
everyone on the team and then stick with what works for the team.
I like that.
Yeah, that's a reasonable answer.
Hire that guy.
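If you wanted to actually try that "measure first, then set the limit" idea, it's only a few lines; here is a rough sketch in TypeScript (the file list and the rounding at the end are just hypothetical examples):

    import { readFileSync } from "node:fs";

    // Suggest a width limit from the 85th percentile of existing line lengths.
    function suggestedWidth(files: string[], percentile = 0.85): number {
      const lengths = files
        .flatMap((f) => readFileSync(f, "utf8").split("\n"))
        .map((line) => line.length)
        .sort((a, b) => a - b);
      return lengths[Math.floor(percentile * (lengths.length - 1))];
    }

    console.log(suggestedWidth(["src/main.ts", "src/app.ts"])); // e.g. 87, which you might round up to 90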
So thank you to Don and Crystal for joining me. You can find them both on Twitter
and on the Slack channel for this podcast.
I'd also like to thank all the various sources.
Thanks to user Fred Lee, I think, on Stack Overflow,
who asked this question that sparked all the discussions.
Thanks to Hillel Wayne,
whose newsletter is where I learned about the JavaScript get date.
Thank you to ARP242 and SPC246 from Lobsters.
Those names, they just roll off the tongue.
Thank you to Bjorn and Raver Bashing on Hacker News.
And I've put all the links on the webpage for the episode if you want to dig in more.
And if you've made it this far and you want more content, sign up for the monthly newsletter.
I'll be sending out something special in the newsletter soon. And you can follow me on Twitter at Adam Gordon Bell.
But most importantly, if you want me to keep producing more episodes, support me on Patreon.
If you go to corecursive.com slash supporters, it will take you there. And also let me know
on Twitter or Hacker News or wherever this episode shows up,
what you think of the 80 character limit argument. Did we resolve it? I think we did,
or at least we beat it to death, which must count for something. And until next time,
thank you so much for listening.