Embedded - 166: Sardine Tornado
Episode Date: August 31, 2016

Bob Apthorpe (@arclight) spoke with us about software, nuclear engineering, and improv.

Bob is giving away three books! Send in your guess by October 1, 2016. One entry per person. (More info below.)

Hackaday SuperCon is Nov 5-6, in Pasadena, CA.

Bob's long-languishing blog is overscope.cynistar.net.

Peep (The Network Auralizer): Monitoring Your Network with Sound

Safety-I and Safety-II: The Past and Future of Safety Management

Now, the books you may win!

Atomic Accidents by James Mahaffey, someone who knows the technology and history and does a fantastic job explaining complex failures in an engaging way without resorting to fear-mongering and hyperbole. (Guess Elecia's number for this one.)

Safeware by Nancy Leveson. It may be 20 years old, but it's still full of amazing insights for delivering safe, reliable systems and ways of looking at the organizational contexts in which these systems are built and used. Even if you aren't developing safety-critical systems, it's a fantastic resource and really thought-provoking. (Guess Christopher's number for this one.)

Every Anxious Wave by Mo Daviau is a novel about rock & roll, time travel, love, loss, and finding things you didn't know you were looking for. Full disclosure: The author is Bob's ex-wife. (Guess Bob's number for this one.)
Transcript
Welcome to Embedded.
I'm Elecia White, here with Christopher White.
This week, our guest describes himself as a nuclear engineer, carver of Fortran and Python,
somewhat ex-SysAdmin and procoder, law tech wrangler, and invertebrate science geek,
which is different than inveterate science geek.
He promises never to program websites ever again. Our guest is Bob Apthorpe.
Howdy.
Now, before we get really started talking to you, I do have to mention that Hackaday is having another Supercon. Remember, I did a show last year where I talked to all the organizers and had a good time.
It's going to be in Pasadena this year. Oh, it's going to be in Los Angeles this year. Southern California, you know, they're all the same. November 4th through 6th. And I was going to tell you the call for proposals is open, but I think it closes September 1st. So if you're not listening to this right away, it's over. But you can still go.
It'll be cool.
Hi, Bob.
Thanks for being here today.
Hey, thanks for having me.
So I gave you a little bit of an introduction, but maybe you can tell us more about yourself.
Yeah.
So I had kind of an odd career path.
I got a bachelor's and master's in nuclear engineering, which basically means I know how to boil water, shuffle paper, and scare people at cocktail parties. I'm not sure people actually have cocktail parties anymore, but if they do, I'm okay at scaring people. Not intentionally, just, you know, by virtue of being there. Maybe that's why I don't get any invitations anymore. Anyway, that's beside the point, really.
But after graduation, I went down to Louisiana and worked at a power plant there for a while, and I realized I was not cut out for the power plant and Louisiana lifestyle. So I moved to Texas and turned into a web programmer and sysadmin during the dot-com boom and bust.
And then I found myself back in the nuclear industry
where I am kind of working at what was my dream job back in 1990
doing severe accident work and environmental remediation type of stuff. Basically, anything that can explode with hydrogen that isn't a zeppelin,
I'm probably involved with.
What is the thing you wear to determine if you've had too many...
Rads.
Rads?
Oh, the...
Dosimeter?
Yeah, the dosimeter.
Old school, you know, some people will call them a film badge back when they used to have like photographic film in them.
Let's see, the technology has evolved on that. It was originally a piece of photographic film, because that's how they discovered radioactivity in the first place. Then it went to something like a calcium fluoride chip, which would soak up radiation; you could heat it up later and it would give off light, so you could see how much radiation people had picked up. I'm not sure what they're using now, if they're still using that. There are a lot of tiny wearable ones, like tiny Geiger counters, and there are also some really interesting silicon, or semiconductor, counters out there as well.
You've managed to derail your own show in six seconds.
Sorry.
No, not you. He's over here mouthing at me, like, why did you ask?
And my question was really going to be, do you wear a dosimeter often?
Yeah.
No, no, no, no, no.
I work in an office.
I pilot a desk.
They don't let me near the interesting stuff.
Oh, well, then I would think cocktail parties wouldn't be as big a deal.
It's the people who are actively glowing that you don't want to invite.
No, so those people are outdoors
on the back porch with the fire dancers
and they're part of the show.
Those people are invited.
It's the kind of people who spend their time
in the eerie glow of the computer
and don't really have social conversation.
Those are the ones, you know,
it's like they're less scared of your glow
and more scared of your brain.
It's a different kind of glow.
Exactly.
Okay.
I know you listen to the show sometimes, so you are familiar with lightning round and I'm not even going to explain.
Good.
Fire away.
Favorite subatomic particle?
Neutron, because it changes things.
Science, technology, engineering, or math?
Oh, man, blindsided already. I'm going to go with math.
Beach or mountains?
Mountains.
Favorite isotope?
That's a... Probably deuterium.
Heavy hydrogen.
Favorite fictional robot.
Gort.
Worst version control system you've ever used.
None.
Second to none. Second to none would probably be...
Okay, I would say SCCS, except I'm going to go with RCS because I didn't know it was RCS when I accidentally used it.
Because of an unfortunate human factors problem, the C and the V keys are right next to each other on the keyboard. So if you go to type VI to edit something,
and you accidentally type in CI instead,
your source file disappears,
and it's replaced with something.
I think I've done that.
I think I remember that too.
And you're like, what did you do?
Where did my file go?
And then you're like, but no, no.
And then the irony of it is, it's put it into version control, where it's supposed to be safe.
Now I'm sure I did this. Yes, I do recall this.
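(For reference, a rough sketch of the mishap Bob describes, using the standard RCS commands; the filename is just an illustration:

    vi calc.f      # what you meant to type: open the file in the editor
    ci calc.f      # what you actually typed: check the file in to RCS, which
                   # creates calc.f,v and removes your working copy
    co -l calc.f   # the recovery: check it back out with a lock so you can edit

So the "vanished" source file isn't gone; it's sitting in the ,v archive that ci just created.)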
Uh-oh. Next time.
Favorite invertebrate?
Favorite invertebrate? You know what?
I like the Monterey Bay Aquarium with the jellyfish exhibit.
I love the jellyfish.
They expanded the jellyfish exhibit to be a tentacles exhibit, and now they have all sorts of cephalopods as well.
Oh, nice.
So, cuttlefish and octopi, and it's just fantastic. I also like the sardine tornado, because you just go in that room and then you just start, like, spinning a little bit, and then it's like, whoa. And then, you know, it's like the sardines' revenge. So yeah, the Monterey Bay Aquarium as a whole is pretty good.
But you don't have a particular favorite variety? Stubby squid?
You know, I would, I have to say, I like, you know, just octopus in general.
Because I stopped eating octopus once I found out they played with toys.
I could not ethically eat an animal that got attached to a Mr. Potato Head.
There's a new book called The Soul of an Octopus out, and I'm totally looking forward to it.
Because the last time I read about octopi intelligence, it was fantastic.
Yeah, you have to keep them engaged or they get bored and they get cranky.
Sounds like Christopher.
That sounds like me.
What science fiction technology or concept do you think will be real in our lifetimes?
Yeah, you know, at this point I'm going to have to say the 1950s-era rockets that just go up and then come back down and land. I mean, I know that's not nearly as cool as the singularity, or, you know, or-
I'm not sure the singularity is cool.
I'm not sure shooting the Falcon rocket is fair because it exists.
I mean, well, yeah, but it's, yeah, that's true.
I mean, that's, well, I mean, it is cheating.
Okay, so that's cheating.
Well, they haven't done it with people in it, so that's fair.
I don't know.
I mean, we've already done pretty much every kind of dystopia.
With the possible exception of gray goo.
And, you know, I think that's only because nobody looked in my refrigerator when I moved out of my last apartment.
Well, and CRISPR's a thing, so gray goo's coming.
Okay, yeah.
All right.
Last question, which I did not write.
How far away from a nuclear reactor should I live to be safe in the event of a disaster?
Well, provided it's not human error, if it's bad enough to break the reactor, you kind of want to be inland and uphill.
Yeah, there's...
Away from the prevailing winds.
Yeah, pretty much away from the prevailing winds.
But you'll probably have a couple of days to sort out what you're going to do if we note that anything's going south. The last reactors that tended to blow up without warning were the Russian RBMKs, the Chernobyl-style reactors. Light water reactors, which are the ones in primary use today, take hours or days to get to a point where, you know, they're really sad.
I mean, you look at Fukushima and that, you know, that took days to evolve.
You know, the issue is you may want to just stay at home and keep your windows closed and then, like, avoid the roads because the roads will be more dangerous to you than the radiation will.
All right.
So advice given, but what does a nuclear engineer do?
Or more specifically, what do you do?
What do I do?
I mean, to look at the general thing, nuclear engineering is one of those late-model engineering degrees, and it turned out to be sort of a jack-of-all-trades. My background has been in a lot of thermodynamics and heat transfer, a little bit of chemical engineering, a lot of physics, a lot of math, and then a small amount of just the nuclear physics of, well, how do we turn this into that? How does shielding work? How do you build something critical? How do you keep it critical? How do you make it not critical?
But in my day-to-day job, I work on accident analysis and waste technology and software development. My position is pretty interesting because you never know what you're going to be facing from week to week.
A lot of the specific stuff I work on is hydrogen control.
And hydrogen shows up from a lot of different sources,
either from, like in the case of a severe reactor accident
where you have steam and zirconium cladding reacting to form zirconium
oxide and hydrogen and heat, now you have explosive gas and extra energy and your fuel
has all fallen apart. But there are other conditions where, say, you have radiation in water.
You wouldn't think that water is that complex, but once you start actually looking at what, you know, all the chemical reactions of water coming apart and coming back together again, you know, you put some radiation into that and you're going to knock hydrogen loose.
So you may have a pool full of spent fuel or, let's say, an ion exchange column that does water cleanup.
And you've pulled all this radioactive material out of the water, and now it's stuck in these resin beds.
Essentially, it's a big filter.
So you've pulled all this radioactive stuff out of the water into the filter.
That's great. But now it's in the filter and it's still irradiating the water.
And is that going to make hydrogen gas?
And where is that going to go?
And hydrogen, I mean, as we've seen with people trying to have hydrogen-powered vehicles,
hydrogen is a pain in the rear to deal with, because it's a sneaky little molecule. It squeezes out of places. It's tiny.
I believe it's the smallest one.
It is, yes. It's so tiny, and it just sneaks past things. It diffuses through things. It causes, like, metal embrittlement.
It's just,
it's irritating.
It's the perfect fuel.
Exactly.
It's a perfect fuel.
Because you can fill your car with it and then wait 24 hours, and then your gas cap is embrittled and all your hydrogen has leaked out, and then you have to buy more.
It's awesome.
And it's not a greenhouse gas.
So, you know, it's the second most common element in the universe, outside of stupidity.
So you're not in the pocket of Big Hydrogen?
No, I'm in the pocket of Small Hydrogen. It's like, okay, so how do we deal with it when it gets out? It'll show up from either chemical reactions or radiation or all sorts of things. So that's one of the kind of generic issues that shows up in waste and severe accident type of stuff.
But you talk a lot on your Twitter account,
which is how I know you,
about software.
Yes.
How does that come into this?
Okay, so, well,
we're not allowed to touch anything anymore.
So we have to do everything with modeling.
There's a lot of stuff that we don't want to try and simulate.
I mean, the nice thing about where I'm working now is we do a fair bit of experimentation, but we also do a lot of basic math and engineering work, and we also do software development.
So we would like to know, say, for example, when your reactor runs out of water, how long is it going to take to fall apart?
What's going to happen when – will this melt?
If it does melt, how much melts?
If it keeps melting, is it going to stay at high pressure and then, you know, squirt molten fuel all over the floor?
Or is the bottom just going to fall out like a wet grocery bag and drop all this stuff onto the floor?
And then, you know, it's going to eat into the concrete and, you know, all that.
And that sounds really flippant and all that.
But, you know, it's because there's so much bizarre phenomena going on. That's one aspect. So I work on software simulations of severe accidents, for predicting what the shape of the hole in the floor is going to look like as things eat into the concrete, and also what happens when you basically flush molten core debris down a drain.
So, is that recommended?
I don't think that's recommended.
It's not, no.
Like baby turtles.
No.
Well, you shouldn't do that either.
No, you really shouldn't.
No.
But it's actually more like dumping grease down a drain. You know, if you made a big pot of bratwurst and you have all this grease left over and you just toss that down your sink trap, while it's hot and in the pipe, that's fine, because it'll keep flowing. But it doesn't stay hot for very long, and it starts to coat the inside of the pipe. So the question is, and this is why we look into things like this, when you have this molten stuff flowing through a tube, it's going to start solidifying against the wall of the tube and making the tube smaller. And the question is, will the tube freeze shut before whatever molten goo you have upstream transfers to the downstream place?
And the reason people were asking this question is not because they're really concerned about the pipe. What they want to know is, since they can't just go into, like, the Fukushima reactors and see where the molten fuel debris went, they'd like to be able to predict: is it all underneath the reactor, or has some of it flowed through this sump piping over into a different part of the reactor? They don't want to be surprised by it. They want to know what they're up against when they're cleaning this up, before they actually go in.
So that's kind of what we try to do. And that's one of the hard things about writing this kind of software: there have been so few failures that there's not a lot of real-world data.
I mean, we're kind of a victim of our own success, and I'll be the first to admit that I'm happy about not having real-world data to work off of.
But a lot of this is based on experimentation, and there's a lot of uncertainty in it. I don't know, it's interesting, but in some sense it's kind of like video game physics. I mean, does it look kind of right?
Until you actually go and look and find out whether or not you were right, all you can do is guess and hope.
And smaller models and bigger models, yes.
Yeah, a lot of what we do is we'll take known problems or known experiments. The last two years we've been working with a lot of Fukushima data to try and benchmark the new models in our software against what they saw. And the problem with that is that you're dealing with kind of an incomplete data set, because that reactor was not set up as an experiment, and it was not operating with all of its monitoring systems. They were doing everything they could to keep the thing together, so taking data was not their primary concern. You know, what is it, Independence Day? It's like, let's talk to the aliens. Like, they're actively trying to kill us. Just kill them. It's like, no, we have to understand them. It's like, no, hit him with the shovel. We'll do the science later.
So you tune the simulation to do what was seen. Now when you run the simulation again, it works, but you may not be sure that it will work next time, because there were variables you didn't get to see. And you're dealing with such low-probability events. And the closer you simulate the actual conditions, the more uncomfortable everybody gets with your testing, because now you're not just using, like, simulated materials. Like, one of the projects I worked on was for the Hanford defense site in Washington state.
And they've got an old spent fuel pool from the defense reactors out there where it's a couple hundred yards from the Columbia River and it's 50 years beyond its design life.
So they have to clean this pool out, and so they're doing a lot of testing. It's like, okay, how do we move the sludge around and stuff? You can't do a perfect simulation, because if you're doing a perfect simulation, now you're actually using radioactive materials, and nobody wants to do that. So everything is going to be approximations, based on the kind of modeling and the kind of experimentation that you can do. And it comes back to, how much are you actually going to have faith in the model, and how far are you going to push it?
Do you just want to see kind of a ballpark where it's going to go
and then you're going to apply some sort of conservative design
when it comes to actually working with material?
Because you don't want to get to a point where you believe the model,
garbage in, garbage out.
You don't want to treat the model as gospel.
It's just a model.
Have you ever had a software bug that indicated the world was about to end?
No, not.
Close enough, it sounds like.
No, but so the weird thing is, when people have found serious... I won't say serious bugs in terms of giving thoroughly wrong answers in the code, but more of, like, say, a module was coded wrong, and you compare it to the actual math or engineering behind the software design, and you're like, okay, this doesn't implement the design correctly, or we've put a completely wrong constant in here. And you realize that it doesn't matter.
It's like you've gone through all this effort to model a phenomena. And at the end of the day,
you run your simulation, it doesn't make any difference.
Because that wasn't what was driving it.
Yeah, it wasn't a critical parameter.
When you talked about your schooling and training, you didn't mention software engineering or computer science or software development. Do you have formal training in that?
Sort of.
So I took this weird career arc.
You know, it's like you sit down in your first day of orientation. They ask you to pick a major.
And at age 18, you're not really capable of making that decision intelligently.
But I kind of flipped a coin.
It's like, well, I was good at chemistry and good at computers.
And I like both of them pretty equally.
But I'm going to go with chemistry because I don't want to be tied to a terminal for the rest of my life.
How's that working out for you?
Yeah, see, that's one of the things where I look back,
it's like, this is possibly the dumbest thing I have ever said.
Well, to be fair to yourself, I mean,
you couldn't have predicted that everyone would be tied to a terminal
no matter what they do anymore.
Okay, fair enough.
Not the people who actually need the dosimeters.
But, you know, what ended up happening was that I did pretty much everything but get a computer science degree when I was in college. So I went the engineering route. I went from chemistry, and it's like, okay, I don't know what a chemist does long term, so I'll go into chemical engineering. And it's like, well, I don't know what chemical engineers do either, but this is getting really hard. My grade point is plummeting.
I'll go over to nuclear because that looks interesting.
And they'll let me.
They'll take anyone.
That is not a fair assessment.
I'm going to strike that.
Yeah.
Yeah, well, guess who's rejecting my donation?
Nope.
Keep your money.
But no, while I was pursuing the engineering side, I was also taking classes in machine language and architecture, and mathematics classes in combinatorics. I mean, not the formal computer science curriculum, but I always had kind of one foot in that space. And as my career has progressed... you know, I went and did risk assessment at a power plant.
And then, you know, in the mid-90s,
the way I phrase it is,
I couldn't stand being at work and I couldn't stand being at home.
So I moved to Texas and worked for a game company.
And then I ended up in the dot-com universe.
So I basically kept doing the computer stuff that I'd been doing all along. The engineering side kind of got put on a back burner, but I kept the understanding of reliability analysis and risk assessment, because that's completely applicable to system administration or software development. Maybe you're not hauling out your book on thermodynamics a lot, but you are looking at, okay, well, what's a conservative design principle, and how do we predict where things are going to be soft, and code defensively. But I ended up spending a lot of time looking into the history of computing and trying to backfill knowledge of databases and operating systems, networking.
And then looking into the more esoteric ends or more professional ends of software development.
Things like literate programming or is this the right language I should be using for this project?
What other tools do I have available to me?
And then things like risk assessment comes into play
when you start dealing with version control systems
because it's like, okay, well, how are we going to safeguard our code?
So that becomes second nature to anybody who does software for a living.
Maybe.
Well, it doesn't, it doesn't.
I mean, I was just thinking as you were talking about, you know, programming for different industries and that whole model of risk assessment and doing, you know, formal risk mitigation analysis and that kind of thing, that only exists in a couple
of places. And everywhere else, it's kind of either implicit, like, okay,
well, we're going to use a version control system and our IT people will be responsible for
safeguarding our code. But there's not really
any other attention paid to the kinds of risks that maybe could be
mitigated if they were more explicit.
Well, I think if you come from that background, from the idea of mitigating risks and considering them, you apply that wherever you are.
Yeah, and that's the thing about it. And the interesting thing is, that wasn't exactly part of the curriculum in school. I mean, there's only so much you can teach in the somewhere-between-four-and-eight years you're in college. But a lot of it came from being on the job in what they call a high-performance organization, where you're just not allowed to have the same level of failures that are tolerated in other organizations.
And especially when you've had high-profile accidents, and then they've spent a lot of time and effort looking at how did this come about and how do we prevent this from happening again. And not trying to prevent just one sort of accident from happening again, but looking at contributing factors, not just hardware, but things like organization, or a lot of human factors. I worry about diverting into a long spiel on this one, but the Three Mile Island accident is a really interesting example.
And I draw a parallel between Three Mile Island and compiler warnings, which you would not necessarily put in the same sentence.
One of the contributing factors to Three Mile Island was
they have these, what are called annunciator panels.
It's just a big array of lights.
And when something happens in the plant, a light will come on.
Now, that could be anything from like your tank is low and doesn't have enough water in it to your core is overheating and you need to shut down right now. So there's sort of a wide range of potential,
the severity of conditions that will cause a light to come on.
And there's hundreds of these lights along the top of the control panel.
And one of the problems at Three Mile Island was a lot of those lights were on. They weren't necessarily critical, but they were always on. And the problem that was identified was that
if you had an issue where something broke that really was important, one light would come on, and you would have to identify that out of the 100 or so other lights that are already on. Which, if we go back to the notion of compiler warnings, it's like, well, most of them don't matter. It's like, oh, you're casting from a signed to an unsigned integer or something like that.
And you're never going to use a negative value there, so it doesn't matter any. But if you've got 500 compiler warnings and one of them is actually significant, how are you going to see it?
Right. In all of that.
So, of all the lessons learned after Three Mile Island, there was a strong emphasis.
It's like, look, if those lights aren't important,
turn them off.
You know, either come up with a justification
why you don't need that light
or go fix the actual underlying problem
that makes that light be on.
And again, it's a pain in the butt,
but that's, you know,
that's the cost of doing business in that space.
So if you're, you know, same thing, you know, if you're going to be a professional programmer, it's like, turn off your compiler warnings.
I mean, not turn them off.
Turn them all the way up.
It's like you turn them on to pedantic, and then you address them piece by piece.
And you make them go away the way an adult responsible programmer would do.
And that's hard to do when you're faced with schedule and budget issues
and maybe a culture that doesn't respect that kind of,
it's like, oh, that's just pedantic BS.
Why are we wasting our time with this?
It's like, well, it's a practice of good engineering.
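(As a concrete sketch of "turn them all the way up": with GCC or Clang this is roughly the set of flags below; the exact list is a per-project judgment call, and the file name is just an illustration:

    cc -Wall -Wextra -Wpedantic -Wconversion -Werror -o example example.c

-Wall and -Wextra enable the common warnings, -Wpedantic flags non-standard constructs, -Wconversion catches many of the implicit signed/unsigned and narrowing conversions mentioned above, and -Werror keeps new warnings from quietly accumulating once the existing ones have been cleaned out.)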
I do not understand how vendors let out code
that has hundreds of warnings.
I'm working on a project now
with the Freescale Kinetis line
and it pisses me off
that they have so many warnings
in their example code.
You know,
somewhere in there
is probably an uninitialized variable
and it's going to take me weeks to find it because I have to deal with the other 500.
And then when they release a new set, am I going to redo it or am I just never going to update?
Yeah.
But, you know, it's funny you talk about this because it's a problem that's very prevalent in medical devices right now.
They set the medical devices to beep or warn pretty early, thinking, well, you wouldn't want it to get beyond that, that would be dangerous. And then we end up with all of these beeping, warning devices, and the nurses can't pay attention to that many beeps.
And the worst part about that is, that's a problem that was recognized as early as, like, the Vietnam War, where they had all the beeping and blinking of the avionics systems.
You know, it's like, okay, well, let's alert pilots if they're getting painted by radar, and if they see this and that. And so there would be so much noise in the cockpit that pilots would start turning off all the warnings, because they couldn't think. It's the same problem with every monitoring system out there: we'll just annunciate to everybody, and then you're so overwhelmed with signal...
Well, it all becomes noise.
Yeah, I mean, this all comes down to signal-to-noise ratio. Once you get to that critical level of noise, you start turning everything off and just ignoring it, or you accept it as normal.
Well, or have systems which are intelligent enough to synthesize some of those individual warnings and say, ah, this condition is severe, and then have it say to you, pay attention now, this is a severe condition, rather than, well, these five things are happening, so I don't know if you can make sense of it or not, but here are the five things.
Well, there was... I don't remember the name of the software, but I remember at the LISA conference one year, they were talking about something that would go through the syslog, and it would convert syslog entries to essentially a musical tone or a chirp. I don't remember the name of it. I'm sure this can show up in the show notes; I will take that on as an action item for later. But it would convert these to just a noise, like a chirp or a beep or something, and then you would be able to listen to your syslog as if it was, like, a jungle. You hear the normal chirping of birds, and then there's this squawk, and you're like, that's not right. It's like trying to use different senses to alert people to significant events, because we don't have to do everything through parsing text.
I mean, I love text, but when you're
faced with a wall of text, maybe you need to find a different way of
doing your pattern recognition.
But this problem of filtering out warnings
and making sure we get errors,
it seems like many industries have to rediscover
this same really critical piece of design.
How do we cross boundaries?
How do we get this from avionics,
which is how I first heard that, no, nothing goes wrong unless the plane is about to fall out of the sky.
Everything else is unimportant.
Okay.
You know, for me, I've kind of taken a two-pronged approach. The last decade, or not decade, the last five years, I've spent a lot of time looking at how other industries handle the same sort of issues. Looking at things like food safety. Looking at aircraft: how do airlines or the FAA regulate their sort of systems? Because people in the nuclear industry, we have the same sort of issues: we can't accept the same level of failure as, you know, having to reboot your coffee maker. In the same way that you can't have a satellite where somebody has to go and turn it off and turn it back on again, or an airplane. So you look at other high-performance organizations and find out what kind of challenges they face, or even not super high-tech ones, like food safety. How do you do surveillance to make sure that food's safe, and how do you do the monitoring on that? How do you handle epidemiology? You have to intentionally look outside of your own industry.
And it's really hard when you have a normal life and whatever other job responsibilities you have.
The other thing is, if you have changed industries, like when I went from nuclear to sysadmin, I didn't leave all of that safety and reliability thinking behind.
So when I was in high availability systems classes, I'd be in a workshop with some people and they're talking about single points of failure.
And mean time between failures. And I'm looking at it like, that's 1940s- or 1950s-era reliability analysis thinking.
That's ignoring so much of, you know, interfacing systems and systemic effects, things like common cause failures. Like, you remember a couple years ago, there was some company making capacitors, and somebody else had stolen their electrolyte formula, but badly. So there were a number of counterfeit capacitors that went out on the market, and they started causing board failures all over the place. I don't know if you remember that.
Yeah, I remember that.
I mean, I think I had a motherboard go out from that. The capacitors would, you know, sausage right up, and you lose your board. But that's what's called a common cause failure. It's something that affects a bunch of systems; it's not endemic to the design. It's something they have in common by virtue of the fact that they came from the same manufacturer, or they bought the same lot number of capacitor from different manufacturers,
things like that.
And that's really kind of confounding when it comes to doing quantitative risk assessment
when you're kind of like saying, okay, well, I've got diverse redundant systems.
I've got one DNS server from one vendor and one from another
vendor and they're plugged into separate switches
and they're in separate power supplies
and it's like, okay, yeah, but they all bought
capacitors from
the same bad vendor.
So it's the same
issue that can take them all out.
So even if you do go to the point of trying to design your system to be very resilient, you still have to worry about getting sandbagged by something like that.
And then they have the problem that maybe they get a chemical lot, and it goes to different manufacturers of capacitors, and so it's very hard to battle this all the way through.
Right. So really, all you can do is do your best with trying to design some resilience into your system, and presume your stuff is going to fail and be able to deal with it gracefully. And, you know, that's easier said than done.
It often is because the way that we deal with failure tends to be, oh my God, it failed.
And then there's somebody on the phone
and it's like, why isn't it back up?
Why isn't it back up?
And they're calling every three minutes
and you're trying to solve the problem.
And, you know, eventually you just pick up the phone
and say, yeah, we know there's a problem
and we don't care.
We're lying on the floor in a pool of bourbon.
Or after the problem, tons of corrective measures are made that focus on that one problem to the exclusion of a bunch of other things forever.
Right.
And there's, you know, there's thinking that goes beyond, you know, like, well, what went wrong?
And let's do a root cause analysis, and let's close all the barn doors for all the horses. It's like, well, have you ever considered that we don't ever pay attention to what we get right, and to the resiliency of the human aspect of it? I mean, we always consider variation of performance to be a bad thing, and we would like to replace everyone with robots so that they all work perfectly. And it's like, well, maybe what we want, instead of controlling that variability of performance, is to steer it in a direction where we want performance variability, but we want it to be good.
So, I mean, this is kind of more philosophical stuff.
There's a guy named Erik Hollnagel, and he talks about Safety-I and Safety-II. He's got a slim book where he talks about the distinction between kind of this old school and a newer way of looking at this stuff. But, you know, that's another thing. It's like kind of looking into the safety literature
and looking at, you know, other accidents in other industries,
you know, train accidents or aircraft accidents,
the process industries or chemical plants, you know,
looking at the history of spaceflight, you know,
it's hard to get
outside of your comfort zone. And I know, especially for the field that I'm in, it's like,
you know, nobody talks to us and we don't talk to anybody else. So it's really hard to cross-pollinate, or get new information in, or share it with other people. You really have to make a concerted effort to do that.
Well, that's often difficult because your industry is sort of known for being a little stagnant.
Oh, yeah. Yeah. I told you about the issue of manufacturers of parts.
The manufacturers go away, and the parts have lasted so long that there's no one to build replacement parts, and these plants are running a lot longer than anyone expected them to.
And a lot of it turns into a victim of your own success.
It's like, well, if you make something indestructible, you only need to make one of it.
Yeah, if you make a part that's supposed to last for 50 years, who's going to make the next one?
Right. It's like, well, it hasn't failed. Well, I guess we're out of business now, we've sold our three.
Yeah, exactly.
One of the things you've been mentioning is engineering and software design.
And there's this thing called programming, which to me is probably the skill.
It's the typing, where the software design and the engineering are more the intellectual, artistic creation parts.
But that's sort of me.
What do you see as the difference?
This is an appropriate part where I'm taken to the woodshed over my comments about software and engineering.
I think software is more of an art or a craft.
There's a limit to what you can do from sort of the mathematical basis.
And I think engineering has a lot of art to it as well, but there are a lot more physical forces you can rely on. There's just sort of a lot more axiomatic things you can rely on in, say, the traditional or harder engineering. Like, gravity is going to pull in a certain direction. I mean, unless you're in outer space, but by and large, you can rely on gravity. You can rely on things like pressure. You know that energy is going to go from hot to cold until everything's kind of lukewarm. There's all those things that you can depend on. I mean, "you can't push a rope" is always a good engineering adage. And with software being so abstract, I think it's really hard to treat software in sort of an engineering kind of way. And I think this is complicated when you're dealing with
hardware, because then you're actually dealing with more concrete things, like, you can't split a pixel. Despite certain people trying to tell graphic designers to do just that. I mean, you just can't do that.
I hate... my brain is sitting here going, but you could, there.
You could go back and forth.
It's sort of like, and that's totally a software perspective on it.
Oh, I can explain.
Subpixel rendering with anti-aliasing.
Of course you can do a quarter of a pixel.
Sure.
Sure.
You know, and we can layer a bunch of abstractions on there, but
when push comes to shove, you're going to have to
send an electrical signal down the line
to the monitor
to address that pixel, and you're going to have to tell
it what its intensity
is.
At some point,
there is rubber and a road and a point where they meet.
Whereas, you know, it's like, well, let's just slather another layer of abstraction on there.
It's like, well, I mean, you know, Fortran had a really interesting bug.
I mean, the language itself... actually, I'm not sure if it was the language or specific implementations, but since everything was done by reference and not by value, you could in some cases inadvertently redefine constants, like numerical constants. So if your compiler was not particularly smart, you could say, like, zero equals one.
And then from then on, zero was equal to one.
And this would cause a great deal of consternation.
Seems fine.
Yeah.
Do that all the time.
Sure.
That's the great thing about standards.
It's like, well, we'll just redefine the standard.
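(A minimal Fortran sketch of the hazard Bob is describing; the program and names here are made up, and this only misbehaved on very old compilers that passed literal constants by reference without protecting them. Modern compilers copy the constant, put it in read-only storage, or reject the call, so treat this as a historical illustration:

    C     BUMP modifies its dummy argument. On an old, unprotective
    C     compiler, CALL BUMP(3) passed the address of the stored
    C     constant 3, so BUMP overwrote it; later uses of the literal 3
    C     in the same program could then silently behave as if it were 4.
          PROGRAM ALIAS
          CALL BUMP(3)
          PRINT *, 3
          END

          SUBROUTINE BUMP(N)
          INTEGER N
          N = N + 1
          END

On such a compiler the PRINT could show 4 instead of 3, which is the "zero equals one" consternation Bob mentions.)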
But engineering... and I know we've got some Canadian listeners who, every time we say "software engineering," are like, no, no, engineering, you can't use that word. And believe me, as a long-time former resident of Texas, even me with an engineering degree could not call myself an engineer without having professional engineering licensure. I would need a PE license to legally call myself an engineer in Texas, which is a series of exams as well as some overseen work experience. It's the basic professional certification, like
medicine or law,
where you have some
ongoing, you know, you have a code of ethics,
and you've got some commitment
to continuing education.
I don't know that there's a similar thing in the software industry, other than just basic career survival. It's like you don't have an option about continuing software education. You're either continually learning or you're, I don't know.
About to be fired.
Yeah, exactly.
And there's no ethical oversight, no ethical training.
But you're already bound by... I mean, all right, I'm not going to get into this, but if you're working in a regulated industry, you're already bound by those regulations. So if you're working on a medical device, you already have to follow the FDA process. So ethics on top of that seems...
Well, I can see where that would be... it's all imposed ethics. You might think it would be superfluous, but I mean, there's things like conflict of interest, or just... I mean, there's some weird aspects of trade in there.
I mean, as someone who's slowly studying for PE licensure, I'm familiar with this.
I think when I was a sysadmin, we had been pushing for some sysadmin kind of certification.
And again, this was also, like, code of ethics and continuing education type of stuff. But there is, in fact, and I was really surprised to find this, a PE exam section... like, you can go in for a PE license in software engineering.
I'm still somewhat confused by this,
but I sprung the 40 or 50 bucks for the study guide for the software engineering PE exam.
And it's kind of interesting.
I think it's more focused on requirements management
and the whole CMM, Capability Maturity Model, kind of way of looking at things. But it's not really... I mean, that's more managerial and organizational than from a practitioner standpoint.
Well, and IEEE tried to have a software license or certificate, and it was pretty cool, except it was a lot more manager than practice. And I don't know that it went anywhere other than them advertising to themselves.
Yeah, and I think that was sort of a problem when we tried to do that within the sysadmin community, because again, I won't say there wasn't a need for it, it was just that there wasn't as strong of a need for it as we maybe would have liked.
And even with normal mechanical or electrical
or chemical engineering,
you can easily spend your career without a PE license.
You don't need one.
It's just the times when you do need one,
it's because you're working on something
that is particularly critical to health and safety.
Like, you know, like you're mentioning, like medical devices or, you know, for example,
like civil or structural, like structural engineering, you know, that's primarily where
you're going to see, you know, you're going to see a lot of PEs in that area just because you don't want buildings to fall down and crush people.
Or like in the case of nuclear engineering, looking at high pressure steam piping, you know, all of that has to be done up to some pretty strict codes. So there's a lot of, you know,
there's a lot of oversight on that
where you have to, you know, sign off,
you know, review and sign off on documentation
to make sure that your, you know,
that your steam system doesn't, you know,
explode and kill a bunch of people.
And, you know, and actually,
if you look at the history of the development
of high pressure steam, you know,
every time you see these regulations, you start realizing that they're pretty much written in blood. Somebody had to die before these safety regulations went into place, for railway, or FDA, or automotive.
Aviation.
Exactly. All these things, they didn't write them to be annoying.
Yeah.
It's not like somebody's in a committee just trying to make life miserable for people by saying,
make sure that the guy who routes your natural gas to your boiler is actually a competent engineer. I mean, the whole reason that natural gas even has a smell is because that's added. Natural gas by itself is odorless. They added a smell to natural gas so people would know that there was a gas leak, because of a pretty horrible accident in New London, Texas in the 30s,
where they took some gas from an oil field and piped it into a high school,
and they ended up blowing up the high school and like 300 kids.
And it's just, it's horrible.
And I mean, that's the whole reason you can't call yourself an engineer in the state of Texas
without having a PE license.
So, I mean, not to get all heavy and morbid on this, but it's, you know,
you look at the history of this stuff and, you know, you realize that you got to,
you know, there's a pretty heavy weight on your shoulders and you have to take
this kind of stuff seriously.
And you would hope that your management and that your coworkers would kind of,
like, you know, go along with you on the ride.
It's like, this is inconvenient, and it maybe adds cost or delays schedule or something, but at the end of the day, you know you're doing the right thing.
Yeah, but it's got to be a team effort. It's really tough to have that from just above.
Absolutely.
And it is equally tough to get the training and to want to talk to people about it, only
to have everyone say, no, that would cost too much and not consider it.
So, it's really finding the right team in the right industry.
And also, it shouldn't necessarily be an either or.
You're either in a regulated, safety-conscious industry or you're not.
I mean, I think there's a spectrum of you don't necessarily have to spend a fortune on this or go to tons and tons of classes.
But, I mean, be aware of the consequences of your decisions, and be aware of the tradeoffs you're making when you're maybe playing fast and loose with stuff. You know, the same way we don't want, like, the stereo in your car, which maybe has some USB update on it. It's like, well, what if that accidentally gets triggered and you're not in the middle of a USB update? It bricks your stereo, and then where do you go? What does that say to your customers? Why is your system designed that way? Did you think this through? I mean, it's a minor economic loss, but it's still an embarrassment to the company, and it's really kind of avoidable. And also, speaking from personal experience, it sucks if you're at the beginning of a road trip and your stereo just decides to brick, because the vendor didn't think through how it does that. It's like, when are you ever going to do a USB update on your stereo? And why isn't there, like, a button that you press to just say, reset it to factory default conditions? I look at stuff like that, like, what are you thinking? It's like, well, they're not. It's like, okay, okay. Nobody told them they had to do more than what their job is today.
Yeah. And, you know, I hope listeners know that I never have believed that your job is just what your job is; you should be thinking about how it affects
everybody. But let's go into something happier. Let's switch topics. You do stage improv.
Well, yeah, up until about 2008 I was performing on stage down in Austin, Texas. And then I moved to Chicago, the capital of improv in America. You can argue that,
but it's going to be a hard argument to make. And I stopped doing improv because I couldn't
get downtown on a regular basis. But yeah, I started doing it in 2000 and never really intended to go on stage. But that's just how things wound up.
And it was really kind of a transformative experience for me.
Speaking as kind of the technical nerd, introvert type of person,
being able to kind of work through things that are kind of
uncomfortable, and putting yourself out there socially and taking a lot of risks in a safe environment, to the point where it's kind of second nature to take those risks
because you know that there's really no consequence to failure.
Which is an interesting kind of like reversal on the whole, you know, we're so worried about failure.
It's like, well, on the other hand, it's like some failures are not worth worrying about.
And in fact, some failures are actually good.
I can see how improv would give you that.
I mean, if you start a skit and it all goes horribly wrong
half the time, that's what the audience wanted.
Oh, yeah. But there were also people, friends of mine, who just couldn't come to see a show, and I'm assuming it's not out of quality issues, but it would make them very uneasy watching people take risks like that.
Because they feel so bad for them. Oh, that's going so horribly.
Or, I'm on the edge of my seat because it just makes me nervous.
But it's a practice of working with other people, and paying very close attention to other people, and the give and take of control.
The realization that being humiliated doesn't actually kill you and it doesn't last that long.
No.
Especially if you just keep repeating it.
Yeah, exactly.
Well, there's the rule of threes
you do something funny once and it's funny; do it a second time, it's funny but not as funny; do it a third time, it's hilarious.
Yes, yes. And I think there are higher-order resonances of that, because I think by the time you get to 9 or 27, it is excruciatingly hilarious. I've watched some people from LA do that, and it's just like, you're not going to keep doing that, are you? Yes, you've totally committed to it. And you're just like, they're not... no, they are. And maybe it's kind of a... it's not so much of an inside joke. I mean, it might be. I don't know that too many people would be, let's watch how long they're going to do this, but they played it off really, really well.
Everyone who made it all the way through the cat episode understands exactly what you're saying.
So, putting yourself out there socially, to not worry about silence, to listen for pacing,
you know, how much are you talking? How many people are in this room? Three? Okay, then you
should probably be talking about a third of the time. And you need to kind of like be listening
for that. You know, if you find yourself continuing to talk, maybe you need to stop for a moment or check in with other people.
So how has it affected your job? Do you make more jokes to get people to do what you want? Or is it just easier to introduce yourself to people?
It's a lot easier to introduce myself.
You watch to kind of see.
It's like, is anyone talking to that person?
Well, go over and introduce yourself to that person.
Just pay attention to the atmosphere.
Pay attention to other people. And it's not so much all about me, it's about kind of us. Find out what needs to happen and do that. But also, a lot of it is, don't be afraid to speak up or take risks. And that's where there's sort of an issue: some stuff is great in an improv class where there's no consequences to failure. Well, in real life there are consequences to failure, and there are consequences to speaking up. So you have to make a judgment. I mean, you can't just go into life like it's an improv class, because that's going to get real ugly real fast.
But the other thing is, one of the great things that I got from the whole process was this notion of status, of the king and the peasant.
Who's in charge?
Who's got the status?
And the games that people play, friends will regularly undercut each other and mess with each other.
And if you watch who has the upper hand or who has the higher status, that will shift very rapidly up and down, back and forth between people who are equals. They're not completely level, but you'll watch one person rib the other, and the other person will accept that and give as good as they get, whereas you end up with other, lopsided arrangements, the boss-and-employee type of situations. Once you understand that, you can play some really interesting games with it, like, well, what happens when the king is taking orders from the servant? That kind of role reversal. There's comedic effect to that, but there are also practical effects, because one of the techniques people use to puff themselves up is to push other people down.
You see this sort of thing; you may be victimized by a situation where somebody says, well, yeah, our code's delayed, but it's not as delayed as Team B's, which has got all these other problems. And now you're in a reaction role. If you're on Team B, it's like, wait, why is this guy saying bad things about us and making us look bad? Now I'm angry and upset and I'm trying to react to that, and I'm reacting emotionally. Whereas if I understand that Team A is behind and this guy is trying to save face by making us look bad, it's like, okay, I see what you're doing, and I'm going to choose my reaction.
So there's a certain amount of self-control, and it takes a lot of the emotional sting out of it, especially once you know what's going on and that these are common tactics that people use. And this is not meant to be manipulative; it's just understanding a lot of basic social things. We, and I'll speak for myself here, the introverted technical nerd: if I were that gregarious to start out with, I might not be doing this job. I'm more comfortable with the machines and the math; I'm more comfortable with that than the interpersonal stuff. And once I understand the interpersonal stuff and how it works, it takes all of its power away. Then I can stand my own ground and choose my own reaction.
It doesn't mean that I'm necessarily overly assertive or overly aggressive. There are times where it's like, well, maybe I'll just let somebody do that. Okay, it's not worth fighting over, but I'm not going to get upset about it, and I get to choose my own reaction. That just leads to a lot less stress on me. I'm in control of me; if I'm not necessarily in control of the situation, I at least understand the situation. So that kind of stuff is, I think, super helpful, not necessarily just in my environment, but in any workplace environment.
So it sounds like stage improv can give you a leg up if you are not a political animal and want to know what's going on in your organization. It could also be helpful for new managers trying to figure out all of these weird cues people are giving that they are not getting.
Right. Yeah, yeah.
Because so many managers are engineers who were promoted, except, if you don't get the training, it is pretty bewildering: these people all want things from me, how do I help them, and how do I say no? And if you don't know how to do both, and delegate... I don't know, it sounds like stage improv could be really helpful.
It's easy to say it's kind of a panacea and cure-all for things. I mean, it can be great therapy. It can help with a lot of it, just being able to play through things and be a total jackass to somebody else, just a mean-spirited bastard to other people, when that's totally counter to your own personality. And then the other half of it, when somebody's making life bad for you, is that you intentionally go out of your way to make things worse: how could I make the situation worse for me? It's like, okay, well, it's also raining. Things like that. Being able to manipulate that and put yourself through it in kind of a safe venue where there aren't any consequences to failure, playing through that stuff... I mean, when do you get to play like that?
That does sound like fun. You're making it sound like a lot of fun.
Well, again, I'm not getting a commission, and your mileage may vary, but I got a lot out of it and I'm very happy with it. Even though I'm not actively performing, I highly recommend it.
Just not actively performing on stage; you may actively be performing in a day-to-day way.
Yeah. Well, the thing is, most of your day isn't scripted anyway, so you improvise every day. You might as well get good at it.
Cool. Well, I think we are about out of time, and I wonder if you have any thoughts you'd like to leave us with.
Yeah, it's actually a good dovetail with the improv part: cut yourself some slack and don't be afraid to fail. Find yourself some opportunities to push yourself beyond where you're comfortable, and let things fall apart, and then smile about it, accept that it broke, and move on. Get used to that, because learning happens through failure, and it's an active process. I know that at least from my electronics hardware projects, and early in the life of them, either starting on fire or getting kicked to death. Now that I can accept failure, it's a lot easier for me to do things that I wish I could have done when I was 20 years younger. The electronics doesn't work? Okay. Well, that's on fire? Okay, time for a new eval board.
Sometimes it is best to be a duck, to have the water just go right off.
Yeah.
I mean, before I get all metaphorical and stuff: you have a giveaway, not for your company, just because you want to give away some things?
No, I love the show and I like the audience. Or I love the audience and I like the show. Some combination of like and love of audience and show. And I like these books. So, I am giving away three books that I very much enjoy, and I will let you figure out who they go to.
Well, I want to talk about the books, but before we do that,
I would like you, Bob, to think of a number
and Christopher to think of a number
and me to think of a number.
Don't tell me the number.
Is this bounded in any way?
Actually, that's what I'm going to say next,
is do you want to bound your number?
No.
All right, me neither.
Okay.
Bob, unless you want to bound your number, we're not even going to bound them. They could be anything.
No, we're not programming in Ada here.
Okay. So, listeners, send us a number. Any number. One of the numbers that exists.
There's a few.
There's a few.
And send it to us by October 1st,
and you will win a book.
And I guess the book that I will be giving out, to whoever gets my number, will be Atomic Accidents.
Bob tells me it's written by someone
who really knows the technology and history
and does a fantastic job explaining complex failures in an engaging way without resorting to fear-mongering and hyperbole.
It's the book about this subject that Bob would have written, if only this person, Mahaffey, had not done it first.
Chris, you're going to give out the second book, to whoever gets your number, and it's Safeway... Safeware, by Nancy Leveson.
I'm doing a really good job with names.
Maybe you should read that one.
Leveson.
Safeware.
Let's see.
Even though it's 20 years old, it is full of amazing insights for delivering safe, reliable systems and ways of looking at the organizational context in which these systems are built and used. Even if you aren't building safety-critical systems, it's a fantastic resource and thought-provoking.
You're really sad if Christopher chooses his own number.
And Bob, the one that I have left to you, for your number, I am giving away... well, I'm giving away all of them, but the one I am presenting is Every Anxious Wave by Mo Daviau. It is the only fiction book on the list, involving indie rock and punk rock and time travel, and love and loss and redemption, and finding things you didn't know you were looking for. And we only have a couple more days of summer, so get your last bit of summer fiction reading in. Full disclosure: it is written by my lovely and talented ex-wife, so I'm super, super proud of her, and the book gets really good reviews. And I'm in there somewhere. Maybe not directly, but I know I'm in there.
That
sounds like fun.
I'm tempted to go get all of these myself.
Our guest has
been Bob Apthorpe, Senior
Nuclear Engineer. Thank you so
much for being with us. Oh, thank you for
having me. I love it.
Thank you also to Alan for suggesting Bob and even reminding us both. He also suggested some questions, so we really appreciate it and hope you got most of them
answered. Although we skipped a large section, so maybe someday we'll have to have you back to talk
about testing and standards and whatnot. Thank you also to Christopher for producing and co-hosting. And of course, thank you for listening.
You know we have a blog, embedded.fm/blog.
Did you know that Andrei has started a new series about the STM F-something Discovery One boards, the F4 Cortex... M-F4?
This is really going badly.
I should have planned ahead on this part.
But it is a good series.
It is far more organized than I am today.
And Chris Svec will be returning with his MSP430 series
as soon as he gets done, I don't know,
vacationing or some nonsense.
We also have a YouTube channel for these podcasts
so you can stare at our wonderful logo
while you listen to the audio at 2x speed. I don't know. Somebody wants it. People are listening. I
don't know. The contact link on embedded.fm will direct you to the YouTube channel. It will let
you email us, or it will let you sign up for the newsletter, which has been marked as spam for the
last two weeks, but it's finally fixed and should go up on, I don't know, Tuesday or something. Now, a final thought. Let's go
with Mark Lynas, from The God Species, on the Chernobyl disaster.
At 1:24 a.m. on 26 April 1986, Chernobyl's Unit 4 reactor exploded
after staff disabled safety systems
and performed an ill-advised experiment
to check, ironically enough, the reactor's safety.
Embedded FM is an independently produced radio show
that focuses on the many aspects of engineering.
It is a production of Logical Elegance, an embedded software consulting company in California. Thank you.