Embedded - 371: All Martian Things Considered
Episode Date: April 29, 2021
Doug Ellison (@doug_ellison), Engineering Camera Team Lead at NASA's JPL and Martian photographer, spoke with us about low-power systems, cameras, clouds, and dust devils on Mars. The best paper for learning more, from NASA's JPL, is "The Mars Science Laboratory Engineering Cameras." See also the Mars rovers wiki.
Transcript
Welcome to Embedded. I'm Alicia White, alongside Christopher White.
We're on Earth, and this week we are talking to the Curiosity rover from Mars.
I don't think that's right.
Well, the time lag is going to make it tedious.
So let's talk to Doug Ellison, the Mars Science Laboratory Engineering Camera Team Lead from NASA's Jet Propulsion Laboratory, while we wait to see if Curiosity is on the line and ready to chat.
Curiosity will be here in about five minutes' time.
Hey, Doug, how are you?
I am very well. How are you two?
Good.
Could you tell us about yourself as if we saw you on a panel about Martian things?
All Martian things. All Martian things considered. So I am the Mars Science Lab
engineering camera team lead, which means I get to take pictures on Mars with the Curiosity Mars
rover and also look after the team of people who do that. We operate our engineering cameras. They are the cameras we use to have a look around,
check the lay of the land, see where we want to go and point our other cameras.
They are how we don't get lost when we're trundling around Gale Crater with the Curiosity Mars rover.
We do a lightning round, where we ask you short questions and we want short answers. And if we're behaving ourselves, we won't ask you for all the technical details. Are you ready?
I am very ready.
Let's go. Marvin the Martian or Martian Manhunter?
Marvin the Martian.
Mars or Europa?
Why not both?
Favorite fictional robot?
WALL-E.
Is faster-than-light travel possible?
Unlikely.
Color or black and white?
Black and white.
Which is harder, designing waterproof things for terrestrial deep-sea exploration or regolith-proof things for Mars?
Regolith-proof things for Mars.
Mars dirt wants to ruin a lot of stuff very quickly.
Do you like to complete one project or start a dozen?
I like to have a dozen irons in the fire, all dedicated to one big project.
Do you have a tip everyone should know?
Try and surround yourself with people who are cleverer than you are. It's never a bad idea.
I totally agree with that.
Okay, this was going to be a lightning round question, but since you said what you said about the regolith, I think maybe I want more detail. What is harder to deal with on Mars for the electronics? The ESD, the radiation, the cold, the dust, or something else?
Yes, it's all of those.
It starts even on Earth.
You've got the tangible shake, rattle, and roll of a rocket launch.
Go and watch the footage from onboard old space shuttle rocket launches,
and those astronauts are getting rattled around inside that. You can see their heads shaking.
There are huge G-forces to survive. And so all of the electronic boards in and of themselves have
to survive that ride into space in the first place. And then you've got a nine-month cruise
to Mars in deep space. And so you're outside any magnetosphere,
you've got cosmic rays hitting you every now and again.
Then you've got the landing at the other end.
That's typically not as harsh as the launch was,
but the g-forces can peak pretty high.
And then finally, where your electronics were designed
to do their job in the first place, you're on the surface.
And now you've got the daily temperature swings,
which can go... The coldest we're allowed to use our cameras at is minus 55 Celsius, and we will heat them up regularly in the morning to get them that warm. You know, Mars is an incredibly cold place, and so we spend a lot of energy just making stuff hot enough to not break. The radiation is not too bad. We don't tend to see a lot of our pictures getting damaged that way. We do see occasional cosmic ray hits in some of our pictures and things like that. But for us, the thing that really bites us is the freezing cold temperature. It's caught us once or twice, where our cameras have not been warm enough first thing in the morning when taking some pictures, and the rover said, no, I tried heating them up, not warm enough, no pictures for you, try again tomorrow.
And so I'd say it's the cold.
Listening to you talk about the vibration impacts brings to mind, I've used telescopes in the past for amateur astronomy, and they're very sensitive just to being shocked. You know, you have to recollimate things if you bump it the wrong way or look at it the wrong way. It seems like the optics would be really sensitive to being jostled.
Is that a major problem that you have to do something special for?
Not really the optics themselves, but one thing we have had to take a look at on a couple of occasions is exactly where our stereo pairs of cameras are in relation to one another. We do a thing called a thermal characterization stare test, where we will stare at the same patch of ground from the earliest time we can in the morning to the latest time we can in the afternoon, because the bracket upon which all our cameras are bolted can move; things can creak as the temperatures change during the day. And if those cameras aren't exactly in the same position relative to one another, then the stereo data we generate from those cameras can change. And we actually saw quite significant changes. And so we've had to develop not just camera models for the cameras, but temperature-dependent camera models that change ever so slightly as the temperatures go from, you know, minus 50 to minus 20. I think the hottest one is like plus 10 degrees Celsius. Just because a chunk of metal getting exposed to 100 degrees C of temperature range is going to change. It's going to move, and our cameras move with it, which makes things pretty tricky.
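To make that concrete, here is a minimal sketch of a temperature-dependent camera model, assuming stereo parameters were calibrated at a few temperatures during a stare test and are interpolated in between. Every parameter name and number below is invented for illustration; the actual flight camera models and calibration values are not part of this conversation.

```python
# Minimal sketch: calibrate the stereo geometry at a few temperatures,
# then interpolate a camera model for the current temperature.
import numpy as np

# Hypothetical calibration points from a thermal characterization stare:
# temperature (deg C) -> stereo baseline (m) and toe-in angle (radians).
CAL_TEMPS_C = np.array([-50.0, -20.0, 10.0])
CAL_BASELINE_M = np.array([0.42376, 0.42391, 0.42410])
CAL_TOE_IN_RAD = np.array([0.00121, 0.00104, 0.00089])

def camera_model_at(temp_c: float) -> dict:
    """Interpolate stereo parameters for the camera's current temperature."""
    return {
        "baseline_m": float(np.interp(temp_c, CAL_TEMPS_C, CAL_BASELINE_M)),
        "toe_in_rad": float(np.interp(temp_c, CAL_TEMPS_C, CAL_TOE_IN_RAD)),
    }

print(camera_model_at(-35.0))  # model for a chilly mid-morning image
```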
Because the cameras that you've worked on are the engineering cameras
that are on the top of the mast, right?
Yeah, so we've got two kinds of cameras.
We've got navigation cameras that are up on top of the mast,
and then we have hazard cameras, front hazard cameras and rear hazard cameras.
Think of them as like the backup camera on a car.
We've got a set out the front and a set out the back.
And of all those three kinds of cameras, nav cams, front haz cams, and rear haz cams, we have stereo pairs of each.
And of those stereo pairs, we have one complete set tied to one of our flight computers and one complete set tied to our spare flight computer.
So in total, we've actually got four cameras up on the mast, four at the front and four at the back.
And then up on top of the mast, we've also got our friends from the science team.
So we have the color mast cam cameras and the chem cam laser spectrometer as
well. So you have your backup computer, and you have some cameras going to your backup computer.
They don't all go to everything, right? Like only half the cameras at a time are useful.
Yeah, that's right. So we landed using, imaginatively enough, the A-side computer.
About 200 days in, we had some trouble with it, so we swapped to the B-side computer.
And in doing so, you swap cameras with the engineering cameras. You swap from the A cameras
to the B cameras. If you look at one of our selfies, which are actually taken with our microscope, called MAHLI, on the end of our robotic arm, you'll actually see two pairs of engineering cameras kind of hanging off the bottom of that camera mast.
The top pair are on the A computer, the bottom pair are on the B computer. And so we've been
on the B computer for the majority of the mission. And so the A side engineering cameras don't get
an awful lot of love. They don't get used very often. And it really is just like a hard switch
over one or the other and only done a couple times per mission.
Yeah, it's done only when you absolutely need to.
So we had some A-side computer flash memory issues.
We swapped over to the B-side.
And then we had basically a kind of a file system problem with our B-side computer a couple of years ago. So temporarily we swapped back to the A side again. But when we did, we discovered the A side was really not in great shape, and probably only useful as a lifeboat in the future. So on the A side, we did some diagnostics, and then we swapped back to the B side. And we've now since done a nice flight software update, which basically turns our A-side computer into a lifeboat, should we need to go and live on there for a while if something happens to the B side in the future.
So Percival, no, Perseverance. Percy, sorry. Percy is the rover that just landed recently, and it's got the helicopter, Ingenuity. And Curiosity is a very similar sort of rover, right?
Very, very similar.
I mean, if you didn't know them well enough
and you saw them parked side by side,
you could be forgiven for thinking they're twins.
They are very, very similar.
Their flight computers are the same.
The landing system is largely the same.
Their mobility system is largely the same.
You know, one's just the slightly younger sibling, and so it comes with a few new toys.
And so, at long last, this kind of tradition of one-megapixel black-and-white engineering cameras
has finally been superseded with the enhanced engineering cameras on Perseverance.
And so, as you can imagine, the team operating those is the team that used to operate the engineering cameras on Curiosity.
I'm holding the fort back with Curiosity, but we like to mock each other. So it's the kind of bougie, Technicolor, Jezero Crater crowd versus the more Ansel Adams style, one black-and-white megapixel at a time, down at Gale Crater crowd.
But it's actually fascinating to see.
It's about time we got an upgrade.
But we still do great stuff with our one-megapixel cameras.
We got to enjoy a twilight cloud season with Curiosity this year,
whereas Perseverance wasn't quite ready,
so they didn't get to enjoy the twilight clouds like we have.
So there's some friendly rivalry, but we're doing great stuff on both sides of the planet.
The cloud pictures were just incredible. But the animations, it just blew my mind more than almost anything else. I mean, everything blows my mind from Mars, but for some reason, seeing those high cirrus clouds was like, oh, that's very Earth-like. It's a planet. Oh, okay, it's a planet. It's a real planet.
Yeah. Yeah, it's easy to forget it has an atmosphere. I mean, it's not a great atmosphere.
Right.
Frankly, it sucks. But it has an atmosphere. It has a water cycle. It has a carbon dioxide cycle.
And so at one particular time of year, and it's kind of late fall going into early winter, we get these twilight clouds.
And what's amazing is how quickly they show up, how spectacular they are for a month or two, and then they're just gone. They vanish. And so while they're around, we try and put into the plan these little blocks of, let's go and take a look at the twilight clouds, you might learn something. And then, you know, one day we put them in and just nothing shows up. Just clear, empty skies. Just gone. And so for this Martian year, at least, twilight cloud season is over. But we had more luck this year, I think, than we have in any previous year. We took the time to take the lessons learned from last Martian year and redesigned some of our observations: made them more efficient, made them quicker, used them more often. And we got some spectacular results. So we're all like, okay, this is the best cloud season we've had yet.
And when you say last Martian winter, that's like two years ago for Earth.
Two Earth years ago.
Roughly speaking, a Mars year is two Earth years. You get about 360-ish Martian days in one Earth year, and something like 660 Mars days in a Mars year.
It's about twice as long.
So when we talk about seasons, we're talking about, you know,
seasons as we experience them on Earth, but they all last twice as long.
And Curiosity landed on Mars in 2012.
2012, yes. It was August 2012.
And the cameras that it has, because I'm trying to focus on that instead of everything, since I just want to say, Mars! Everything! Tell me everything! But focusing on the cameras: those had a lot in common with the previous ones, on Spirit and Opportunity. Is that right?
Yeah. And in turn, the cameras on Spirit and Opportunity, largely speaking, were inherited
from what would have been the 2001 lander that was cancelled in the wake of the twin failures in
1999 of Mars Climate Orbiter and Mars Polar Lander. There was going to be a 2001 lander that had an instrument called Pancam that was going to have these one-megapixel cameras
that then would have color filters in front of them
to build up color mosaics using red, green, and blue filters
and a variety of other filters to do kind of cool science stuff.
And when that mission was canceled, they went, well, what's next?
And those Pancam sensors were inherited by what became Pancam on Spirit and Opportunity as part of their payload. And for simplicity, for ease of use, for commonality, for ease of testing and design and stuff like that, and frankly to save some money, they went, well, let's use the same sensor and the same electronics on all of the cameras. So the high-resolution color science cameras, the navigation cameras, the hazard cameras, the microscope, they all use the same one-megapixel black-and-white sensor.
And when Curiosity came along, when it was being designed, its design really was starting not long after Spirit and Opportunity hit the ground. And so they went, well, what have we got that
works? Ah, these engineering cameras will be fine. They work with Spirit and Opportunity, whatever.
Sure, we'll have some new shiny color science cameras up on top, but engineering cameras, you get the same stuff.
Now, we did some mild upgrades.
There's some extra heaters in there to make them a little easier to heat up in the morning when we need to.
But they're largely speaking the same.
And so, weirdly, I actually certified to operate the cameras on Curiosity first.
Then I went back to Opportunity and did the last kind of 18 months of Opportunity's mission
operating those engineering cameras because they're so very, very similar. And when it came to
Perseverance, the cameras got a big upgrade for a couple of reasons. One is we can get more data
home from Mars on a given day these days. And the other is, we'd kind of run out of the one-megapixel black-and-white stuff. But I am delighted there is one of the retro MER/MSL fan club black-and-white sensors left, and it's on an upward-facing sky camera on the deck of Perseverance. So we're not dead yet. We're still spitting out one-megapixel images when we can.
Why has the bandwidth gotten so much better?
And how much bandwidth are we talking?
So, yeah, it's easy to think in terms of data rates, but actually it's more data volume.
So we very rarely, very, very rarely, in fact, send data back from our rovers direct to Earth.
We have a high gain antenna that can send data straight back to the Earth,
but it takes an awful lot of power, and its data rate isn't very high.
So what we do is we use that antenna to send the commands to the rover,
but to get the data back home, we use the fleet of Mars orbiters.
Now, when Spirit and Opportunity landed, that was just two old orbiters.
It was Mars Global Surveyor and Mars Odyssey. When Curiosity landed in 2012, we had the Mars Reconnaissance Orbiter, a much bigger,
more powerful spacecraft. And since then, we've had more orbiters arrive. We've had the NASA
MAVEN mission. We've had the European ExoMars TGO mission arrive as well. And they've all been
equipped with these UHF relay radio sets on board to receive
signals from the rovers and then send them back down to earth so with spirit and opportunity
we were lucky to get 50 to 100 megabits. Right, 12 and a half megabytes per day. 100 megabits would have been a huge day for Opportunity. With Curiosity, when we
landed, we had Mars Odyssey, the old one still going from 2001, hence the name, and we had the
Mars Reconnaissance Orbiter, this newer, shinier, better radio, and could get maybe 300 or 400
megabits once or twice a day. And so our daily data volume has gone up from maybe 50 to 100 to maybe 300 to 600, 700, maybe something like that.
Three or four times more.
But you have to share it between the different rovers.
Yeah, you have to share it with all the instruments on board our rover.
We have to share it with the color science cameras and the instruments inside and stuff like that.
And engineering, housekeeping data, things like that. By the time Perseverance landed,
you've also got the European ExoMars TGO spacecraft.
You've also got MAVEN.
And they have, again, newer, shinier, better radios.
And so now, I mean, we're getting more data home
from Curiosity every day now than we have ever before
because we have more orbiters that can do relay.
So we have more relay passes per day, which means we can get more data home. And so nowadays we can get a thousand megabits a day, which is 125 megabytes a day, roughly. That's kind of average, and it will come in lumps, in these 12-minute passes as the orbiters fly overhead: anything between 50 and sometimes, you know, a thousand megabits in a single pass when we're lucky. And so the thing we've often struggled with, which is the amount of data we can get home from Curiosity, isn't so much of an issue anymore. We don't tend to worry about data volume on the day-to-day anymore.
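As a quick sanity check on those numbers, the unit conversion goes like this:

```python
# Back-of-the-envelope downlink arithmetic: volumes are quoted in megabits,
# but it's easier to think in megabytes.
MBITS_PER_MBYTE = 8

def mbits_to_mbytes(mbits: float) -> float:
    return mbits / MBITS_PER_MBYTE

# Spirit/Opportunity era: 50-100 megabits on a good day.
print(mbits_to_mbytes(100))    # 12.5 MB: "a huge day for Opportunity"
# Curiosity today, summed over several ~12-minute relay passes:
print(mbits_to_mbytes(1000))   # 125.0 MB per day, roughly average
```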
Well, that's kind of cool.
You said you operate the cameras.
What does that mean?
I assume you don't have a very long stick that you push the button for.
It's a gigantic selfie stick.
It's 120 million miles long, right?
And it's got a button at one end, a camera at the other.
It's easy.
So we'd love to have a joystick and a button, right?
That would be great.
But because of the speed of light, at their very closest it's about four minutes for a signal to get from Earth to Mars, and then four minutes for it to get back, right? And when they're on opposite sides of the solar system, it's more like 20-plus minutes each way. So we don't operate them in real time. We basically send the rover a to-do list when it wakes up in the morning, and then we get the data home from it doing those things in these lumps, through these relay passes through the orbiters. And then we take the last one of those passes that happens before we go in to do the next planning cycle, and we go, okay, that's where we've got to. What are we going to do next? And on the engineering camera side, what we're looking at is, we have some science observations we do.
We do things like looking for clouds, looking for dust devils.
We also do documentation of whatever we're doing with the robotic arm on the rover.
So every time we get our microscope out or our spectrometer on the end of the arm out,
each time we put that on a new target, we'll document it with the engineering cameras.
And then whenever we drive, we take pictures as we're going, because the rover actually uses those to measure its own progress versus how far it thinks it's got.
Kind of like a very, very slow, gigantic optical mouse, basically.
It's called visual odometry. And then, at the end of every drive, in a series of different chunks, we take a complete 360 of where we've got to, so that the next shift we can go,
okay, this is where we are now.
Where do we want to drive next?
What do we want to reach out and touch with the robotic arm?
What do you want to go and look at with the high-resolution color science cameras
up on top of the mast?
And taking those pictures is basically little snippets of code.
They are human-readable.
They're more like a scripting language.
It's actually been abstracted by the flight software,
so it's pretty easy to write these commands to tell the rover to do stuff.
But it's take a picture.
It's this important.
Use these cameras.
Point it in this direction.
Crop it or shrink it.
Then compress it, and we're done.
And it might be just one image.
It might be a mosaic of five or seven or 12 images.
And a planning cycle is basically ingesting all the requirements for the engineering cameras for that given plan, turning those into commands, making sure they're good, and submitting them to the system. They all get modeled together in one big, giant simulation. If that simulation ends the day right side up, we assume it's good, and we send it to the robot.
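The actual sequencing language isn't shown here, but going by that description (take a picture, it's this important, use these cameras, point it, crop or shrink it, compress it), a hypothetical sketch of what one of those human-readable image requests might capture could look like this. Every field name below is invented for illustration:

```python
# Hypothetical sketch of an engineering-camera image request. The real
# command dialect is abstracted by the flight software and not shown here.
from dataclasses import dataclass

@dataclass
class ImageRequest:
    camera: str            # e.g. "NAVCAM_LEFT_B" (illustrative name)
    priority: int          # downlink priority: lower = sent home sooner
    azimuth_deg: float     # where to point the mast
    elevation_deg: float
    downsample: int        # 1 = full frame, 2 = half-size, etc.
    bits_per_pixel: float  # compression budget for this image

# Eight pointings, 45 degrees apart, to build a complete 360.
plan = [
    ImageRequest("NAVCAM_LEFT_B", priority=2, azimuth_deg=a,
                 elevation_deg=12.0, downsample=1, bits_per_pixel=3.0)
    for a in range(0, 360, 45)
]
print(len(plan), "images requested")  # 8 images requested
```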
You said one of the commands is to compress it.
If you're doing imagery for science,
I imagine you want the absolute raw pixel data.
There's lossless compression, I assume you apply,
but do you ever say,
give me a 50% JPEG real quick?
So we have two modes.
The color science cameras actually have a percentage of JPEG compression, and if they want to do uncompressed, the value they use is 101. But with the engineering cameras, we have two modes of compression. One is called LOCO. People think it stands for lossless compression; it's actually low-complexity compression. It's a very, very quick algorithm. Typically speaking, it will return lossless data, but shrunk by about 20, 30, 40 percent. But we don't know exactly how well it's going to compress before we take the picture, right? We don't know exactly what's going to be in it. What is the ground going to look like just over that hill? Is it going to be full of lots of high-contrast,
high-frequency detail
that's not going to compress great?
Or is it going to be fairly bland
and it'll compress really well?
You just don't know in advance.
So we guesstimate something like
eight bits per pixel, roughly.
But we don't do that very often, actually.
We tend to use ICER.
It's a different compression algorithm.
It's distantly related to JPEG 2000.
And we assign the number of bits per pixel.
We say, okay, typically we'll use three to four bits per pixel.
And that's, to the human eye, indiscernible from the lossless compression.
You can't tell they're more heavily compressed.
But we get a lot more pictures home by doing it that way.
And they generate good 3D data on the ground. They're good enough for doing things like dust devil surveys and cloud surveys and things like that. And we will literally assign between two
and four bits per pixel for most of the images we take. And then the flight software will shrink it
enough to fit under that limit. But if for some reason it can actually return it losslessly,
spending less bits than that, it'll do it. But that very, very rarely happens. And so
two to four bits per pixel. And so because it's a one-megapixel camera, we can go, okay, great: a one-megapixel camera, four bits per pixel, in stereo. Well, now you just spent eight megabits of data. That's one megabyte for a stereo pair.
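That last bit of arithmetic, as code:

```python
# The bits-per-pixel budget from the conversation: a 1-megapixel image at
# 4 bits/pixel, in stereo, costs 8 megabits, which is 1 megabyte.
def stereo_pair_cost_mbytes(megapixels: float, bits_per_pixel: float) -> float:
    megabits = megapixels * bits_per_pixel * 2   # two eyes
    return megabits / 8

print(stereo_pair_cost_mbytes(1.0, 4.0))  # 1.0 MB per stereo pair
print(stereo_pair_cost_mbytes(1.0, 2.0))  # 0.5 MB at the lighter setting
```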
Have you ever sent the command and then gotten back bad pictures?
Like maybe you left your finger over the...
We have.
So in the best way that the rover can leave its finger over the lens, we've done that.
So we have a robotic arm. You can think of it as our arm. And it has fingers on the end of it. You know, it has different instruments on the end.
And one of the observations we do quite frequently
is called the Dust Devil Survey.
We do eight different pointings
to get a complete 360 degree view.
And each of those pointings,
we will take three images in rapid succession.
And then the scientists can look at those three images and see if anything changes. If it changes, it's things like clouds, dust devils, and so on and so forth. And if the robotic arm has been used to do something and it's left parked out in free space somewhere, it's going to be in the way in one of those. And that's just a hit we take. You know, it happens. We also do things like, you know, we've driven, it's fairly late in the afternoon, and we've sequenced taking our post-drive imaging, and we'll get the sun in our eyes. We'll get lens flare, we'll get lens glare. We have on occasion tried to take pictures after dark, and as long as it's not too long after sunset, it's okay. But sometimes, if it's a little dark, the exposure times can kind of explode, and suddenly an image that should take 30 seconds is still trying after five minutes, and the rover goes, you're done, good night. And so we're not perfect, and we have occasionally left our finger over the lens, but we have things in place to ideally prevent us from doing that kind of stuff.
Have any of those ones that were sort of an accident turned out to be scientifically interesting, because they were off the normal path?
One of our observations we do is called our nav cam sky flats. Every month or so, about an hour before sunset, we actually take a picture of what should be an empty patch of sky directly opposite sunset.
Should be.
Should be.
And we take a lossless set of images in the middle, and then we shrink-wrap that around the outside with some shrunk images. And basically, you generate an artificial gradient from the shrunk images around the outside, you've got your big image in the middle, and the difference between them is your sky flat. It's characterizing the optical response of the entire stack, from dust on the lens through to sensor sensitivity and stuff like that. And at about the same time that the twilight clouds vanished, we did our sky flats, and our flat field was full of clouds. They're the prettiest sky flats ever. They're beautiful, and they are absolutely useless. And so we've literally said, okay, we can't do sky flats until the cloud season's finished, and so we're going to try again in a couple of weeks. We may get them at the beginning of May, I think. But yeah, we've had things like that. It's like, we wanted an empty patch of sky, and Mars just stuck some clouds in it for fun.
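For readers who haven't met flat fields before, here is a simplified sketch of how a derived flat gets used. It assumes the flat has already been built on the ground; the actual pipeline is considerably more involved than this:

```python
# Simplified flat-field correction: divide each new image by the normalized
# flat to cancel dust shadows and pixel-to-pixel sensitivity variation.
import numpy as np

def apply_flat(raw: np.ndarray, flat: np.ndarray) -> np.ndarray:
    norm = flat / flat.mean()               # normalize so correction is ~1.0
    return raw / np.clip(norm, 1e-6, None)  # avoid dividing by zero

rng = np.random.default_rng(0)
flat = 1.0 - 0.2 * rng.random((8, 8))  # toy flat: up to 20% dimming from dust
raw = 1000.0 * flat                    # a uniform sky seen through that dust
corrected = apply_flat(raw, flat)
print(np.ptp(corrected) < 1e-9)        # True: the sky is uniform again
```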
Do you do dark frames as well?
That happens as we take the pictures. We don't have a mechanical shutter. What happens is, the sensor gets flushed, we expose, we read that out, and then we read out a zero-length exposure. And so you're getting kind of the dark field and the readout smear from a zero-length exposure. That gets subtracted from the image you just took, and that's what gets sent home. We have actually sometimes done zero-length exposures to do things like take an image of the sun with our navigation cameras, and just the readout smear is enough for the science team to figure out how bright was the sun, how bright should it have been, and the difference between them is how much dust is in the atmosphere.
Oh, wow.
Yeah, it's called taking a tau.
It's atmospheric opacity, when it comes to Mars.
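The shutterless readout trick is easy to sketch. This toy version assumes the zero-length exposure captures only dark signal and readout smear:

```python
# With no mechanical shutter, the camera reads out a real exposure, then a
# zero-length exposure that contains only dark signal and readout smear,
# and subtracts the second from the first.
import numpy as np

def shutterless_image(exposure: np.ndarray, zero_length: np.ndarray) -> np.ndarray:
    """Subtract the zero-second readout from the real exposure."""
    return exposure - zero_length

# Toy frames: scene plus smear, versus smear alone.
smear = np.tile(np.linspace(0, 50, 5), (5, 1))  # readout smear pattern
scene = np.full((5, 5), 200.0)
print(shutterless_image(scene + smear, smear))  # recovers the 200-count scene
```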
You've mentioned dust and you mentioned dust devils.
Is it true that the dust devils are Martians that come and clean your lenses for you?
I wish they would.
We've tried leaving tips. Nothing. So our cameras, generally speaking, over eight years, have just been getting ever so slightly, ever so slowly dirtier. The front optics have a little coating of dust, and it's at its worst, really, when we have sunlight actually on the front of the lenses, because then the brightness of that dust is contributing to the image signal that ends up at the sensor. And we actually have little lens hoods that stick out from the front of the cameras, and if there's a shadow of that lens hood cast across the front of the camera, then half of it has illuminated dust contributing to the picture and half of it doesn't. And so you end up with this weird, bright patch on the image, which can be really tricky.
Towards the end of Opportunity's mission, when those cameras were 5,100 days old, that was a real problem if you're taking images late in the day towards the sun.
With Curiosity, it's not so bad right now.
It's something we're kind of tracking. You know, we may end up having to have guidance of, hey, if you want to take pictures that are important in terms of planning the next drive, try not to take them too late in the day, because you're going to have this issue of the front of the lens being half illuminated and half not, and getting these bright patches in the images. The good thing is, it happens to both the left and right eye together, about the same, and so it still makes good stereo data.
So it's not the end of the world.
But that's why we take those sky flats.
We actually look to see, are we accumulating big chunks of dust?
Are we losing big chunks of dust?
And we see it come and go.
But generally over time, they're just getting dustier and dustier.
And those flats are divided out at home, not locally on the rover.
So you have to send those back.
Yeah, they get sent home.
And then we have a science team member called Mark Lemmon.
He's kind of our dust devil and cloud guru,
but he's also the genius guy of making our sky flats.
And we basically build up a sky flat, and then kind of migrate over the last couple of sky flats, and kind of keep slowly improving it over time. But of course, if you take a sky flat now, it has no value whatsoever for the images you took the day you landed. And so we kind of slowly migrate the sky flats as we move forwards.
When there was that dust storm that killed the Opportunity rover a couple of years ago, it hit Curiosity too. It got murky, it got dark, you know, it was pretty horrific. And at the end of it, we saw there was a bunch of crud on our lenses, you know, more than usual. But a lot of that has actually blown away since. Kind of the worst of it splattered onto the lenses and then slowly blew away in the following days.
So this isn't, I mean, people might think, oh, you have cameras, you take pictures, and
then you download them and you look at them and you post them on the internet. There's a lot of post-processing that has to be done on Earth
to generate something that can be looked at or analyzed, right?
So the amazing thing is, and people may not realize this,
is that if we were to get a downlink pass from one of the orbiters tonight,
there are no humans in the loop when it comes to getting those images processed
and on the web for the whole wide world to see.
It's completely automated.
In fact, if we were to do a drive with the rover,
let's say we drove Saturday night, for example,
by Sunday morning, if those images were on the ground,
they'd have been processed through our pipelines,
they'd have been thrown onto our servers for us to look at,
they'd have been thrown onto the web for the entire public to look at,
and we have another script that actually generates the mosaic of the 360,
and we have it throw that on the web as well, for the public to enjoy. And in fact, that's a kind of a PNG file, so there's no lossy compression to it. There is a lot of processing involved in generating kind of radiometrically calibrated images for our science team and things like that. And then one of the things we do when we push kind of raw JPEGs for the public is we stretch them, so at least you can see something, right? We just do a basic stretch on them, so that you're not looking at something that's way too dark or way too bright. You know, you can actually see there's rocks, the sky, there's bits of rover, and stuff like that. But it's entirely hands-off. Where
the humans get involved is when we do a downlink assessment. Every time we do a planning shift, in the morning there is a downlink assessment. The team called OPGS, the Operational Product Generation System, which I jokingly refer to as Overlaying Pictures with Glue and Scissors, generates a bunch of specific products, specific mosaics, that help with planning for that day. So there's one that shows the ChemCam team where they can shoot their laser.
There's another that shows the color Mastcam team
what pictures they've already taken
and so what's left to have a look at
with the color cameras and things like that.
And then meshes get generated
to help the rover planners decide where they're going to drive next.
And that's the kind of the hands-on processing
that happens on the ground.
But pictures received on Earth, processed through the pipeline, thrown on our server, thrown out to the public, a quick mosaic made and thrown out, completely hands-off, completely automated.
With the meshes, and you said left-right, you do distance calculations. Do you do VR systems? I mean, I just want to walk through it.
So funny you should mention that. I started at JPL doing education and communications, then I moved into a group at JPL called the Ops Lab. And the Ops Lab is like a little skunkworks
embedded within JPL that does kind of emerging tech, and how can that be applied to mission operations, and things like that. And one of the things we did, about six years ago, was team up with Microsoft. And they said, hey, we've got this cool thing called the HoloLens we're developing, got something cool you could do with it? And we went, we could use it to walk on Mars. And so a project was born that ended up being called OnSight. And OnSight lets our science team put on a HoloLens and walk into our 3D data. And the real genius behind OnSight is actually merging lots of different data sets to make the best possible 3D mesh. And so it starts with our kind of Google Earth-ish quality orbital data. It then overlays the 3D data from the navigation cameras, our engineering cameras, on top of that. And then, when we have color data, it overlays that on top of that as well. And so most of the time, it's kind of a little black-and-white patch of terrain. But then sometimes, if we hang around somewhere for a while, we'll have accumulated enough color images for it to be a complete color 360.
Now, that experience is one the science team still use.
There's science team members with HoloLens in their offices, and they use it to walk on Mars,
wrapped in bunny rabbit ears, right? Walk on Mars. But we also made a spinoff of that that was called
Destination Mars, where we actually took three of our best terrain models. We holographically
captured actual in real life Buzz Aldrin to narrate these three terrain models.
And one of our rover drivers, Erisa Hines, we had her captured to talk about the rover and what we
do and where we go and stuff like that. And that was turned into an experience that was at the
Kennedy Space Center Visitor Complex for about three months back in late 2016. And that was
really, really fantastic. And then a spinoff of that is now available to the public. So you can go to mars.nasa.gov, and in a menu somewhere, you'll find a surface experience where you can look at those same terrain models in your browser. And then Google took that same data, and they turned it into kind of a WebVR thing that works in, you know, more affordable headsets as well. So yeah, we take that 3D data, and sometimes we walk into it.
What camera capabilities do you wish Curiosity or Perseverance had that they don't?
So, assuming I have carte blanche for data volume, I'd like to take more videos.
It would be nice to take video as we're driving. I mean, people say, I want live 4K video from Mars, oh my god, my cell phone can take 12-megapixel pictures, why can't NASA on Mars, blah blah blah. And the problem is, 99% of the time, if you had a RED 8K camera parked on a Mars rover, filming the Martian terrain, and then you sent all that data home, you would see nothing happen, right? There's nothing going on out there, nine times out of ten. The only thing that really happens is, we drive around, we use the robotic arm, we take pictures with the mast. And so it'd be nice to
see some of that as video. It would be nice to be able to take pictures more quickly with our engineering cameras.
If we're really lucky and we tweak a few dials
and press a few buttons,
we can take pictures about once every 15 seconds or so
with our nav cams and our haz cams.
It'd be nice to be able to do it a little bit quicker than that,
particularly with things like dust devil searches.
But in many respects, Perseverance has it very, very good
because they have 20 megapixel color engineering cameras.
But because of the fact that the rest of the avionics
are heritage from Curiosity,
they take those 20 megapixel color pictures,
but then they have to read them out one megapixel at a time
and save them to the flash memory.
And they get sent home one megapixel at a time
because they were
designed with our old cameras in mind, not these new, shiny cameras. And so you can go and see those pictures online as well. You'll sometimes see the whole thing as a one-megapixel color picture, and sometimes you'll see it as 16 tiles of kind of 1280-by-720 images, but you've got to try and assemble them yourself to get the full, glorious image. But a little higher frame rate, maybe a few more videos, would be cool. But really, some more data volume, and honestly, just more time in the day, more power to use the cameras we've got more often, would be lovely.
The rover can do a lot with its own images. It doesn't need the humans to tell it what to do for a lot of things. How much can it do autonomously?
So we're very careful about what we let the rover do on its own, because, let's be honest, it's a nuclear-powered rover which shoots lasers out of its mast, and so we don't want it to go rogue. We've all seen that movie. It'd be terrifying. And so the driving is the thing where the imagery gets used autonomously most often.
So we can drive in different ways.
We can say, hey, turn all six drive motors on for the next 37 meters of revolutions and then call it a day.
We call that blind driving.
It's literally just hands over your eyes, drive.
But then we can do things like visual odometry, where the rover takes pictures
as it's going, and it analyzes those pictures, it looks for features, it matches those features from
one set of images to the next, and establishes exactly where it has got to compared to where
it thought it was, and then drives in little steps, like one meter steps, drive a meter,
take some pictures, see what's moved, drive another meter, and so on and so forth. And doing that,
we can drive actually very accurately. We can say, hey, we're going to drive to a point that's 37 meters east and 14 meters south of where we are right now. And we can do
that drive to within, if we're lucky, a couple of centimeters. It's pretty remarkable. And then
the ultimate version of that is where we do auto-nav, where we say to the rover, hey, go as
far as you can over there. You've got, you know, an hour and 45 minutes. Get to it.
And we'll typically do a bunch of kind of targeted visual odometry driving
to begin with, but then we'll set it loose,
and it will start taking pictures of the terrain in front of it on its own,
figure out which parts of it are safe, which parts of it aren't,
and then it will just drive along the safe parts that get it towards where it wants to get to, kind of like a mouse down a maze.
And again, it's doing little 3D meshes of the terrain in front of it,
checking for hazards.
And we get all those pictures home eventually as well.
And kind of little shrunk versions, we shrink them down to 256 by 256.
Otherwise, the flight computer would take weeks to try and process one stereo pair.
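A toy version of the visual odometry idea, in the spirit of the gigantic-optical-mouse analogy: estimate how far the terrain shifted between two frames. The real system matches features in stereo and solves for full 3D motion; this sketch only recovers a 2D image shift via phase correlation:

```python
# Toy visual odometry: estimate the pixel shift between two frames with
# phase correlation, the way an optical mouse tracks motion.
import numpy as np

def estimate_shift(prev: np.ndarray, curr: np.ndarray) -> tuple[int, int]:
    cross = np.fft.fft2(prev) * np.conj(np.fft.fft2(curr))
    corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = prev.shape
    # Wrap shifts larger than half the frame back to negative values.
    return (int(dy - h) if dy > h // 2 else int(dy),
            int(dx - w) if dx > w // 2 else int(dx))

rng = np.random.default_rng(1)
terrain = rng.random((64, 64))
moved = np.roll(terrain, shift=(3, -5), axis=(0, 1))   # "drive" a little
print(estimate_shift(terrain, moved))  # (-3, 5): roll `moved` by this to re-align
```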
And the other thing we let the rover do on its own is, often on the second day of a two-day plan, we'll have driven on the first day, and so we won't know where we are for the second day. So we'll turn on a thing called AEGIS, A-E-G-I-S. You can look up loads of papers about it by my colleague Raymond Francis. And AEGIS basically lets the rover take a pair of images off to its kind of forward and right, kind of off its right shoulder, of whatever the ground is right in front of it, off to the right. Take a picture, look at that picture, and analyze it for jaggedy white rocks, or rounded, dark-colored rocks, or some variation in between. Identify the middle of one of those rocks, and then shoot it with the laser beam, and actually do what's called LIBS, laser-induced breakdown spectroscopy. It's remote elemental composition using a laser spectrometer. And AEGIS will take pictures, shoot stuff, and then save all that data and send it home to us. So it kind of keeps a human out of the loop, basically. And we've done a comparison of, okay, if we had had a human in the loop, what would they have shot in that picture? And it overlaps with what AEGIS has done most of the time. And so AEGIS is like a little onboard geological gnome that goes, okay, let's find a cool rock to shoot. I'm going to shoot that one. And shoots it. Meanwhile, we're planning what's going to happen the day after that, and we're not even aware of what the rover is actually doing. It's fantastic.
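A toy flavor of that autonomous targeting step, assuming nothing about the real AEGIS internals (the published papers by Raymond Francis and colleagues describe the actual classifiers): smooth the image so a single hot pixel can't win, then aim at the strongest bright patch:

```python
# Toy target picker: box-filter the image, then return the brightest spot.
import numpy as np

def pick_target(image: np.ndarray, kernel: int = 5) -> tuple[int, int]:
    # Box-filter smoothing so one hot pixel (a cosmic ray hit) can't win.
    pad = kernel // 2
    padded = np.pad(image, pad, mode="edge")
    smooth = np.zeros_like(image, dtype=float)
    for dy in range(kernel):
        for dx in range(kernel):
            smooth += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    smooth /= kernel ** 2
    y, x = np.unravel_index(np.argmax(smooth), smooth.shape)
    return int(y), int(x)

scene = np.zeros((100, 100))
scene[40:46, 60:66] = 1.0    # a bright, jaggedy white rock
print(pick_target(scene))    # around (42, 62): aim the laser there
```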
But a lot of the time for the bigger decisions, you really do want humans in the loop.
We tend to make the rover quite cowardly. We set pretty conservative limits on pitch and roll. We set conservative limits on suspension. We set conservative limits on wheel slip and things like that, because we'd rather avoid the rover getting itself into trouble that we didn't have to go and get it out of.
I have a dumb question.
As you're navigating things,
there's no GPS on Mars.
So, like you talked about merging the data sets from the orbiter and then from the stereo cams to make the VR stuff.
How do you locate stuff to within a high degree of precision without GPS?
So the rover's ability to know its own progress, based on, okay, starting here, go over there, is really, really good, by combining this kind of visual odometry as it goes with an inertial measurement unit on board, a gyro, basically. And combining those two things, it's really good at measuring its own rate of progress. And often, when it's done a 50, 60, 70-meter drive, at the end of it, it's within a couple of centimeters of where we told it to go.
We get all that data home, and then we do kind of a bundle adjustment.
We basically compare the images we've taken to the orbital imagery
and go, okay, the rover thinks it's here.
Actually, it's a meter and a half off to one side,
so let's tweak that position,
and we will then reset where the rover thinks it is by doing something called a SAPP sun update.
We will literally take a picture of the sun, use it like an old-fashioned mariner checking the sun for his lat and long.
We'll do the same thing to recalibrate our pitch roll and yaw that then resets our IMU,
and we do that every 500 meters of driving or so.
And then it's humans in the loop on the ground
who take those kind of each leg of the driving
and the resets we do when we do those SAPP sun updates
every 500 meters or so
and turn that into a geo-referenced route map
of where the rover has gone.
So Fred Calef, and we call him Team LOCO, the localization team,
they do an amazing job of merging
all the different data sets,
kind of ground-truthing, you know,
against orbital imagery, where are we,
where do we think we are,
into a giant database called Places
that gets published to the planetary data system
so anyone can pull that data.
And we end up with kind of centimeter-to-millimeter accuracy of, okay, we did a microscopic image of that particular rock 475 sols ago, and we can give you the lat and long of exactly where that rock is, to like six decimal places, because of it. It's doing it the old-fashioned way. We don't have navigation satellites, but we can still correlate what we know from orbit with what we've seen with the rover, and make really, really accurate maps of where we're going, where we've been, and where all our data is.
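A sketch of the attitude-reset idea behind a sun update: compare where the rover measured the sun against where ephemeris predicts it, and use the difference to correct the accumulated heading error. The numbers here are made up:

```python
# Celestial navigation, rover edition: the azimuth difference between the
# measured and predicted sun position gives the accumulated yaw error.
def yaw_correction(measured_sun_az: float, predicted_sun_az: float) -> float:
    """Yaw error in degrees, wrapped to [-180, 180)."""
    return (predicted_sun_az - measured_sun_az + 180.0) % 360.0 - 180.0

# Rover thinks the sun is at 251.2 deg azimuth; ephemeris says 249.8.
print(yaw_correction(251.2, 249.8))  # -1.4 deg: correct our heading estimate
```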
Do you ever get to take pictures just because you want to know what's over there, or you
think that will be a nice composition?
We'd love to get a little more Ansel Adams than we do, that's for sure. Rarely do we have time in a given plan, and time really means power. You know, the longer we stay awake, the more power we're using. Or data volume, to take stuff just because it's pretty.
I will happily confess that I will sometimes coerce the geology team into requesting what's known as an upper tier. We're parked next to a mountain that's taller than Mount Whitney, and so, if you're driving up 395 in Southern California, you need to crane your neck upwards to see the top of Mount Whitney, right? It's above you. And so we sometimes need to take pictures above our normal 360-degree panorama to get Mount Sharp, which is this mountain right next to us, and kind of the full horizon.
And so I'll say to the geology team,
hey, do you guys think you need an upper tier
to target some remote stuff in the next plan?
And they'll go, well, I mean, not really.
I'm like, look, today we've got loads of data volume.
We're not tight for time.
How about we just do a five by one upper tier?
I'll shrink it so we don't take more data than we really need.
And they're like, oh, okay then.
So, you know, sometimes I'll coerce them just a little bit.
And it depends who's on from the geology team that day.
It depends which of the engineering camera team we have on.
But I normally get my way.
I'll try and find an excuse for it.
But sometimes you go, you know what? We're tight up against the communications pass, or we're short on power, and it's just not worth pushing for.
The other thing we started doing, which I came up with last year, actually,
is a thing called SPENDI.
It's a NASA thing.
It has to have an acronym.
That's just the rules.
I didn't come up with the acronym, but I did come up with the idea.
It's called Shunt Prevention ENV NavCam Drop-In, SPENDI.
There are situations, and it doesn't happen often, where we're actually going to be full. We have our RTG, the radioisotope thermoelectric generator, in the back, and then we have a chemical battery inside, with which we drive the rover, and we kind of trickle-charge that with the RTG. And batteries like to be between, you know, 40 and 80 percent charged. They don't like being fully charged. They don't like being flat. And if you can keep them in that 40 to 80 percent range, they're nice and happy. If, in a given plan, it looks like we're going to be fully charged and we're going to be shunting power, we will drop in a SPENDI. And SPENDI is like this omnibus edition of all of our regular environmental science observations. And so we will get cloud movies, dust devil movies scattered around the whole horizon, to kind of spend 20 minutes just taking pictures and smelling the roses. And it's kind of, if we're going to stay awake just to avoid fully charging the battery, why not have a look around while we're doing it? And so, the occasional extra bit of imaging to see Mount Sharp, to see the top of the hills nearby, and then our occasional, we've got a full battery, let's have a look around anyway.
But beyond that, it's pretty rare for us to have much
in the way of artistic discretion, unfortunately.
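The SPENDI trigger, as described, reduces to a very small decision rule. The threshold below is illustrative, not the real flight value:

```python
# Sketch of the SPENDI drop-in logic: if the plan would leave the battery
# at the top of its happy 40-80% band and shunting power, spend the excess
# on the omnibus environmental-science block instead of sleeping.
SOC_HIGH = 0.80  # illustrative threshold, not a flight number

def should_drop_in_spendi(predicted_state_of_charge: float) -> bool:
    return predicted_state_of_charge >= SOC_HIGH

if should_drop_in_spendi(0.83):
    print("Add SPENDI: ~20 minutes of cloud and dust devil movies")
```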
The cameras can't take nearly as much power
as moving the rover.
The thing that actually takes the most power
is being awake.
So our power supply, the little RTG in the back,
generates something like 90 watts or so.
So if you do the math,
we get about two and a half kilowatt hours per day of energy.
Half that is kind of lost to survival heating, background activities.
And so when we're awake, we're burning more than the RTG generates.
And so if we were to wake the rover up at nine o'clock in the morning and leave it awake all day long, by nine o'clock at night, the battery would be flat and it would brown out and die.
And so we have to take catnaps. We have to say, okay, the rover's going to wake up, we're going to use the robotic arm, we'll take a brief nap, we'll then do some science, we'll go driving, we'll do a communications pass. At that point, it's like six o'clock at night, and we'll go back to sleep again. And then the rover will nap through the night, just waking up for communication passes.
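The power arithmetic above, with round numbers (the awake load here is an illustrative guess, not a flight figure):

```python
# ~90 W from the RTG over a ~24.66-hour sol, about half lost to survival
# heating and background loads; being awake burns more than the RTG makes,
# hence the catnaps.
RTG_WATTS = 90.0
SOL_HOURS = 24.66
AWAKE_LOAD_WATTS = 250.0   # illustrative awake draw, not a flight number

daily_wh = RTG_WATTS * SOL_HOURS   # ~2200 Wh, "about 2.5 kWh" as quoted
available_wh = daily_wh / 2        # half goes to survival heating etc.
awake_hours = available_wh / AWAKE_LOAD_WATTS
print(f"{daily_wh:.0f} Wh/sol, ~{awake_hours:.1f} h of awake-time budget")
```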
Data volume, you know, we can always say, okay, this is just a pretty picture, it's not a very high priority, leave it in the flash memory and we'll get to it when we get to it, in terms of sending the data home. But most of the time, it's power or time that's the killer. We don't have enough power to stay up longer and do cool stuff, or we don't have time between starting the commands in the morning and that decisional communications pass in the afternoon to fit any more stuff in.
Honestly, it's one of the most rewarding challenges of operating a rover that's eight and a half years old: trying to get it to do more, trying to fit more stuff in. What are the tricks we can use to squeeze stuff in, parallelize stuff as best we can, to make our slightly arthritic old rover as productive as it possibly can be?
It's a really rewarding part of the project.
I always find it really fun to try to optimize things until you can get every little thing out of it.
Yeah, I've recently gone back and done a deep dive into exactly how long a bunch of our common recurring observations have taken, compared that to how long we give them when we're planning, and, where we can, tightened our belt, saying, you know what, we've always said this takes six minutes. It's actually more like five minutes and 40 seconds. Let's take 20 seconds off the planned time for that, so we can fit more in, or go back to sleep earlier and get more power, and stuff like that.
It's weird.
The past 12 months, a lot of my focus has been,
how can we just take some cool stuff
when the battery is fully charged,
but also how can we penny pinch down to five seconds here,
20 seconds there to also save power?
Because sometimes we're good on power,
sometimes we're not.
And I'm looking forward to when we're kind of really low on power
and we're really having to use all of the tricks with our, you know,
decrepit old RTG that's really, really not doing great.
That's actually going to be a huge amount of fun.
When does that happen?
Not this week, fortunately.
So we are 3,100 days into our mission.
I would expect us to be able to keep going at a fairly regular, good, scientifically productive rate of progress for another four years, maybe.
After that, the RTG is going to be getting a little tired.
This is the same power supply that operates the Voyager spacecraft that has been going since 1977, right?
But we don't have as much power as Voyager has ever had, actually.
We only have about 80 watts.
Between 80 and 100, when it's cold, we get a bit more.
When it's warmer, it's a bit less, stuff like that.
But about four or five years from now, we'll have to start really tightening our belt. And I could imagine us, 10 years from now, just reducing the cadence of how often we operate the vehicle, and saying, okay, we're just going to have three days of busy stuff per week; the rest of it's just going to be recharging the battery. And we could keep that going for years and years and years. There's loads of tricks we can pull.
We can say, hey, you know what?
Waking up to do a communications pass takes a whole bunch of power.
Let's do that less often.
Let's just stay asleep, more catnaps, less stuff.
We could keep it going for a long, long time. It's just a case of having the budget to keep operating it, and as long as the budget keeps flowing, we will wring every single thing we can out of this old rover.
At that point, it'll be, this is incredible, 20 years driving around.
Yeah, I mean, we're on sol 3,100. Opportunity, that was solar-powered, lest we forget, operated for 5,111 days before succumbing to a dust storm. We could easily beat that. I think that's a very, very achievable goal, to break that record.
You said 9 a.m. for a wake-up call for the rover, and afternoon passes. Do you live on sols, or do you live on Earth days?
So the rover does Mars time, and very early on in the mission, the engineers do Mars time. In fact, right now I have colleagues operating on Mars time that are on the Perseverance mission. But when you get into
two or three months of that, you start to kind of get a slow detachment from reality.
It's not easy. And for those who don't know, the challenge is that a Mars day is about 40 minutes
longer than an Earth day. So let's say your data
comes back from the rover at nine o'clock tomorrow morning. That's great. You can start planning at
nine o'clock in the morning. Well, tomorrow that's 9:40, then it's 10:20, then it's 11 o'clock, then it's 11:40. Two weeks from now, that's nine o'clock at night. Another two weeks from now, it's back to nine o'clock in the morning, and you've lost an entire day. And you can keep that up for a while, but people have things like families and obligations and their sanity to maintain, and so after three months, you really can't do it.
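The Mars-time treadmill in numbers, using the roughly 40-minute figure from the conversation:

```python
# A sol is about 39.6 minutes longer than an Earth day, so a Mars-time
# shift start slides later every single day until it laps the clock.
SOL_MINUTES = 24 * 60 + 39.6   # one Martian sol, in Earth minutes
DAY_MINUTES = 24 * 60

drift = SOL_MINUTES - DAY_MINUTES              # ~39.6 minutes per day
print(f"{drift:.1f} min/day of slip")
print(f"lap a full Earth day in {DAY_MINUTES / drift:.0f} days")  # ~36 days
```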
So what we do is, with Curiosity, we operate the rover typically three days a week. We plan on Mondays, Wednesdays, and Fridays. But, bearing in mind the rovers are scattered around the planet, so you're now talking about multiple Martian time zones, if the Martian night shift for Curiosity lines up with Pacific office hours, then we'll also sneak in an extra plan on a Tuesday. So we'll do Monday, Tuesday, Wednesday, Friday. On Fridays, we always end up giving the rover three days of stuff to do, so we can go home and have a weekend. And so about half the weeks it's Monday, Tuesday, Wednesday, Friday, and the rest it's just Monday, Wednesday, Friday.
And so that means you can only drive three times in a given week, when you're doing that restricted planning. But kind of managing those schedules, of when is the data going to come home, when do we need to be ready to send commands, is a whole task in and of itself. And it's probably one of the biggest challenges early on in a mission, when you're trying to manage the humans as well. It can be really, really tricky to get that Mars time and Earth time to play nice with each other. And once they do, they just go out of sync again.
Yeah. The way to think of it is you're traveling about one time zone west every single day.
It's a constant state of moderate jet lag.
We've joked that what we need is a very, very fast cruise ship that can circle the planet once every cycle, so that we're always operating during daylight hours. And so we'd have the Mars Ops cruise ship constantly circling the planet. The science team could join us, and we could, you know, live in reasonable time zones but operate the rover every single day. Then we discovered, actually, there isn't a cruise ship in the world fast enough to get it done.
So we gave up.
It was a nice idea.
I have a couple of listener questions I want to get to, although I think we've gotten most of them.
Kevin asks, how do you validate the hardware for Mars while on Earth? How high-fidelity are the tests?
So there are kind of three kinds of testing, actually, excuse me, four kinds of testing. One is the shake, rattle, and roll of surviving launch and landing. And for that, we literally bolt it to a shake table that can shake the heck out of it and sweep through a whole bunch of amplitudes and frequencies that replicate what it's going to go through during launch.
You do that at the vehicle level.
Then we have something called ThermalVac.
We have a 25-foot-wide space simulator
up at the back of JPL.
And it's a big vacuum chamber. And so we can suck all the
air out and then we can flow liquid nitrogen up and down the walls to make the walls feel cold
like space. But then we have a huge bank of arc lamps that reflect through a mirror at the top of
the space simulator to behave like the sun. And we can turn that up or down to make a spacecraft
think it's orbiting Venus or has landed on Mars or is
orbiting Jupiter, anything in between kind of thing. And so we suck all the air out, put a
little bit of carbon dioxide back in again, and then turn the sun down to Martian levels, and we
can put it through day-night cycles in our space simulator. We also have electromagnetic
compatibility testing. So, in an RF chamber, we will literally turn all the various bits of the spacecraft on and measure whether they're impacting other bits of the spacecraft.
And then there are mechanical things we can do, like drop tests,
things like testing the landing system, testing the wheels,
driving over rocks and things like that.
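An aside on the ThermalVac step: dialing the arc lamps for a given planet is just the inverse-square law at work. A quick check with textbook values (our numbers, not JPL's actual chamber settings):

```python
# "Turn the sun down to Martian levels" means sunlight falls off with the
# square of the distance from the Sun.
SOLAR_CONSTANT = 1361.0  # W/m^2 at Earth's distance (1 au)

for name, au in [("Mars", 1.52), ("Venus", 0.72)]:
    flux = SOLAR_CONSTANT / au**2
    print(f"{name}: {flux:.0f} W/m^2 ({flux / SOLAR_CONSTANT:.0%} of Earth)")
# Mars: ~589 W/m^2 (43% of Earth); Venus: ~2626 W/m^2 (193%).
```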
So you never get to test the whole thing in an environment that's
exactly like Mars, but you can test bits of it in ways that are as close as you can. And then you
just have to tie all those tests together into kind of one amorphous set of V and V tests that
say, you know what, we think this will work when it gets to Mars. That's a little scary. I really
want it to all work.
I mean, if you think about our crazy landing system, it's nuts. It would basically be impossible to fully run that on Earth. Earth has too much air, too much gravity.
The first time the whole sky crane process
was done end-to-end
was August 2012,
when it successfully landed Curiosity on Mars.
Did you really think that was going to work?
Jeez.
So I spent nine-ish months as the technical director
for the landing animation of Curiosity,
long before I got into mission operations.
And I was looking at some of the engineering simulations going,
yeah, you've done a real good job to convince people this is going to work.
But you're like, no, no way.
But then you sit down with these people and you hear them walk through why it makes sense
and the measures they've gone to.
And you're like, you know what, you crazy fools, you might just have got the right idea here.
And let's be honest, it worked twice now, right?
It worked with Curiosity and they even upgraded it.
And it worked for Perseverance as well.
Yeah, The Right Kind of Crazy is a book written by Adam Steltzner, who was the landing team lead for Curiosity.
And it's the perfect name for a book, The Right Kind of Crazy.
Let's see, from Jakey Poo.
Do you have to worry about water ingress at all for Martian electronics?
Jakey Poo, good news.
Mars is incredibly dry. If you took all of the water that's in the Martian atmosphere and kind of froze it out, you'd get a layer that's just a couple of microns thick. Mars is incredibly cold, incredibly dry, and so things like water ingress, rust, stuff like that, are not really something we have to worry about.
Dust ingress definitely is.
But fortunately, water isn't.
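For scale, here's a rough version of that freeze-it-all-out calculation with textbook values and an assumed water fraction; the real number swings seasonally from a few microns of ice up to a few tens, so treat this as order-of-magnitude only:

```python
P_SURFACE = 610.0      # mean Martian surface pressure, Pa
G_MARS = 3.71          # Martian surface gravity, m/s^2
WATER_FRACTION = 6e-5  # assumed mass fraction of water vapor (illustrative)
RHO_ICE = 920.0        # density of water ice, kg/m^3

column_kg = P_SURFACE / G_MARS  # ~164 kg of atmosphere above each square meter
ice_layer_m = column_kg * WATER_FRACTION / RHO_ICE
print(f"{ice_layer_m * 1e6:.0f} microns of ice")  # -> on the order of 10 microns
```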
I wanted to ask... I'm thinking about all the times that I've gotten devices back that were supposed to be hardened for outdoor use and had them be half full of water.
Or spiders.
Or spiders, yes, that happens.
Um, we don't have the spiders.
Well, you don't know.
I wanted to ask him, have you ever gotten one back to make sure the dust ingress protection works? And, like, my brain went, you can't ask that. No, he hasn't ever seen that.
This is the thing. I mean, I got to see Curiosity in the clean room before it launched, and I got to see Perseverance as well.
I operated the cameras
on the Opportunity rover. I have never seen Opportunity
with my own eyes. Never seen it.
And that was part of the motivation when, back on Sol 5000, which was 2018 or so, 2017 even, with Opportunity, we decided for her 5,000th sol anniversary to take a selfie.
Now, Curiosity has taken amazing color, high-definition selfies. Opportunity has a one-megapixel, black-and-white microscope that cannot focus, right? It's designed to literally be positioned six centimeters or so above a rock, and that's where it focuses. If you want to focus it, you move it. That's how it works. But we thought, you know what, let's give it a try.
And we shrunk the images because we knew they'd be blurred out of focus,
but we did get to see our rover for the first time in, at that point, 13 years or so.
And, well, it was a little dusty in the room when we got to see those pictures for the first time, that's for sure.
But one thing that you have to think about with any spacecraft: you've got a camera with a lens, you've got electronics boxes. The Curiosity rover is the size of a Mini Cooper; it's a beast of a thing. But think about that sat on the launch pad. A couple of minutes from now, it's going to be in a vacuum that is stronger and harder than any vacuum chamber on Earth, because that's what space is like. So every single molecule of air that is inside any part of
that spacecraft has about three minutes to get out. It's got to get out from the rover, into the
nose cone, from the nose cone out into the atmosphere as we head up into the vacuum of space.
And so you have to have means for the air to get out everywhere. If you've got a way
for the air to get out, you've got a way for the dust to get back in again. And so what you end up
doing is making these quite convoluted little channels for the air to get out and hope that
dust doesn't find its way back again. Dust did eventually find its way back inside the dust cover
for the Opportunity Microscopic Imager, but not an awful lot. But it's a weird thing to think about: these spacecraft start full of air, and by the time they're in space, it's all got out somewhere. So you have to design a way for all that air to get out in about three minutes.
Never would have thought of that.
And that's why your rover exploded on launch.
Not on launch, once it was in space.
It arrived in space five times its natural size. And one more question from Andre from the Great White North.
Is dust very abrasive, the dust on Mars?
So Mars has had nothing to do for several billion years apart from turn rocks into ever so
slightly smaller rocks, and it's really good at it. So with Mars dirt, it's easy to think of it as abrasive sand, and there are sandy places on Mars, but the dust that's in the atmosphere and blowing around all over the place is way, way finer. It's more like the consistency of corn flour or talc or something like that. It's incredibly fine, even down to cigarette-smoke levels of fine.
It's incredibly fine dust.
And so it doesn't tend to be abrasive.
It can get stuck in places.
It can coat things.
But it doesn't tend to really erode things.
The moon, on the other hand...
Oh, right, right, okay.
Its sand grains, its dust grains are very abrasive
because they've been made through impacts and cratering.
They have been made through rocks getting smashed into ever smaller rocks,
not wind eroding them into ever smaller rocks and rolling them around. And so if you look at grains of Mars dirt, they're round, maybe a little jagged here and there, but generally speaking, it's like a very fine, powdery dust. Moon dirt is like someone set out to design the most horrific abrasive you can possibly imagine. And so in the space of, like, three days of walking around, one of the astronauts wore the rubber grip off a geology hammer just through dust.
Because that moon dust is horrific.
It wants to rip everything to shreds.
Mars is a holiday compared to dealing with moon dirt.
If you were stuck on Mars and had some potatoes and water, could you in fact grow things, or is it too sandy?
So, I mean, if you had, like, a little tent thing, you know, went full Watney and had a little tent, you could hypothetically do something hydroponically, for sure. But a lot of the dirt has dissolved compounds in it that are not great, and we've seen perchlorates around a lot that are kind of like a bit of a bleach. They're a chlorine compound that's not particularly nice for life. So if you just, you know, erected a tent over a patch of Mars and stuck some potatoes in the ground, they're not going to have a nice day.
But if you were to take some of the sand, maybe wash it out a bit, right, with some fresh, free-range, organic, melted Martian ice from the polar regions,
and then put some nutrients back in that you brought with you,
or that you had made, Mark Watney style,
you could conceivably kind of do the Watney thing.
But you would need to essentially decontaminate that soil
of the really nasty stuff before you got started.
All right.
Well, I have so many more questions, but we are about out of time.
Doug, do you have any thoughts you'd like to leave us with?
The one thing I think is worth saying is that we're incredibly lucky to be involved in an
amazing mission like Curiosity.
I know my friends on Perseverance feel exactly the same.
It is a privilege to do this on behalf of the country and indeed the planet.
You know, we are at the forefront of the human experience, trundling around on Mars,
learning about it so that one day we can send humans to go and do science there as well.
But everyone can come along on that journey with us. Every single picture we ever take goes online
the second it reaches the ground. And so you can be there right alongside us every shift,
every day. There are times when just because of how Mars time and Earth time have lined up,
someone following along from Brazil or Australia or the UK could be seeing the pictures before we
do, just because that's the way the time zones have fallen. This adventure is for everyone to
be a part of.
And we see people doing that.
We see people taking the pictures and making mosaics and movies and maps and animations.
And it is wonderful to be able to plan stuff with the rover,
knowing the public are going to be along with you for the ride,
and they're going to be enjoying this stuff as well.
This adventure is not just for us.
It's for absolutely everyone to come along and enjoy.
Well, thank you.
Thank you for the work you do
and thank you for being with us.
It's an absolute pleasure.
It's always fun to talk about
the fun stuff we get up to
with Curiosity.
Our guest has been Doug Ellison,
Mars Science Laboratory
Engineering Camera Team Lead
at NASA's Jet Propulsion Laboratory.
Thanks, Doug.
A pleasure.
Thank you to Christopher for producing and co-hosting.
Thank you to Twinkle Twinkie for pointing me in the direction of Doug and to our Patreon
listener Slack group for many questions.
And of course, thank you for listening.
You can always contact us at show at embedded.fm or hit the contact link on embedded.fm. And now a quote to leave you with
from a very, very good set of books. Mary Robinette Kowal wrote The Lady Astronaut of Mars.
It's a hard thing to look at something you want and to know the right choice is to turn it down.