Frame & Reference Podcast - 201: ARRI Product Manager Chase Hagen
Episode Date: July 31, 2025
In part 1 of 2 "ARRI" episodes, we have Product Manager Chase Hagen on to talk about the camera side of things at ARRI (and a whole lot more)! Enjoy!
Produced by Kenny McMillan
Transcript
Discussion (0)
Hello, and welcome to this episode 201 of Frame and Reference.
You're about to drop into a conversation between me, Kenny McMillan, and my guest, Chase Hagen, the ARRI product manager of camera systems.
Enjoy.
Yeah, I've done the same setup for five years.
This is literally the first time I've ever not been recording in my office.
Nice.
Yeah, we should.
I don't want to touch a week.
I will say this guy, this is what I learned on.
Fantastic, yeah.
Yeah.
The 16 there.
And this one is flawless.
It looks really pristine.
It's so clean.
It was a very popular camera.
Yeah, we have one of them in our office.
And obviously they have a little museum when you walk into Munich as well.
It's got a nice little museum in the headquarters, as we call it, the Arial.
Yeah.
It's fun to see the whole evolution from the turret system to a modern camera, you know.
And the turrets do have their advantages before zoom lenses.
You don't have to worry about changing lenses back when lenses were that small.
Tiny, easy to run around with
a bunch of primes on the front of your camera,
kind of thing. Well, and
so the reason that I'm
here at the ASC
technically, like I said, in the ARRI Education Center
at the ASC. Right. Because
I was filming a
thing for Adam Savage's
channel, Tested.
Okay. I think I know it.
So my friend Joey
was hosting this
episode, I guess. And
Steve Gainer, the
curator of the ASC museum, took us around all of the cameras, from like literally the very
first Lumière all the way up to the more, you know... everything from ARRI, Panavision,
Bell & Howell, all that, you know, all of them. So, yeah, in a weird way I was
accidentally preparing for this interview. In a way, it was like a free history tour,
basically. Yeah, exactly. It is totally prep. Yeah, yeah, yeah.
I think that one's an ST, technically the one behind you.
I forget if it's an S or an ST, but there's a slight difference between the two, but it's in the family of the S.
Yeah.
So when did you join the company?
So I joined ARRI on August 1st, 2016, so it's been a little over.
It's basically coming up on nine years this year on August 1st, so a little while.
I did do work with them as a freelance consultant.
So I shot some of the early ARRI PCA videos,
which were like the matte box, the tilting filter, all that kind of stuff,
the eyepiece leveling back.
They're still on the ARRI channel all these years later,
but that was kind of one of my gigs when I got out of film school.
Gotcha.
And so you studied cinematography then?
Yeah, yeah.
I studied cinematography.
I went to Art Center College of Design in Pasadena.
So that was a fun experience.
I did four years of that for my BFA,
and we had tracks in the school that were for directing, editing,
and cinematography for the undergrads.
at least we had those specific tracks.
So I did the cinematography track pretty much right from the start,
because usually by the end of the first year you had to declare your track
and then you would do the next three years with more specific classes.
You could take other classes.
I took a lot of classes with advertising kids,
which was super fun back then.
But yeah, cinematography was my focus.
I thought at first I could do some directing.
But, you know, talking with actors is really tricky when your direction is stand here.
You're right here in the light.
Don't move.
Don't do... you know, you end up just basically telling them
what looks great, not how they should be feeling and acting.
So that's not very fair sometimes.
Well, and I feel like every film student doesn't know any of the jobs that there are on a film set.
So they just show up and they're like, I guess I want to direct.
I want to make the movie.
Yeah.
Yeah, I want to be in charge.
Exactly.
Surely I want to be in charge of everything.
And then it's like, there's a lot to be in charge of, right?
Actors appreciate some time.
You need to give your time to makeup.
Dividing your time and attention across all the departments was a lot.
For sure.
Yeah.
How do you feel about being held accountable for everything?
Oh, I hate it.
Well.
Yeah, exactly.
Sorry.
Exactly.
You're spending their money.
So, yeah, then you have the boss or the overlords, if you will, the producers and everybody that say, hey, you're spending my money.
I would like a little say in that.
So, yeah, it's interesting.
I'll stick to the photographic side of the business per se, lights and camera for sure.
Yeah.
So you said you went straight from college to working at ARRI?
Almost. I didn't even get to two years. I keep saying a few years, but I think I technically only made it about 18 months as a freelance cinematographer in LA after film school. I graduated in April of 2015 and I started at ARRI in August 2016, so not even quite 18 months. So you could say I was, yeah, basically, I mean, I didn't do the old-style ARRI way, which was, you know, you were an apprentice back in the 80s and 90s. You would sit there and, you know, if you wanted to work at ARRI, you'd start in the machine shop, right? And you'd learn how to make your perfect
cubes and your perfect spheres and your perfect cones and bits and pieces with the milling
machine, and then one day you work your way up to building more parts of the camera. But in my case,
a little different these days, but very shortly after school. Yeah, because, so you're saying that
when they would bring new people on, they were like, hey, there's a good chance you're
going to be manufacturing? Yeah, so we were apprenticed. It was very common at the time, back in the day,
to have apprenticeships, right? So kind of like metalworking, you know.
because obviously back in the day, I mean, I'm talking pre-digital era, working at ARRI there in Munich.
I mean, obviously we're a much smaller company.
They usually started out in, you know, metal and machining for the most part because that was one of the primary, you know, jobs in manufacturing, building cameras.
I mean, don't forget back then, we didn't build the image capture.
You know, we built the mechanical, you know, pieces of the camera, of course.
So Kodak and Fuji did all the imaging and, you know, we did the precision machining.
And so that was very common for a lot of people at ARRI to start on the manufacturing side
and heavy engineering side.
I mean, you know, back then there wasn't a lot of marketing jobs or things like that.
It was kind of, you know, right in it and, you know, learn how to build, you know,
these intricate machines.
Well, and that was something that was so interesting about going around the ASC here
and looking at all the motion on all these cameras.
Like it's interesting how little has changed since like 19.
But also one thing that struck me was, like, these were not CNC made. These were people
handmaking all the motion on all these cameras. And it's shocking how precise they all were.
Yep. Exactly. Even, you know, just using calipers and I don't know. Yeah, simple tools. Yeah, exactly.
Literally manual measurement tools. You know, and I'm sure, like, when we think about it, like at the end of an eight-hour day, you're tired, right? So, like, how do you guarantee this consistent
quality from when you've got your coffee in the morning, you're starting at 9 a.m.?
And then, oh, you have lunch.
And then back then they used to serve, I mean, way until recently, they used to be able
to have beer in, you know, the cafeteria for lunch.
This was a common thing.
And obviously, I wouldn't, I wouldn't have too many beers and then start making the camera.
But, you know, once you've had the caffeine, you need something to roll it off.
Yeah, exactly.
Exactly. Roll it off, you know.
So, yeah, it's really interesting all basically, like you said, essentially bespoke or handmade
for a good period of time.
And then obviously some automation comes in at a later point.
But yeah, for the majority of it, it's it's fully manual, manual tools, manual, everything.
Right.
So for, I don't know if you've heard this podcast, but there's probably going to be a lot of jumping around.
That's fine.
Very interesting.
I might make a stuff.
But the first thing I wanted to start with, very inconsequential, was obviously I've been saying Ari,
and you've been saying Airy. Long time, long time,
people argue about this.
Oh, yes.
Totally.
Totally.
It's a...
As you work...
Yes.
And so my...
Right there.
Right there.
That's Arnold and Richter.
Right there.
Yep.
Okay.
Arnold and Richter spells ARRI.
Yes.
That's my argument.
That's your argument.
It's very fair.
And it's exactly...
I'm not going to put on a German accent.
That's basically it.
That's I was just going to say.
They're both...
technically right, you know. It was just that at the time, you know, I guess an accent thing, for
lack of a better term. "Airy," you know, for Arnold and Richter; you know, "Airy" kind of sounds right
when you're saying Arnold and Richter. But when you're German, it's "Arnold und Richter," you know,
excuse my poor but somewhat okay German accent at this point, given the time I spend here:
"Arnold und Richter," "Ari," you know, that's the way to say it, you know, with the German accent. So
both are absolutely correct. They're just, you know, two different ways of saying the same thing. I've had
camera assistants call. Like, my first month at ARRI, when I was in the office, I had a call, a call from set.
It's a camera assistant. Okay, sure, yeah, let's see what we can do to fix some of their issues and
all that. And it was that exact question. They were on set, must have been a slow day, or they were
waiting for the lights to move, and they were like, is it Ari or Airy? You know, kind of thing. So I said, okay,
all right, let me settle this. And I, you know, repeated this exact story. And yes, they were...
or, you know, I don't know who won the bet at the end, but they were satisfied.
But I've even gotten some calls, so yeah, it's not an uncommon question.
Well, the thing that I kind of wanted to touch on, though, mostly was, you know, like I said, I've
been going through looking at the history of cinema cameras.
And I've said for the longest time that cinema is a relatively young art form.
Like whenever people talk about like how it should be done, I'm like, guys, like this isn't painting.
You know, like, I mean, if you look at painting, that's all, you know, trial and error.
And then eventually it becomes a period, right?
You know, an entire, like, you know, era.
But we haven't had enough time, I feel like, in film for there to be eras.
But the one major delineating thing, besides, he was showing us the one where you had to look in the tube and then move the entire camera to one side so you could actually go, you know.
Yeah, switch over.
Yeah, obviously hand-cranked to motors, but the big one, obviously, the delineation being film to digital.
And that is so much more recent that I feel like I'm 35, I watched it happen.
And in a weird way, I feel as, you know, I write for Pro Video Coalition, this podcast is educational adjacent.
Like, I do feel kind of a steward of that history to remind people who are always like, oh, I need to buy the newest thing.
I'm like, guys, it's like anything after 2012 almost year.
Great. But talk to me about the start of the ARRI digital platform, because was it the D-21?
Was it the D-20 before it? Yes, the D-21 was the upgraded version. And a lot of it comes from, you know, this understanding that the reason that cinematographers chose film wasn't, you know, because it was necessarily easy, especially at that time. Obviously, there were many video cameras. You know, the F900 had been out and shot Star Wars. So there was this whole
push to move to digital. It's so much easier. It's a lot less trouble. It's, you know, in some
ways cheaper, faster, more efficient. But one thing that was really talked about during that
transition was the quality, right? Did it deliver the same quality as the film stocks at the time,
which obviously film had an evolution to if you look at film stocks from the 80s to the 90s to
the 2000s, there's an evolution in low light performance. There's an evolution in color
rendition, you know, and that's why certain movies of certain areas look a certain way. So, you know,
one thing, when we talk: we're a cinematographer's best friend, in a way; that's how we run our
company. We don't make video, you know; at the time especially, we didn't make video cameras or video
products in that same way. So our first, you know, instinct was to talk to our customers, which are
cinematographers. And we said, well, you know, what do you think about this, this digital revolution,
in this video revolution.
And, you know, there was obviously some that were more willing to adopt those changes
and live with some of the compromise.
And many said, look, the quality is not there.
I don't feel like it's the full paintbrush, right?
I can't paint with it like I can with a film.
There's, you know, advantages and disadvantages to any format, but I don't feel like
it, you know, fully fulfills my needs as an image maker.
And so when we started, you know, with our first digital products, they were the
ARRILASER and the ARRISCAN.
So for those of you who don't know, the laser is essentially a laser system for archiving a digital image.
Let's say, you know, could be a digital camera image or just an edited final piece and archiving it via RGB lasers back to a film negative, you know, essentially to store it so that it could last the next hundred years.
Because film is designed to, you know, withstand with proper, you know, I guess you could say temperature conditions and humidity.
Exactly, exactly, right?
Yeah, right. The first ones, not so good. But yeah, for the last 50 years, they were pretty stable. Exactly. So that was the laser. Then the scanner: obviously, we have a scanner that basically takes a film negative, and then you're basically, you know, playing it back, or rolling it through two winders, and then you're capturing an image. And if you're wondering about the difference between what the ARRISCAN did and what a film scanner is versus a telecine machine: before that, they had line scanners, kind of like
your scanner at home where you scan a document and it goes line by line by line, very, very
slowly, not very fast, obviously. Image quality, not the best, but it was going to SD
video, right? It was going to analog SD video. So the quality of the film was always much higher
than what we were outputting. So that was acceptable. But obviously in the early 2000s,
we see HD coming around in terms of the output format. And so the ARRISCAN offered this very high
quality film scanning of 16 and 35 millimeter stocks, with a calibrated light source so that you
could have even illumination. The optics were very good. The colorimetry was good. And so that was the
first digital product, the ARRISCAN, obviously way before my time. And that's how we learned to make
digital images, right? Because we scan all these different kinds of film stocks and we started to learn,
oh, this is what a Fuji stock looks like. This is what an old Fuji stock looks like. This is what a new Kodak
stock looks like. This is what a black-and-white Agfa stock looks like. So you're starting to build
this kind of knowledge and repertoire and library of something that really was normally outside
our purview, right? You know, we were just building the camera and somebody else made the
machine and it scanned it line by line and it ended up wherever it was going to end up, you know.
And so that was a very interesting transition. And so that all happened. And then, you know,
of course, a couple of years of that. And then we realized, well, what we're doing in the scanner is
essentially taking a digital sensor. At the time it was a monochrome sensor, because the way that it was
illuminated with RGB light, that's what created the color. The actual sensor itself was just black and white.
Exactly, exactly. The sensor itself had no color filter array, or what's called a CFA, on the sensor.
It was a very good sensor, essentially. We said, wow, it's got a lot of dynamic range. It's capturing these images.
So let's, you know, evolve. So when it comes out of the scanner, that's when we start worrying about,
okay, we're going to put, you know, color on it, like a Bayer pattern sensor, CFA, color filter,
array. We'll start capturing images with it. And that's what became the D20. Then over time,
over four years, we kind of evolved that. But one thing I think that's really key to mention is
that the D20 and the D21 were never really sold as products, because we were not convinced
as ARRI that this was good enough to replace film, right? This was a step in the right direction.
It's a bit of a test bed, a trial run, where we had every rental, basically our rental division
of ARRI, have the D20 and the D21 for customers to try, to use, to gather feedback.
But we said, you know, we don't want you to pay actual, you know, money to buy it.
We want you to pay to rent it and try it.
But we wanted to use that as a test bed.
So I think that was something that we did very differently, right?
We didn't just slap a label on it and say, here it is.
It's done.
And then have everyone say, oh, this isn't very finished.
We wanted to try to, you know, figure out what the shortcomings were.
So D-21 came a few years later.
D-21 was 2008.
The D-20 was 2004.
So 2004 was the beginning of the Super 35 digital era
with other cameras from other brands coming around
where we were getting to finally a sensor size
that was acceptable to most cinematographers with the lenses that they were used to.
Right.
Because the F900 was two-thirds, one-half, something like that?
Two-thirds.
Two-thirds.
Two-thirds, two-thirds: three sensors with a prism block.
Right.
It was basically a broadcast, yeah, exactly, a broadcast-style prism or three-chip camera.
It's a very common term for that.
And I also heard, I think the DP mentioned this, or someone, that that first Star Wars,
maybe it was the second, I think it was the first one.
Either way, whichever one they used the F900 on first, which I think might have been two,
it was so loud.
They had it umbilicaled to like an entire server rack.
And the fans were so loud that the entire movie's ADR.
There you go.
Right.
I could totally believe it back then in terms of processing power, the amount of cooling.
And that's the thing.
It's like, you know, I always say like early digital was like two steps forward,
but then three steps backwards, right?
Because at this point, the digital camera is now louder than the film camera, which is ironic.
And it's got a cable, so you can't take it anywhere.
And yes, okay, you don't see the picture right away and you can't play it back.
But, you know, if you're confident in film, you know it's capturing
what you wanted to capture. But this whole mystique of not being able to see... the only person,
I mean, on a film camera, that could see the image was the operator.
All you saw with the video in Video Village was the little tiny monitor tap recording in SD
video that was really nothing representative of what the camera was capturing.
So we traded, in the early days, it was a trade of, you know, I guess you could say, I wouldn't
maybe go as far as convenience, but in a way it kind of was, versus, well, image quality,
right? We know that the film is going to be great, but we're going to have to wait two or three days
and process it, and then we're going to be able to project it, and we're going to look at it,
and we're going to say, oh, that's what we shot yesterday, kind of thing, you know, or two, three days later. Yeah.
Yeah. Well, and the other thing that I think, in that kind of Wild West era, you know, let's call
it 2000 to 2008... because when was the RED One? 2009? 2008? I think 2008 was the first year. They
announced it in 2007, but it only
shipped in 2008, and then they
quickly upgraded it in 2009
with the next-generation sensor.
I think it was that first year of the sensor,
and then they had a second sensor by 2009.
Right, and they've got one here.
I used it in college, but there's, like,
number two, I think, over here.
There you go. Yeah, yeah, London.
The camera's called London.
But I remember using that thing. We were
like, oh my god:
4K! Oh, Super 35! Oh,
it's small, you know. Now it's pretty big, but at the time it was small.
And then it's like you had to have a dedicated PA to go get a bag of frozen peas every five minutes to go lay on top of the damn thing.
But sure.
Even prior to that, I remember, when the Alexa came out in 2012... 2010?
2010.
I remember all these DPs switching over.
And one of the biggest things they had mentioned was that it reminded them of film, not visually, but just the flow: lens
on it, push, go.
Because all these other cameras were computers.
Yeah.
Which we're still dealing with. Like, a lot of different camera manufacturers have made
it almost too complicated, whereas, like, the Alexa is still, more or less,
lens, battery, record.
Yeah.
And that's something.
Yeah, exactly.
A lot of people appreciate it, even to this day.
Absolutely.
And they do copy.
You know, we see this, I, we call it our six, you know, zone home screen.
All of a sudden it ends up on every other camera.
And it's like, oh, I recognize the six buttons and a jog wheel.
They're quite, quite old.
But that also came from understanding the customer.
I think that was one thing that we were able to fall back on with our legacy is we knew our
customer very well.
And they were film camera customers.
And if you looked at a film camera, it's got very few buttons and no menus, none of this kind of
stuff.
And the idea was to try to keep it just as simple as possible so that those people felt comfortable
adopting a new tool, right?
You know, it's like, how do you convince an artist, whether it's music or painting to use a new tool that's outside their comfort zone?
You don't say, oh, well, you'll have to relearn everything and then it'll be better.
Like, no, I'm not really, I'm not in that mode most of the time or I really have to be, you know, there is, we all know, on set, there is no, you know, time to just kind of learn most of the time.
It's really, you've got to get, make your days.
You know, you've got big actors or actresses, especially who time is precious.
And so the idea was to really keep it simple.
I mean, the Alexa menu, the classic, didn't scroll past the first screen.
It was literally the main home screen and then you went in the menu.
There was no other options to, you could scroll down to the options, but there was no more
underneath that.
That's how simple the camera was.
And the idea was to keep it very, very simple.
It was easy-to-read text.
You know, my favorite thing actually on that topic was, you know, I'll do some of these
demos for, like, you know, film camps where
it's like fourth grade to eighth grade, and they'll go, oh, my God, an Alexa, this is so, you know, crazy and intimidating.
And I'll go, do you know how to use your iPhone?
Oh, yeah, yeah, yeah, yeah.
I said, this is going to be just as easy.
Once you understand the concept, this is going to be just as easy because what you see is what you get.
You know, that's kind of a term that you hear a lot in graphic design or GUI design.
What you see is what you get.
There's no buried submenus.
There's no, I got to push this to get to that.
There's no pictograms, right?
you know, my favorite thing on other cameras is trying to guess, okay, well, what does that
word mean or what does that pictogram mean? Is that the one I think it is? Is it under this
other setting? We're just using very simple wording, right? Like, recording is recording. Monitoring is
monitoring. There's no, like, fancy, you know, terminology that doesn't relate to what it actually
is. My favorite is Sony's, uh, yeah, what are we, broadcast? Like, come on. Yeah, yeah,
"Paint," exactly, exactly. You have to know your customer. And I guess to them,
They know their broadcast customer very well, so they're like, well, we must copy this, you know,
kind of thing, or keep it consistent. So I think it's really, we had the advantage of understanding
and not rushing into it. I mean, so D20, 2004, D21, 2008. D21 is getting better. Most of the
improvements in the D21 were minor, though. They were, you know, image science related. Like for the D20
and the D21, to see the menu, you had to hook up a second monitor. There was no display on the camera.
So you had to hook up a second monitor, but all it did was the menu; it wasn't actually like the monitor path or anything of the camera. That was a separate connector. So we learned very quickly, okay, this is not acceptable. We need to have control on the camera. It needs to be simple. What you see is what you get. Because on the film cameras, the later ones had little LCDs, and it just told you the basic data, told you at a glance your shutter, your frame rate, voltage, certain, you know, basic statistics. And so we kind of designed that home screen to be the glance of going,
okay, I got it. We're good. Let's keep going kind of thing without having to study it and go,
what does that symbol mean? Oh, my God. It's okay. What does it mean? You know, so I think
there really was an amazing team, you know,
an amazing team that put the Alexa together in terms of the way that they thought of the customer
and the way that they really understood what they could do to push the boundaries without
making it hard to use. You know, that's always a thing, right? Well, it's really technically the
best. But if it's so hard to use, you really start to, you know, get people in the weeds and you
just slow them down, right? It's another thing. Efficiency on set. We work in a time critical,
you know, environment where people are not really there to just, you know, okay, let's tinker
with it. You know, that's not so much our business for the most part. Yeah. Well, and, you know,
I would say, you know, like I said, these days any camera can get you a great image, especially with
how available coloring tools are, you know, Resolve these days.
And so if you've got a good, robust, quote unquote, negative,
you can make anything happen.
But there does seem to be a, and especially with the 35 coming out,
a tendency for people to want to shoot your guys' sensors.
So I was wondering if you could... A, well, we'll end at the end,
we'll end with highlight roll-off, because, what the fuck.
But to begin, like, what you
You mentioned the scanning, you know, getting data from scanning.
Like my friend Eigil, who shot The Holdovers.
Oh, okay, yeah.
His colorist is the guy who does all the, I shouldn't say all, but does restorations for Criterion.
So it's kind of the same thing, where they shot digital on The Holdovers because film these days is, like, too clean.
They would have had to muddy it up anyway.
So they're like, you know, just shoot Alexa and that's it.
Yeah.
And then the guy who does the Criterion restorations just undid whatever he normally does to get that look, you know, along with lighting and yada yada, but that was post-processed.
But so you did that with the, you know, the D20, D-21, Alexa sensors.
But kind of what sets you guys apart image-wise?
Because I feel like there's only a handful of companies in the world that actually make sensors.
I'm sure you won't tell me who makes yours unless you guys do.
No, we have a partner.
Ours is made by our partner, onsemi.
It's actually public.
Oh, okay.
You know, it wasn't public until recently, but onsemi makes the actual
sensor for us.
We designed it, but they are our manufacturer, correct?
Because we don't have a billion-dollar silicon wafer fab cranking out...
Yeah, exactly.
That's what I always tell people.
I'm like, you know, if you have a camera, Sony probably made this.
It's like, you guys, then Sony and Canon.
Canon, yeah, they do make their own sensors, yeah.
There used to be more.
Panasonic used to make sensors.
They sold their sensor business to a company called TowerJazz,
but they used to make their own sensors and they got out of it in 2020.
Because sensors, they require basically a chip facility like you would make, you know,
semiconductors for computers and processors.
And when you're making, you know, hundreds of thousands for iPhones, you pay off the fab, no problem.
When you're only making image sensors out of that fab, and you don't sell enough cameras
that use the image sensor, then everyone starts asking:
we have this billion-dollar building that's not making enough to pay for the electrical costs
and all that stuff, you know?
Well, that's why I was telling everyone who'd listen, like, hey, Nikon buying RED is the best
thing for both companies right now because it's going to position Nikon in a better position,
but also they can put those chips on the same order that they're putting all their regular
chips on and actually scale out, you know, because, you know, maybe that is, well, first,
Let's start with the sensor technology.
So you guys were looking at film.
Did you model the sensor?
Because I guess the ALEV 3 hasn't changed.
So have you... and then you get the 4.
But did you model it after like a specific film stock or like characteristics of one or the other?
Like how did that final color look come about in like what was the ingredients there?
So there's like two steps to the process, essentially.
The first step was designing a sensor that could capture a dynamic range similar to film.
So sensors, generally digital sensors, especially back in the day,
were good at low light, you know, or better than film, I should say, in low light.
They had more latitude and the ability to push into the low light,
but they all had very poor highlight rendition.
And so when we talk to any cinematographer, they said,
until you can get those highlights right or as close to right as possible,
I will only shoot film because it just, you have more dynamic range in the highlights.
And even when you do clip it, you have a much more graceful, you know, kind of roll-off.
But again, it was, you know, we're talking, you know, seven-plus stops of highlight latitude, more actually, in film.
So we were looking at trying to hit, okay, how can we get to at least seven, you know, stops of highlight latitude?
And there's no sensor, really, that doesn't use the architecture that we do that can get more than a real six stops of overexposure.
For real.
I mean, go test it and see if you can hit more than six stops before you have full clipping.
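As a rough worked example of what "stops of overexposure" means here (the numbers are illustrative assumptions, not any real camera's spec): each stop is a doubling of light, so highlight latitude is how many doublings fit between middle gray and the sensor's clip point.

```python
import math

FULL_SCALE = 1.0                    # normalized linear sensor signal at clipping
MIDDLE_GRAY = FULL_SCALE / 2 ** 6   # assume middle gray sits 6 stops below clip

def stops_over_middle_gray(level):
    """How many stops above middle gray a given linear signal level is."""
    return math.log2(level / MIDDLE_GRAY)

# With middle gray placed 6 stops down, the signal clips at exactly
# 6 stops of overexposure:
print(stops_over_middle_gray(FULL_SCALE))  # -> 6.0
```

Doubling the light one more time (a seventh stop) would land above full scale, which is the hard clipping being described.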
And the reason why we can go over six stops highlight, and obviously also deep into the shadows, is we do something called a dual-gain readout, or we call it dual-gain architecture.
So what we designed is that every photosite on the sensor, that's basically what the pixel is when it's on the sensor,
it's called a photosite,
every photosite on the sensor has a simultaneous, same-time, readout in high gain and in low gain.
So it's like two pipes coming out of the photosite.
And the high gain is for low, you know, low-light performance and shadows, because it's a higher gain.
And then the low gain is to hold all that information up into the highlights.
And so by reading out every photosite at the same time at high and low, you're able to widen that dynamic range by
quite a lot. Not quite double, but it's a lot. And then you get to pick where you want to kind of mold
those two gains together. That's the hard part, right? When you're building a dual-gain sensor,
you have to understand where the middle is going to fall and how you're going to merge them. So essentially
what we're doing on our sensor is we're reading every photo site at the same time because we often
get that question. This one always comes up. They go, well, why can't ARRI make a dual native ISO sensor?
We hear this a lot. And we go, we've been dual gain since 2010. But we're not using it for two
different values one at a time. We're actually taking it, and we have the horsepower on the
back end, to read them both out at the same time and then merge them into one. Our competitors are
saying, here's one for low light, here's one for highlights, and you need to use these two,
otherwise anything in between is a little bit muddy. Whereas what we're doing essentially is we're saying,
read out every photosite to get to that 14 and a half stops. And that was our goal. When
the ALEV 3 sensor first came out in the Alexa Classic, it had that 14 and a half
stops of dynamic range. We showed the content, right? We showed the pictures. It wasn't the
specs. It wasn't the Ks. It wasn't that. We shot this beautiful piece of a classical violinist,
backlit, beautiful lighting, you know, kind of thing. High contrast. And every cinematographer
in the theater at that moment said, wow, that's it. That looks like film. And that's when we
knew we got it right. Now, that's just, of course, the dynamic range. Now comes the color. Okay, so how did we get the color?
I want to pause you just for a moment because there's something
that I think
I remember vividly
before cinema cameras
were made more affordable
that was the thing
that everyone was tied to
was dynamic range
I, you know, we had the DVX100,
the XL2, whatever.
I got an AF100, at 1.8,
which was the GH2.
Yeah, yeah, yeah.
24p was a big first one, right?
Got to get the frame rate right.
No one had 24p,
so that was a big deal.
And then, you know, recording to a normal media like SD, that was the next one.
And then everyone just immediately turned to, well, can I see anything now?
Like, why do I have to light everything so aggressively to put it into three stops?
Yep, exactly.
So that makes sense that you would tackle that first.
So then the color, you're getting all this information from scanners and stuff.
Go ahead.
Yeah, exactly.
So how do we do the color?
Because dynamic range is obviously more about contrast, right, than color.
And so the next step was... So what's interesting is we have Dr. Harald Brendel,
who's our head of image science.
He's been around for a long time,
even before the Alexa, of course,
and he helped build the original ARRI color science.
It was his project at ARRI.
And he actually studied to be a biologist, and human vision related to that.
So he's, as you can tell, very good at math and biology and image parameters.
And so to answer your question,
we didn't specifically model it after a Kodak or a Fuji stock.
What Harold did is he looked at what we could do with,
the camera hardware at the time because there's certain limitations with that, but also what
the displays were looking at at the time, what kind of display. So digital cinema was relatively new.
We didn't have any OLEDs back then. We didn't have HDR displays back then, right? So we needed to
make sure that it looked good even in the rec 709 standard dynamic range display. And so he built
essentially the image science of what we now call LogC3 and ARRI Wide Gamut 3, or the original
LogC and the original ARRI Wide Gamut, now that we have the fours for both of those.
And he built it from the ground up with his team, you know, based on the knowledge and the images
from the scanner, right? So, you know, I'm sure he looked at aspects of, you know, oh, I like a little
bit of this from the Fuji stock. I like a little bit of this from the Kodak stock or I think this
one has the best skin tones. Let's go for the skin tones here. And so skin tones are obviously
the reference point, you know. And my favorite thing that
Harald did, which no one else does, is he always does stuff related to what he calls memory
colors. So like our competitors, they'll often take a technical chart, and they'll put the technical
chart up: oh, look, it hits all the points on the vectorscope. Do you think any regular person
knows what the colors are supposed to be on a technical chart? No. But think about a red Coke can.
Everyone in the world knows what a red Coke can should look like. So if you can get the memory
colors right, like your blue Clorox in the United States, or your orange Fanta cans that
are popular in Europe, and your yellow lemon drops, and, my favorite one, the Sun-Maid
raisins, the red raisin box, you know. Yeah, like those are all memory colors. So
you want to be accurate with your color, but really, accuracy only matters with memory colors. You don't
need to be perfectly accurate with everything, because film isn't perfectly accurate
with everything. But skin tones and memory colors, those are where it's really at, right? You know,
those are things where you can immediately say: that's not the right color. That's not the right color.
So you'd be a good person to ask this to, uh, semi-facetiously, because I know the answer. But, you know,
a lot of times people will say, oh, for instance, I think Panasonic now has
LogC3, right, with your cameras? Yeah, yeah. Maybe, maybe that's a
different example, but a lot of people will maybe go, oh, I'm starting out as a DP. I like
coloring. I use a color space transform, I put my footage into LogC3, and I put the K1S1
LUT on it. Now my shit looks like ARRI. Yeah. Why is that not the case?
It's a lot more complicated than that. So obviously, putting it in the same space
means that it's going to render similarly, and it'll allow you to use the same LUTs in terms
of the tonal curve. But the encoding of the camera, the way that the camera captures the
color gamut, is defined before the recording, essentially, right? So in many cases, when you're
recording H.264, H.265, or even ProRes, you're encoding the actual gamut that you want the sensor
to see at that time, you know? And because that gamut
is not the same as, let's say, ARRI Wide Gamut 3
or ARRI Wide Gamut 4, all you're doing is...
sometimes you can put it in the same envelope.
And obviously, if the envelope's bigger, it's bigger,
but the camera only captured this much color.
You can only put something smaller in a bigger envelope,
not the other way around.
But you're not going to get the same color rendition.
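A loose way to see the envelope point he's making, with an arbitrary toy "gamut boundary" of 0.8 standing in for the recording gamut (this is an illustration, not real color math):

```python
# Toy illustration (not real color science): once the camera encodes
# into a smaller gamut, a color space transform into a wider gamut
# cannot restore chroma that was clipped at capture time.

def encode(chroma, gamut_max=0.8):
    """Camera writes the file: out-of-gamut chroma clips here."""
    return min(chroma, gamut_max)

def transform_to_wider_gamut(chroma):
    """A CST puts the value into a bigger envelope; the information
    it carries does not increase."""
    return chroma  # same value, bigger container

scene_chroma = 0.95                        # a saturated color in the scene
recorded = encode(scene_chroma)            # clipped to 0.8 before any CST
print(transform_to_wider_gamut(recorded))  # 0.8, not 0.95
```

The smaller envelope fits inside the bigger one, but the clipped 0.95 is gone for good, which is the "not the other way around" part.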
Another thing that's also specific to our cameras
is what's called the CFA system,
the color filter array.
The color filter array is the RGB coverings
of the photosites on the sensor.
And so you have your Bayer pattern.
You've got one line of green-blue, and then one line of green-red, and then green-blue again.
It's a mosaic Bayer pattern.
But when you're designing a sensor, you can also define the CFA.
And there's very few people who can define a CFA because there's only so much chemistry
of what will stick to the photosites and be a stable material to filter that light through
a color.
But we designed ours to have very specific crossover points for red, green,
and blue, right? So every manufacturer has what they think are the best, you know, in nanometers:
green spectrum, blue spectrum, reds. But obviously there's no hard cutoff; you roll the greens,
reds, and blues into each other. And you can do that in certain ways to give your sensor the
appearance of less noise at the cost of color accuracy, or you can, you know, go the other way
around. There's different formulations. There's not a hundred different formulations, but there's
enough of the different ones that you can kind of mix and match. And so for our sensor, we defined a
very specific formula for the CFA, the color filter, right, versus competitors that would maybe
even just use the same CFA on every single model of camera, because that's their, you know,
agreed upon chemistry or crossover points. So CFA has some influence in color. Obviously,
you can do processing on the back end to try to compensate: if you know that map on the front end,
you can use the processing power on the back end to alter some of that. But there's also, you know,
obviously, the fact that it's literally analog light coming through those color filters, hitting the photosites.
And so if you, you know, have a different chemistry or a different crossover point, that's going to affect your color.
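A toy sketch of the crossover trade-off he's describing. The bell-curve responses and the nanometer numbers are invented for illustration; they are nobody's actual filter chemistry:

```python
# Toy CFA response sketch: wider overlap between channels lets more
# light through each filter (less noise) but reduces color
# separation. All numbers are illustrative assumptions.
import math

def channel_response(wavelength_nm, center_nm, width_nm):
    # idealized bell-shaped filter transmission
    return math.exp(-((wavelength_nm - center_nm) / width_nm) ** 2)

def crossover(c1, w1, c2, w2):
    # response at the midpoint between two channel centers
    # (symmetric widths assumed, so one channel's value suffices)
    mid = (c1 + c2) / 2
    return channel_response(mid, c1, w1)

# narrow filters: good channel separation, less light per channel
print(round(crossover(530, 30, 600, 30), 3))
# wide filters: more light, but "green" and "red" blur together
print(round(crossover(530, 60, 600, 60), 3))
```

The higher the crossover value, the more the two channels see the same light: better apparent noise, mushier color separation, which is the trade-off he describes.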
But yeah, basically, what I say when people are doing that is: they want the same tonal curve.
You know, they're trying to map, let's say, an S-Log or a Canon Log to have the same way of distributing the dynamic range.
And that was another mathematical formula, right?
So like when we were defining the original ARRI log curve, we had to fit it within 10 bits, because we said, if it only works in 12 bits... back in 2010, 12 bits was, oh, 12 bits,
this new, very big deal. So we had to make sure that we could hold 14 and a half stops in 10
bits. And it works. Mathematically, if you do the math like we did, we fit a beautiful tonal
curve with no artifacts in the shadows, plenty of room for the highlights, in that 10-bit curve,
whereas many cameras at the time were really all 8-bit, very little 10-bit,
and then obviously, you know, 12-bit came later. But one of the frustrations with LogC4,
actually, in the Alexa 35, was: why couldn't you keep the same log curve? You know, why couldn't you keep the
LogC3 one? All my LUTs are built. And it's not like we just wanted to change it. You can't fit
17 stops in 10 bits with no artifacts. You need 12 bits of encoding to hold another two and a half
stops of information. So we had to go to a different log curve. And obviously that was a little
frustrating for some people. Now, not so much. But I mean, all things considered, the first new log curve in
12 years? Pretty good. Everyone else seems to change it every two or three years, you know.
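The bit-depth arithmetic behind this is easy to sketch. The "codes per stop" averages below are back-of-envelope numbers, assuming code values spread evenly across the captured range, not ARRI's published curve math:

```python
# Back-of-envelope sketch: average code values per stop if a log
# encoding spreads its codes evenly across the dynamic range.
# Real log curves weight the distribution, so treat these as rough
# illustrations only.

def codes_per_stop(bit_depth, stops):
    return (2 ** bit_depth) / stops

print(round(codes_per_stop(10, 14.5)))  # ~71: workable in 10 bits
print(round(codes_per_stop(10, 17)))    # ~60: artifacts start to creep in
print(round(codes_per_stop(12, 17)))    # ~241: plenty of room in 12 bits
```

Which is the shape of the argument: 17 stops squeezed into 10 bits leaves too few codes per stop, while 12 bits restores the headroom.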
I was going to say: you guys, obviously, you nailed it with the sensor. But I think something
that you guys do, and I guess Fujifilm kind of does to a degree, is they'll make a sensor,
and then the body type changes with the type of work you're doing. So in Fujifilm's
case, you know, they'll have a rangefinder, they'll have a pocket one, they'll have now the, you know...
But you guys went, like, all right, we got the Classic. Now we have
one for documentaries.
The Amira.
Then you got the mini because people wanted that.
And then the funny part, I know this: Canon came out with the C700 because
everyone wanted the Alexa, you know, and they're like, I guess they want a big camera.
Then you guys were hearing on the back end, no, everyone wants a small
camera.
Right.
Yeah, yeah.
It's like a game of telephone.
Our friends at Panasonic had the same problem with the VariCam, too, right?
You know, the VariCam 35 came out right after the Amira,
and it was even bigger and heavier than the Amira,
where people were telling us the Amira is the edge of acceptable, and now they had a bigger camera.
And then by the time we jumped to the Mini in 2015, the C700 was 2016, and it was even, you know, as big
and heavy as the Amira, and didn't do any kind of internal raw or 12-bit, so you needed this
recorder that you would add to the back. So it's always like, you have to follow, you know, what
people tell you, but also you always want to be ahead of the curve, right? You want to make sure that
you're getting that information before everybody else.
But, yeah, otherwise it's just a game. You can tell, in the camera world, I feel like there's a lot of that,
you know, chasing each other with cameras and form factors.
I mean, form factor becomes, at this point, very essential for cameras, right?
You know, sensors are good enough in some ways for certain people,
and the form factor is sometimes like, yeah, I would like Alexa image quality,
but I need a camera that's four inches by four inches by four inches.
Right.
You can't do that, you know.
Well, and that kind of brings up two questions I had before we get to the 35. Which is, one: why not make an Amira that is in a Mini form factor?
Because, obviously, documentary... I'm shooting a bunch of documentaries now.
Of course, I'd love to shoot one, but bringing an Amira on a plane... You know, something like a Mini I could theoretically fit in a backpack or whatever.
So why was that just not a priority?
Because your clients are primarily cinema and television? Or was there, like, a discussion internally
about that? It's a tricky one. I mean, if we go to the Mini
story, which was, you know, a great success for us, the Mini was really designed to be a drone and
gimbal camera. No one at ARRI... I mean, the sales numbers that we thought we would sell were in the
hundreds. We didn't think it would sell much, because it was a specialty: this is for a drone or gimbal.
It will never replace a big Alexa, or even an Amira at the time, for
this application. But it was a bit like, oh my God, I can get the ARRI image quality in five
pounds? Okay, great. I'm good with that. I don't mind. Okay, it only has one card slot.
It has, you know, no audio. It doesn't have this. But oh my gosh, look at how small and
tidy. So the form factor, and obviously the image quality, won out over all the extra
features and convenience and all of that. And the style of shooting very much changed, right? Because
you had a lot of gimbals at the time. But there's also limits to how, you know, small you can
make a camera. Like I get this question all the time.
Why can't you make a smaller camera?
One thing that we do that no one else does, to our knowledge,
is that we have a Peltier cooling system behind the sensor.
So when we were designing the Alexa at the time,
one thing that we found was that digital sensors
are actually not very temperature stable, not like film, right?
So we actually looked at it and we said,
this is interesting.
Film, you can shoot at 113 degrees and you can shoot in, you know, freezing temperatures,
and it pretty much looks the same,
as long as you take care of the film.
There's no image change based on temperature
at all. But on a digital sensor back in, again, 2009, 2010, all our competitors had
non-temperature-stabilized sensors. And so when we were designing our sensor, looking at
the ALEV 3 for the Alexa, we said, well, the image sensor has the best performance at this
temperature. Every sensor, by the way, is like a computer chip: it has its best performance at a certain
temperature. If it gets too hot, it doesn't work so well, and if it gets too cold, it doesn't work so
well. So what we decided to do is we built a Peltier cooling system, on the
backside of the sensor, that keeps the entire sensor, from edge to edge, the exact same
temperature. So there's no variation. And if it's really hot outside, it will cool the sensor down.
If it's really cold outside, you're shooting in the Arctic, it actually warms the sensor up and keeps
it at that temperature. And that's why you never have to black balance an ARRI camera. All our
competitors said, well, that's expensive, it makes the camera big and heavy and all that;
we're just going to put a piece of silicon there.
And if the temperature changes by five or 10 degrees,
they'll just wait to black balance the camera.
Again, our opinion was we went and talked to a bunch of cinematographers.
Would you stop and black balance the camera for 90 seconds on a job?
No, I'm not stopping for 90 seconds, two minutes, three minutes.
Every time the temperature changes by five or 10 degrees,
that's like going, again, that's like going backward.
Why would I not use film?
I don't have to do that with film.
So with the Peltier cooling, that defines the minimum size of the camera, right?
So, yeah, would it be nice to have IBIS on a camera?
Oh, yeah, for sure.
But moving a Peltier cooling system with IBIS technology doesn't exist today, to be able to do that in an efficient way.
So that makes our sensor block a little bigger, a little heavier, but it means you'll never have to black balance a camera from ARRI, you know, which is fantastic.
And we carried that through all the generations.
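The stabilization idea can be sketched as a toy control loop. The setpoint, gains, and time step below are invented illustration values, not ARRI's actual controller:

```python
# Toy thermostat sketch of temperature-stabilizing a sensor with a
# Peltier element: pump heat out when ambient is hot, pump it in
# when ambient is cold. All constants are made-up illustrations.

SETPOINT_C = 35.0  # assumed target sensor temperature, Celsius

def peltier_step(sensor_temp, ambient, dt=0.5):
    leak = 0.05 * (ambient - sensor_temp)    # heat leaking toward ambient
    pump = 2.0 * (SETPOINT_C - sensor_temp)  # Peltier counter-drive
    return sensor_temp + dt * (leak + pump)

for ambient in (45.0, -10.0):  # desert day vs Arctic cold
    temp = ambient             # sensor starts at ambient temperature
    for _ in range(200):
        temp = peltier_step(temp, ambient)
    print(round(temp, 1))      # settles near the setpoint either way
```

Either climate, the sensor ends up near the same operating temperature, which is why the image doesn't drift and no black balance is needed.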
Well, and I think that kind of half answers the second part of my question, which was going to be: why not build your version of a Komodo or
a C70 or something to that effect, that is more affordable for the average shooter?
My guess was always, like, basically what you said, which is sort of, not attention
to detail, but a certain standard that you don't want to come off of. But I'm sure there's been
more conversation than that.
Oh, absolutely.
It's always a question of, you know, what are you willing to get rid of?
Okay, so some people might say, the Peltier cooler,
I don't need that, you know. I can save some size and, obviously,
a considerable cost, and I'll just black balance the camera.
Well, it's not just the Peltier cooler.
Like, for example, I remember when we first talked with the R&D guys and they explained
how advanced and complex it is: we have our own chemistry for the optical low-pass filter
that sits in front of the sensor.
Another thing: when we were designing the Alexa, we said, moiré, aliasing, unacceptable.
Film has no moiré and no aliasing.
How do we match it, right?
And the R&D guys were given a task:
we don't want to see any moiré or aliasing.
They weren't given a budget.
And so they came up with an optical low-pass filter that is incredible.
It's an organic multi-layer crystal that essentially means you will never see any moiré or aliasing on our sensor.
But it costs about five times as much as the piece of quartz glass that everyone else uses on their sensor, which, again, would be considered good enough.
But for ARRI, that's not good enough.
And so that's why... So, okay, yeah, we could put a piece of quartz in front of it.
We could take off the Peltier cooling. But it's like taking pieces off your Porsche to get the price down.
At some point, it's not a Porsche anymore.
You've just kind of, you know, taken all the bits off.
But that doesn't mean that, you know, we don't understand why people want that, right?
So it's very tricky, you know, taking that DNA of our brand. And my favorite story on this topic:
the other day, I talked to a customer that said the same thing.
And I said, well, what about the temperature range?
And he didn't know what I was talking about off the top of his head.
And I said, well, the temperature range.
And he goes, yeah, yeah, yeah, well, what do you mean?
And I said, let me explain.
When you're building a camera, you can pick from two different books, as our German colleagues sometimes call it.
There's the book where you can get all the parts that are military, industrial grade, right?
And those parts can work from negative four degrees Fahrenheit.
They're guaranteed to work, I should say, from negative four
degrees Fahrenheit all the way up to 113 degrees Fahrenheit. All the parts in that book: your screws,
your brackets, your connectors, your pieces to make a camera, right? And I said, okay, that's great.
And I said, what about the other book? They said, well, that's the consumer book. Right. So,
well, what is that? That's 32 degrees to 104 degrees Fahrenheit. So everything in that book is guaranteed
to work from 32 to 104. Now, some things might work a little bit beyond. But if you're building a camera,
chances are you might have pieces in it that don't work beyond 104 and 32. And so that's
what almost everyone else really uses.
We use these military, industrial-grade, negative-4-to-113-degree parts.
And those cost, on average, twice as much per part, right?
The exact same part, because of its robustness and its environmental hardness,
is going to cost you twice as much for that connector, twice as much for that screw,
twice as much for whatever it is that you're building.
But sometimes that's the difference between being able to shoot and not being able to shoot.
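The "two books" logic reduces to a simple intersection of rated ranges. The Fahrenheit figures are the ones quoted above; the part counts are arbitrary:

```python
# Sketch: a camera's usable temperature range is the intersection of
# every part's rated range, so one consumer-grade part shrinks the
# whole camera. Ratings are the two "books" described above (deg F).

INDUSTRIAL = (-4, 113)  # military/industrial-grade rating
CONSUMER = (32, 104)    # consumer-grade rating

def camera_rating(part_ratings):
    lows, highs = zip(*part_ratings)
    return (max(lows), min(highs))  # the weakest part sets the limit

print(camera_rating([INDUSTRIAL] * 5))               # (-4, 113)
print(camera_rating([INDUSTRIAL] * 4 + [CONSUMER]))  # (32, 104)
```

One cheaper part and the whole camera inherits the narrower 32-to-104 window, which is the point of paying double for every screw and connector.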
My favorite story was last year, I think it was last year, last summer.
we had a customer that took the Alexa 35 and they took another camera, which shall not be named,
to the Willow Springs Racetrack in California.
The exterior temperature that day, by the way, was 116 degrees Fahrenheit as the high.
All right.
So they're out there and they've got to shoot cars.
They're being paid to be out there 8, 12 hours, whatever it is.
The other camera booted up for five minutes and was done.
Cooked itself, wouldn't turn back on; they couldn't shoot.
Alexa 35, eight hours, never skipped a beat.
So that was the difference between being able to get the shot
and having to say, essentially, sorry, we can't shoot today.
We're going to have to go back.
The camera doesn't work.
So when we talk about how would you bring, you know,
something that's more affordable, the question is always,
okay, what do you compromise?
And again, for some customers, they might say,
well, I only ever shoot in my air-conditioned studio.
That's great.
But if we're going to put an ARRI logo on the outside of whatever we make,
we don't want to have to say, well, that's the ARRI
that can't go outside when it's very hot
or very cold. It becomes
a bit difficult. You know, you can
kind of understand. It's always about,
you know, give and take, you know, when you're
building a product, especially such a complicated one.
Yeah. Well, and, you know,
I hadn't really thought
of it from that direction, of just, like, your
cameras tend to be, you know,
essentially bulletproof. Obviously,
the look is great. Obviously,
service
and customer service is a big thing.
But the second you release, you know, an Alexa Consumer, whatever,
you don't get to have that conversation that you would have
with the people who are renting from you, or rental houses, you know,
the face-to-face, like, oh, no, you're not supposed to do that,
which obviously doesn't happen too much.
But some kid on YouTube is going to be like,
I bought this Alexa Consumer and it sucks.
And then now you're like, I can't put that fire out.
Right. That's what you asked for. Exactly, exactly. Yeah, yeah, it's hard. It's like, yeah, when you start,
it's like, oh, well, we can make it cheaper, let's take out the ND filters. But last I checked,
people like the built-in ND filter, you know. Exactly. I would not buy it. Yeah. Right. So it's like,
yeah, it's very much that analogy of, at what point have you taken so many pieces off that you just say,
you know what, sorry, there's probably something cheaper that's better, and you shouldn't buy ours,
because it's so, you know, handicapped, in a way,
where you've got all this stuff missing, you know, kind of thing.
So it's really, really tricky.
But again, it's not like we don't understand the reasoning.
It's just, you know, easier said than done, right?
Very much easier said than done.
So there's a lot of used Alexas on eBay right now, which is, unfortunately, a sign of the
times here in L.A. But, you know, for a lot of people it could be, if you're trying to
step up as an owner-operator. Who, I'm going to say before anyone gets any ideas,
this is only for people who are currently working.
Buying an Alexa will not get you a job.
It might, but it'll be for $200.
If you buy one of these bodies off eBay, do you guys still service them?
Like, you know, I send it in and I'm like, this is mine now, and I get the same type of service?
Totally, totally.
So we addressed that.
Like, when I started an area, we didn't have any kind of what we called CPO program.
We still serviced stuff.
But there's like, we took a multi, you know, forked approach.
So yes, people still are by, used Alexa, client.
classics from 2010.
And one thing that's unique about us is we still service that 15-year-old Alexa Classic.
You can bring it in, get a new sensor if you really want to (probably more than the camera's
worth, but you can), new connectors and all of that kind of stuff.
And by the way, one thing that's also unique about ARRI service: it's global.
With all of our competitors, if you buy a product, it's only good for the country you bought
it in.
Well, what if I buy an Alexa 35 and I'm shooting in Brazil,
and I have an issue or I damage my camera?
I can call ARRI Brazil, and I can have them service the
camera under warranty, no charge to me. Or, of course, if it's, you know, accidental damage,
I can still have, you know, our service guys down there service that as well. And it's not,
it's not a regional warranty or a regional service network. We've got a global service network,
which is a really big deal. So what you can do is kind of a two-pronged approach. If you want to
buy any old ARRI camera out there on the market, the first thing, obviously, is I would recommend
checking with us that it's not stolen. We always record the serial numbers of cameras that are
stolen, so that if they are ever brought in for service for any reason, we can report it to
the appropriate authorities, which I think is obviously good peace of mind. And our service
centers will do different kinds of what we call service evaluations. They'll do ones that are
very basic, like just the sensor. So let's say, for example, maybe you don't have that much
money, or you feel like the camera is in pretty good shape. You could just have the sensor evaluated
for a few hours of our labor charges, which I don't remember off the top of my head, but they're like
a high-end car dealer's, you know, maybe 175 an hour, something like that. And they'll
look at the sensor and they'll tell you, look, this sensor is good, doesn't have any dead pixels or
it looks good. But then you can also take it a step further, which, you know, I would of course
recommend if you can spend a little more. And we have what we call our full service evaluation.
And that's where they'll take the entire camera apart, literally look at every connector, you know,
and say, okay, this is, you know, there or there's some dust and dirt we cleaned, all that kind of
stuff, but that's, of course, a much more involved, you know, eight-hour service trip for the
camera. So it gets a little more expensive. Then they'll give you a report and they'll say,
this is what, you know, it's good. This is this. We clean this. We did that. And then if it passes
the certification or even if it doesn't and you have to get some repairs done, once they give it that
clean bill of health, they will actually be able to offer if you want to buy the extended warranty
to go with the camera for the one-year extended warranty. So we offer always a one-year
extended warranty that you can just keep buying for as long as you want. So every year, you can buy
another year, another year, another year. But obviously what we also do now, which we didn't have
before I started, was a CPO program, certified pre-owned. So what customers can do is they can
trade in their older area camera toward a newer one. We would take it. We would recertify it,
you know, clean it up a bit. Obviously, they're not going to be like brand spanking new. And then
we give it the same one-year warranty as a new camera. So that's basically
built into the CPO cameras, compared to just finding one and bringing it in, kind of thing.
That's awesome.
That kind of dovetails us nicely into the 35, which is: you guys obviously had over a decade,
decade and a half, of feedback, use cases from everything, you know, from the top-tier films all the
way down to whatever documentary happened to be able to afford an Amira at the time.
So, you know, you get all those notes; what were kind of the main concerns making the 35?
Like, why make a new sensor this far in?
What did you change with the body?
You know, what were the advancements?
And a lot of it was amalgamated feedback from all the different camera models.
Like, for example, you talked about maybe having another Amira, an Amira 2.
It's probably one of the bigger questions I get a lot: where is the Amira 2?
But this kind of touches on that, this whole story.
And then obviously, a bunch of other models were integrated into the 35.
But to start with, obviously, the first thing that we heard from many customers is that they wanted to have a native 4K sensor in the Super 35 imaging circle.
So for those of you also who remember, we had, of course, the big Alexa LF in 2018, which was 4.5K, but you had to use the entire large format sensor to get to that.
Then we had the Mini LF, which is the same sensor in a smaller body, more limited in frame rates, but also
4.5K. But we had a lot of customers say, that's great, but I would like a 4K Super 35 sensor.
And obviously, we looked into this over many years. Go ahead.
Well, I was just going to say for people who are listening who don't know, like,
you know, if you're newer, you might think, oh, why wouldn't you want full frame?
Full frame better? And the answer is lenses.
Yes, absolutely.
They don't cover full frame. So you want a Super 35 sensor specifically because you want to use a certain
type of lens. Exactly. There's a hundred years' worth of Super 35 lenses that just are not
designed for it, because that was really the only, or dominant, format, other than smaller formats, back in the
day for film. So, yeah, that was a big consideration, of course. And obviously, we could
have said, oh, yeah, all those Super 35 lenses... Yeah, you've watched Blade Runner, right? That looks
pretty good. Or Apocalypse Now, that also looks pretty good. Those are all captured in Super 35 format.
That was just the industry standard for many, many years. You know, I mean, they played around with
larger formats, but on film, it gets bigger, heavier, and more expensive a lot faster
than it does on digital.
So the LR.
Yeah, and 16.
I pointed at a bunch of 16s instead of 35s.
That's wrong.
Anyway.
So the first thing, of course, was to try to get, or to get, I should say,
a 4K Super 35 sensor,
so we had the resolution to meet the UHD mandates that people were having, you know.
And then also, obviously, to offer it in a camera body that was pretty compact.
So when we were designing the sensor, we knew, okay, we want it to be a Super 35,
at least 4K, sensor, but we want to make sure it's nice and tall for anamorphic shooting.
So our Super 35 sensors tend to be the largest.
I think a lot of people hear, oh, it's only Super 35.
But if you look at the dimensions of an Alexa sensor, like the Alexa 35's ALEV 4, they're quite a bit
larger than Super 35.
We do that for a few reasons.
One reason is anamorphic:
you want four perforations in height, which is a much taller imaging area.
But also, we want to have a little extra room for our look-around.
That was another big thing we carried over from film cameras to the
Alexa. Probably the most loved feature, that converted every operator, was that on a lot of film
cameras, you could see outside the frame you were recording, through the ground glass.
On the Alexa, we have what's called surround view, and we still have it, where you can see
a box outside of the recorded image, so you can see a light pole coming into the shot, a
light stand, or a boom pole from the top.
So that was a key reason why we always have a little bit of an oversized sensor as well.
And when we want to use the whole sensor, we call that open gate mode.
You might have, you know, heard that a couple of times as well.
So we wanted to make sure we had 4K Super 35, and we ended up with a 4.6K sensor, a little larger
than Super 35, but 4K in the Super 35 area.
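The oversize-sensor arithmetic can be sketched like this; the photosite counts are assumptions for illustration, not published sensor specs:

```python
# Rough arithmetic sketch of surround view: record a 4K window from
# a slightly larger sensor and the leftover photosites become the
# look-around margin outside the frame. All counts are illustrative
# assumptions, not ARRI's published dimensions.

SENSOR_W, SENSOR_H = 4608, 3164  # assumed open-gate photosite counts
RECORD_W, RECORD_H = 4096, 2304  # assumed 4K 16:9 recording window

margin_x = (SENSOR_W - RECORD_W) // 2  # look-around, left and right
margin_y = (SENSOR_H - RECORD_H) // 2  # look-around, top and bottom
print(margin_x, margin_y)
```

Those few hundred photosites per edge are what let the operator see the boom pole before it enters the recorded frame.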
But obviously it wasn't just about resolution.
First thing that customers asked for was even higher image quality, of course.
So we were already at 14 and a half stops.
So we decided we would, you know, push even further to get the 17 stops of the ALEV 4 in the
Alexa 35.
And another thing people asked us for is improved low light performance.
So the ALEV 4 is a much cleaner
sensor than the ALEV 3, obviously a much newer sensor, which you kind of expect, but it has much
better low light performance compared to its predecessor. So all of that was defined in a great
sensor design, also a very fast sensor too, to be able to have very minimal rolling shutter and
high frame rates. That was a key aspect, you know, coming off of some of the slower sensors
from large format, you know, from various cameras. But one of the key things with the 35 is we wanted
it all in one body. So the idea was there would no longer be this big, full-
size, you know, what people would call studio or standard Alexa, as we call it full-sized body,
and a Mini. The idea was to cram as much from the big Alexa as possible into the literal
smallest form factor possible. And the 35 is not much bigger and heavier than the Mini. And
the 35 has an advantage that I talk about a lot when I'm talking with customers coming from
the Mini, which is that even though the Mini is physically smaller and a little lighter, it has no
way of attaching a battery. So you either need to rig it out to attach a battery, or you need to wear a
battery in a belt or battery in a backpack, but on the 35, you can slap the battery right on the
back, and that's part of its superstructure, part of its design that we said it needs to take a battery
on the back, like an Amira or an Alexa full-sized body. And so that defines what materials you can
use. You can't use carbon fiber for that, because it's not structurally rigid enough to be able to
hold the heavy battery on the back. And some people grab and operate with the battery. So with the
35, it was like trying to get the best of everything. So we even looked at the Amira aspect, and we
have that audio module. I don't know if you've seen it, but it sits on the back of the 35 and that gets
us our audio inputs, including digital mic, you know, phantom power, all of that. But we don't want to
build it into the camera. I mean, you know, I guess we could have made a variant, but that's also kind of
restrictive. So we decided to make it that removable module: with three screws, you can put it on,
then the battery plate, and it kind of just sits, literally an inch, basically, in between
for audio recording. So the idea with the 35 was to make it much more of a Swiss Army knife, you know,
kind of a do-it-all in the smallest body and still get your 4K ARRIRAW 120 frames,
your lens control, your independent outputs.
That's another one.
I know once you start using independent outputs on an ARRI camera,
you can't imagine going back to other cameras where you're forced to see the exact same
status information on all the outputs or the same look file.
Like that's another thing, right?
So we understood very early on that it doesn't matter how many SDI spigots you stick on
the camera if they're all the same signal. It just doesn't do anyone any good, because your
client sitting over at video village from Coca-Cola doesn't want to see all the focus puller's information.
They just want to see the image. But the focus puller needs to have all their information. And the
operator on the viewfinder, that needs to be separate, because the operator doesn't want to see the focus
information. So on the 35, we have three independent outputs: the 12G SDI one, which can do 4K 60 with its own,
you know, tone mapping for HDR, SDR, status information, false color, focus
peaking. We have another 12G SDI 4K 60 output with its own processing. And then we have
one for the viewfinder. And so obviously that adds a lot of cost and processing power when it
comes to how you're selecting your processor for the camera. But we wouldn't have it any other way
because that's what on a big set people expect. They don't expect to have to take one output
path and everybody gets everything. It's when you start stepping up that you realize,
okay, this is designed to make everybody on set happy and efficient. It's not, oh, sorry, I have to
apologize. I know you don't want to see that and you don't want to see this. You want to give
everyone the best chance and the best tool set. And I think that's what we're always pushing
for in our products, you know, is not just literal pure specs on how many connectors does it
have. But what does it really do for your onset experience? And that's something that, you know,
like, you know, some cameras have limitations. One of my favorite ones was, there's a relatively recent
camera: when it's outputting UHD, you can only output log. And you're like, oh, okay, but, like,
you know, that doesn't really work, because the client, you can't explain. Okay, well, then you might
say, well, oh, but that's okay, I'll put a LUT box in between. Well, you've got to make sure it's the
right LUT. You've got to make sure it's turned on, that it hasn't been unplugged. You've got to add it to
the rental, which means it's going to cost you more money. So now, even if that camera was actually
cheaper on paper, you added the LUT box and all the extra trouble. At that point, you know,
maybe I would have paid $2,000 more for that camera to have that built in.
So we're always trying to, you know, for better or worse, tackle all the problems.
And, you know, sometimes that obviously results in, you know, things getting expensive quickly.
Yeah.
There's a reason.
For sure.
I mean, you guys, I know I'm keeping you over a little, so I'll let you go here.
Oh, you're good.
You're good.
I know you guys, you know, obviously added more dynamic range.
I saw it certainly as, even when it was first announced, like, oh, they've just taken all the good
stuff and put it in a Mini body. Because I knew when you guys launched it, I remember, maybe I'm
making this up, but I feel like I remember there being ad copy about how this was, oh, you like drones,
or, you know, Steadicam? Here's this specialty cam. And everyone went, like, nah, I'm just
gonna put it on a shoulder now.
Totally, totally. Yeah, yeah, all the pictures were like that, all the
advertising. They're all with Freefly, when the Movi was new. Yeah, exactly, the guy on the skates.
Yeah, I think that was one. Yeah. So, uh, but, you know, 15, 16 years,
you're probably going to notice something about the last sensor
that you're like, ah, fix that.
So what went into the ALEV 4 sensor that you consider to be an upgrade
from the original?
Yeah.
So with the dynamic range, the push was to go even higher.
So what we looked at was one stop in the shadows, which is what we got.
We have one more full stop in the shadows.
So that really opens up what you can grade for.
And then one and a half stops in the highlight.
So two and a half, again, I want to say, ARRI stops.
Yes, conservative stops, measured using the same very precise measurement system we use with the older cameras. So, you know, you're using that as the reference point. And the big reason we wanted to go to 17 stops and push the envelope is that we're always making cameras that are future-proof, right? When the Alexa came out, my favorite thing to think about was: in 2010, you couldn't see 14 and a half stops on any display. There was no display that existed in 2010 that could show 14. Even a theater display was,
you know, maybe 11 stops of dynamic range, yet the camera captured 14 and a half.
We didn't have OLEDs. We didn't have HDR MacBooks or iPads or even, you know,
HDR iPhones these days. So you always had to compress it down.
And so that gave you a lot of room to play with in terms of dynamic range where you want to allocate it.
But now we have these really nice LG OLED TVs, Samsung QLED TVs, MacBook pros, iPad pros.
And those all have pretty good contrast ratios and brightness. And so you're getting
two million to one contrast, 1,300 nits brightness, 1600 nits brightness. Now you have displays that
can show theoretically about 14, 14 and a half stops of dynamic range. So now all those images
are still future-proof. What I love is being able to say you can take Log C footage that
you shot in ProRes 4444 2K from your Classic, throw the ARRI HDR LUT on, and it's ready to go
and it looks great. But now the displays are catching up, right? So what are the displays going to be
like in 2030? Five years from now, they might be hitting 16, 16 and a half, who knows
how many stops of dynamic range. So we always want to create a camera sensor that will be future-
proof. It's not like, oh, here's a new sensor every two years because we didn't get that one
quite right, and this one's better at this or that one's better at that. So the idea was, with the
35, that we really were shooting for that 17 stops that we got, which is, again, a big improvement
of two and a half overall stops with it. And then low light performance. The older
sensor architecture, obviously coming from 2010, had nice big pixels. And big pixels are generally
or big photosites, I should say, technically speaking, the photosites on the sensor. Bigger ones allow...
Swinging out of left field to get you. Yeah, right. Exactly. They're pixels when they're debayered,
but they're photosites when they're on the sensor. So with the photosites being bigger,
that was a big advantage back in the day for the Alexa. That's why we also picked those bigger
photosites: we felt it was more important at the time to have a clean, true native 800
base than a very noisy base ISO with more resolution. So those bigger pixels defined
how many we could fit in a Super 35 sensor, and that's why we could only do 3.4K with the older
ALEV 3 sensor: because we were prioritizing quality of pixels over quantity. With the ALEV 4,
obviously we need that 4K, so we had to shrink the photosites down to get to that
density. It's very simple math, how many can you fit in that
area of the sensor. But even though they're smaller photosites, they're actually better at gathering
light. They're more efficient, because the area of circuitry that captures light is proportionally bigger in the
photosite. When you're building a photosite, there are other pieces to it. And so basically
it has a better, you know, fill factor there for light gathering. So it actually means that not only
are they better, they're quite a bit better as well, despite being smaller. And then we can even
do really fancy stuff too. I don't know if anyone has played with what we call the ES mode in the Alexa 35,
but it's called enhanced sensitivity.
And that's kind of not necessarily just the sensor.
It is the sensor, but it's also the processing back end.
But what it is is, we had a lot of people say, oh, we want the dual native ISO thing.
Right?
Well, you can't do that.
It's fundamentally different than how we're reading the sensor.
But how can we give you better low light performance?
And the engineering guys, really smart.
They said, well, we could use what we call black frame subtraction.
Well, what's black frame subtraction?
And I said, ah, this is a method
where, okay, let's say, for example, you're shooting 24 frames per second, right?
For 1/48th of a second, the sensor is on, capturing an image at a 180-degree shutter,
1/48th of a second.
And then the sensor turns off, or blanks, to give the appearance of, essentially, a mechanical shutter
moving over and the film being advanced.
Otherwise, now we're married.
Exactly.
Exactly.
So 1/48th of a second on, 1/48th of a second off, 1/48th of a second on.
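The on/off cadence he's describing follows directly from the standard shutter-angle formula; a quick sketch (plain arithmetic, nothing camera-specific):

```python
# Exposure time from frame rate and shutter angle: the fraction of each
# frame interval that the sensor is actually "on".

def exposure_time_s(fps: float, shutter_angle_deg: float) -> float:
    """Seconds of exposure per frame."""
    return (shutter_angle_deg / 360.0) / fps

frame_interval = 1 / 24                 # one frame at 24 fps
on_time = exposure_time_s(24, 180)      # 1/48 s with a 180-degree shutter
off_time = frame_interval - on_time     # the blanked half of the cycle

print(on_time, off_time)  # both ~1/48 s
```

At 180 degrees the on and off halves are equal, which is what makes the blanked interval long enough to capture a matching noise frame.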
So the engineers are like, well, maybe what we'll do is we'll take the black frames,
between exposures that have only noise, right?
They're just black frames.
So the only thing present in their image is the noise,
like a map of the noise.
And then in the next actual recorded image
that is captured of what you want,
we use the map to subtract the noise from the next image.
And we can only do that because our sensor is so fast.
And we can only do that because the processing architecture
can process that noise map and remove it.
So on the 35, when you shoot an ES mode,
you get incredibly clean low light performance.
And we did some tests, and we said, well, 2,560 ISO, ASA, or higher is really where you need it.
So we enable it in the camera at that point.
And there's only two limitations.
One limitation is it halves your frame rate, right?
So instead of 120 frames at 4K, it's 60 frames at 4K.
Again, if you're shooting in low light, probably you don't shoot more than 60 frames.
And the only other thing that it restricts is you can't go wider than a 180-degree shutter.
You know, because obviously you need the same amount of exposure time on both sides for the noise to be the same.
You can go skinnier.
So if you want to do a skinnier shutter for a little less motion blur, a little more staccato effect, you can.
But, you know, again, shooting in low light, you probably want to keep that shutter at 180 to let the most light in.
So those are the only two restrictions, and that allows the 35 to actually outperform, in some ways, some of the dual native cameras.
Because those are really only best at those two values.
Now with the 35, you can say, uh, 2,560,
3,200, 4,800, 6,400 even, wherever you can push the camera up, and then you'll just
see a little ES.
But just note that that is baked into the ARRIRAW or the ProRes, so you do need to decide
on it during the shoot, you know, when you're doing it in the camera, essentially.
So keep that in mind. I do always say that, you know, if you want it, do a test and make sure,
whether you're shooting in ProRes or ARRIRAW, that the ES mode is what you want when you're doing
low light.
Well, first of all, that's fucking brilliant, because it's basically just
24-frame-per-second black shading.
Which
Yeah, in a way, exactly
Yeah, exactly
bouncing that fast
Yeah, yeah
But two
Um
Note that
Oh, I was going to say too
I'm really easy to please
Like whenever people do
Oh, I need 8K 120
It's like I shoot 24 frames
180 shutter
I very, you know
Even when I do slow-mo
I have a C70
and a C500.
But anyway, on the C70, it'll do 120.
I use 48.
At a certain point, it's like, take a photo. It's basically a still.
It's still enough.
Yeah, yeah, yeah.
But even 60 can get a little like boring.
It totally depends on the subject matter.
Yeah, there's some moments where you absolutely need it.
Action sports.
Yeah.
And that was actually one of the decisions why we, you know, when we were building the
Base camera, which I was involved with, the Alexa 35 Base, I was coming up
with the concept, is that when we talked to a lot of customers, they said they rarely
shoot over 60 frames. So that's how we were able to decide, when we were building the Base and coming
up with the concept, the idea of restricting some of those specialty features to licenses to help
get that price point down, right? So we kind of identified, well, anything over 60, up to 120, that's a
specialty thing, a high-frame-rate thing. So that'll be its own little license. And then we did
the same with, like, ARRIRAW. And we did the same with the open gate anamorphic. Because when we
talked to most customers, they said 24 frames, 16 by 9, 4K Pro-
Res, 60 frames tops. So it's, in a way, distilling down the same ARRI image quality, because it's the same
sensor and the same hardware, but not having to pay for things that you don't use or you don't need
or pay for them when you need them, or buy them perpetually. I know early on there was this post that said,
oh, it's a subscription camera. And, you know, that's not the case. Yes, you can buy the temporary
licenses if you would like to spread it out and you don't want to ever, you don't want to own a
specific feature. But you'll always be able to buy the perpetual licenses, just
like with any of our other cameras.
It's just a matter of what you need.
Right.
Well, and candidly, obviously, I'm sure you saw, whenever people bring up any type
of either subscription or rental or whatever, there's a lot of, like, angry, angry people.
But weirdly enough, I was one of those exact people that you're talking about, where I was like,
wait, I don't use any, maybe the anamorphic.
But other than that, I was like, I don't need any of those.
It's still out of my price range,
but it was like, still, like, well, and then
the same thing, like, if I'm on a gig that's
gonna use anamorphic and I own
a 35 Base,
I'll just charge, like, production.
Yeah. Because I'm,
as DPs, as owner-operators, we're always doing mental math
of what can we discount to make sure we get the job
and shit. So if I own the thing,
I'm gonna charge them for some, and then I'll just full charge
for whatever the anamorphic
add-on or something. So I actually
was on you guys' side on that one, even though
I very rarely use your cameras.
That's great to hear
I'm not on that level, you know.
Yeah.
That's another thing too
I think that we really worked hard on
Was trying to make sure
That the 35 base was
Configured in such a way
That was more friendly
To even documentary shooters
With the accessory sets
And trying not to put everything
In the kit
You know
Sometimes I think our you know
Previous inclination is
Oh the rental house
They need every bracket
And every piece
And it was like
Let's make these base sets
As lean as possible
And keep that price down
just what you need. And so I think it was a step in the right direction. In my mind, it's a
spiritual Amira 2 successor, right? You know, I'll say it very candidly, there's no Amira 2 waiting
in the wings. It's not like, I have one right here, we're going to pull it out, and here's
what you've been waiting for. Because, again, we kind of see the 35 that way. I mean, the biggest request, when I did
my research for the Base, is when we talked about an Amira 2, I said, what do you mean by
Amira 2? And they said, well, it should be lighter and smaller. And I said, well, how much lighter and
smaller? And they said, I don't know, like, I don't know, 20, 25%. Well, it turns out, with the audio
module, the 35 is 20% lighter and about 20% shorter than an Amira. But the audio module can be
removed when you don't need it. So actually, in a way, you actually have kind of an even
smaller camera when you don't need it. So we are, you know, and of course, there are specifics.
There are long-time Amira owners, sports guys especially, that love the user switches
and the user buttons. And we just don't have room for that on the 35, because it's a smaller
body. I am one of those people. I love customizable buttons.
Sure. Everyone has their own thing that they need to, you know, the one thing that makes them feel
safe. You know, for me, it's not even, you know, EL zones being on a button, ND on a button.
Yeah. Yeah. Yeah. That's about it. You know, some people want focus peaking, though. You know,
some people want the waveform. Totally. And we're looking into it too. We had some customers ask if we
could add, like, a shift function. So the Amira had four buttons, but it had a shift key.
So if you pressed the shift key, you could get to the others.
So we don't have a dedicated shift key on the 35, but maybe there's a way we could do that.
And another idea could be that maybe we do a long press, so that for every user button, the six of them, there are six short-press functions and maybe six long-press functions as well.
So there's there's always things we can do.
Obviously with the 35, the real estate was the issue.
We didn't have as much room for them, you know, as we did with the Amira.
But it's really challenging, right?
Because these days, you know, people seem to have way different needs than what they used to, and less predictable ones.
And when they're buying a camera, they want it to serve a lot more needs.
So it's always a very big challenge when it comes to form factor, because you don't want to pigeonhole yourself in and then have, like, four different camera models.
That's just a nightmare, both for the customer to choose from, but also, from a manufacturing and cost and inventory perspective, how do you do that?
You know, you're trying to reuse pieces, you know, and that's another reason why the base is what it is.
You know, could we have started taking out connectors and saying, oh, this is the neutered Alexa 35 and it doesn't have a, you know, sync connector.
It doesn't have a second SDI.
We could.
But at the end of the day, people would have probably just said, well, you know, it's not really worth the savings.
I'll just buy the fully loaded one if I'm not going to be able to ever...
And that was my idea with the base was really to be an aspirational camera so that, yes, you could walk out for $57,000 with three drives, which is still a lot of money, but you could start shooting at $57,000.
And at the end of the day, if you bought all the licenses, or just the big premium license, you wouldn't pay any more than had you bought the Alexa 35 Premium right out of the box to start.
And I think we've completely glossed over this. The Base model is your baby?
I was one of the primary product managers, yes, exactly. Yeah. So I think it's honestly smarter. When you guys are, I probably should have this up, do you have, like, I assume you have payment
plans for your stuff? Oh, yeah, yeah. So we have financing. So in the U.S.,
ARRI sells through several dealers. And then we also do sell direct, of course, you know,
which we've done for many, many years. And through the dealers, they might have different
financing, but they usually also offer our financing. We have a financing partner here in the
US called Quail Capital. They're based in California. And they will basically, all you need to do is
get a quote from us, get a quote from the dealer for whatever package you want. And I think
they'll do up to five years of financing, so 60 months.
And I think it's entirely based obviously on your credit.
How many months?
Do you want to put a down payment, you know, you want to put like 10 grand down?
So, yeah, it's these days, a lot of financing options.
The rates aren't as good as they used to be, but they're competitive still.
You're not going to, you know, spend any more than you would with the equivalent, you know, auto loan or something like that.
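As a rough sketch of what financing like that works out to, here is the standard amortized-loan payment formula with purely hypothetical numbers: the $57,000 package price mentioned earlier, an assumed down payment, and an assumed APR. These are not terms from ARRI, a dealer, or Quail Capital.

```python
# Standard amortized-loan monthly payment. All figures below are
# hypothetical examples, not actual financing terms.

def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    r = annual_rate / 12  # periodic (monthly) rate
    if r == 0:
        return principal / months
    return principal * r / (1 - (1 + r) ** -months)

# e.g. a $57,000 package with $10,000 down, financed for 60 months
# at an assumed 9% APR:
payment = monthly_payment(57_000 - 10_000, 0.09, 60)
print(f"${payment:,.2f}/month")
```

Longer terms lower the monthly figure but raise the total interest paid, which is the trade-off behind the "how many months, how much down" questions in the transcript.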
Sure.
You know, they kind of just are what they are.
Yeah, absolutely.
So, yeah, we do obviously, you know, basically a financing program.
Yeah.
Cool.
Is there anything I didn't touch on that you were stoked to talk about?
I think we could do...
No, you covered most of it. The 65, but...
Yeah, yeah, you could.
Absolutely.
No, I think that was that was it.
That was a great structure.
I was glad we seemed to flow and fill up the time, which is, you know,
always good.
But no, I don't think there's anything else for this one,
unless you think there's any, like, other questions about why or what?
I think we covered a lot.
You really did.
Yeah.
I mean, this could be... So in five years, I've never really had very
much feedback on this podcast.
Okay.
But obviously, a lot of the time it's just hearing DPs talk about their work, and
there's not really, you know, a "hey, can you get them back and ask...", you know.
So maybe this is it.
Yeah, yeah.
We can just be like, hey, if you have any questions, we could do a follow-up in, like, six
months or whatever and just get into a round of questions.
That would be fun to see what everyone asks.
Yeah, yeah, yeah, absolutely.
Let's, yeah, totally plan for that.
Absolutely.
Yeah, six months will be interesting, too, you know, because by then people might even have had a chance
to use the camera on a job and then come back with a question, rather than, you know,
just, you know, a question off the top of their head. So yeah, we're planning to do that.
That sounds like fun. I'd love to do another Q&A. And then next week I talk to my old colleague
from ProVideo, Art Adams. Yes, Art's going to talk to you about the lenses, the thing that
happens before the camera. We talked about, if you will, the back half. But the lenses are just as
important. And I'm sure we'll talk about HDR as well, because I don't think that's common. One of
my things I always have to remember is that you can't get HDR from the camera if your lenses can't do HDR.
And I will talk about that, because if you've got a very milky, low-contrast, low-sharpness, you know, low-resolution lens, the sensor is going to see what it sees, right?
So you've got to be careful what you're putting on the end of that.
Obviously, there's reasons to use specific lenses, period pieces, all that kind of stuff.
But if you want your stuff to look good in 20, 30 years, I mean, think about how lens stuff looks with funky iris starbursts
from the 80s, you know, the broadcast cameras
where you're just like, you know, kind of
thing. And that's burned into the image.
Yes. Steve here at the ASC
was showing us the very
first, they have the
very first anamorphic
adapter. Oh, wow.
That's a flat on the front. Yes.
Right. You know, and I fucking held the thing.
Put it in front of the FX6 and it's
just milk.
And then he went and grabbed
one from about 30 years later,
so this would have been maybe, like, 19...
20, something like that. 1940?
Okay.
Yeah, 40s, exactly.
CinemaScope.
It was a CinemaScope.
That same thing, though, it just turrets onto the front.
And it was like, exactly like you're saying.
It was like going from SD to HD.
Yeah, exactly.
Oh, my gosh.
Yeah, exactly.
Yeah, it's a big consideration.
Lenses, it's, again, I think maybe we sometimes, I like the word,
but at ARRI, we like to use the word future-proof, right?
We don't make tools that aren't future-proof.
And the lenses are another big part of that;
they need to be future-proof.
I mean, people are still using, you know, ARRI Zeiss Super Speeds from the 80s.
Those were very good lenses at the time.
Look at them today.
We say, oh, well, they're not as good.
But even, you know, Master Primes, that Roger Deakins is a big fan of, those were released in 2005.
We didn't even have it, you know, we barely had the D-20 in 2005.
And those lenses were the best that you could make optically, right?
With lenses, it's always that thing of, you know, they're going to last for 20, 30, 40 years of cameras.
You want to try to make them the best they can be to fit onto the next generation of
camera formats. You don't want to start with a very
poor lens and say, I'll keep this for 30 years.
It's, you know, kind of
covering to it. And it's something
that I've mentioned a lot,
which is
the reason that lenses from the
past are still so
relevant is because
for the longest time, you know,
film wasn't that
high-resolution, and especially digital sensors
were not. So you wanted sharper and sharper,
higher-resolution lenses. Yeah. And
And then the second you guys came around, and RED, and, you know, maybe Panasonic, and then Sony, obviously, later, now you can see everything.
So people were like, whoa, tone that down.
So people want detuned lenses now, which is, you know... but it is exactly the point at which digital became a thing that the need for sharpness in lenses started cratering.
100%.
We never heard that Master Primes were too sharp on film, because film was such a low-resolution medium.
You wanted that contrast and sharpness to be recorded, to punch through that analog film.
But the minute, even on an Alexa, which is 2.8K, let alone a 6K camera, you start putting Master
Primes on, you go, oh, my God, I can see their pores. I can see this. And you go, it didn't
look like that on film. Well, it's a different medium. The lenses, the Master Primes, were built in 2005.
They were designed for film. They were not designed for a digital sensor in terms of their
MTF and their sharpness curves. So they can be a little unforgiving. And people want to put
filtration or something. Yeah. Yeah.
Absolutely. It'll be a fun one with Art.
Yeah. Well, I'm getting a call
from Joey from Tested, so something
has probably gone wrong.
But I will stay in touch and
we'll do this again in six months. I'll gather questions.
Sounds like a plan. All right, Kenny. Thank you so much
for having me on. All righty.
Take care. Have a good one. Bye.
Frame and Reference is an Albot
production, produced and edited by me,
Kenny McMillan. If you'd like
to support the podcast directly, you can do
so by going to frameandrefpod.com
and clicking on the Patreon button.
It's always appreciated, and as always, thanks for listening.