CppCast - C++ in the Visual Effects Industry

Episode Date: February 18, 2016

Rob and Jason are joined by Paul Miller to discuss C++ in the Visual Effects Industry.

Paul is a partner and lead engineer at Digital Film Tools/Silhouette FX. He has been writing visual effects and image processing software for over 20 years, and has been using C++ for most of that time. He started his love of graphics and digital music on the Amiga in 1986, teaching himself C with K&R and the Amiga ROM Kernel manuals. In 1992 he ended up in Wisconsin, writing software for the relatively new digital post production industry on Silicon Graphics workstations, and has been writing widely-used tools for that industry since. He uses Qt for cross-platform UI, Python, OpenGL, and OpenCL extensively. He holds a private pilot's license and enjoys going to movies and beer festivals.

News
- JavaCPP
- A bit of background for the unified call proposal
- Natvis for C++/CLI Available to Preview in VS2015 Update 2

Paul Miller
- @fxtech_paul

Links
- Silhouettefx
- Photo fx (iOS App)

Transcript
Starting point is 00:00:00 This episode of CppCast is sponsored by Undo Software. Debugging C++ is hard, which is why Undo Software's technology has proven to reduce debugging time by up to two-thirds. Memory corruptions, resource leaks, race conditions, and logic errors can now be fixed quickly and easily. So visit undo-software.com to find out how its next-generation debugging technology can help you find and fix your bugs in minutes, not weeks. Episode 45 of CppCast with guest Paul Miller, recorded February 18th, 2016. In this episode, we talk about C++17's unified call proposal.
Starting point is 00:00:54 Then we talk to Paul Miller from Digital Film Tools. Paul tells us about his experience making C++ tools for the visual effects industry. Welcome to episode 45 of CppCast, the only podcast for C++ developers by C++ developers. I'm your host, Rob Irving, joined by my co-host, Jason Turner. Jason, how are you doing tonight? All right, I'm kind of anxious to find out if I've been accepted to speak at the next CppCon. Or C++ Now, excuse me. When are announcements going out for that?
Starting point is 00:01:46 I believe the 22nd, so in four days. Okay. We'll have to reach out to all those speakers once that announcement goes out. Yes. So just really quickly, I wanted to mention that this is, I think, our one-year anniversary to the day since the first episode I recorded with John Kolb. Wow. Yeah. To the day. I didn't realize that. I believe it's to
Starting point is 00:02:06 the day. It was either today or yesterday. I checked the date and then forgot what day it was. I was checking the date. But this is a leap year. Does that affect anything? I'm going to pretend it doesn't. But I think that's pretty good. 45 episodes in one year. We didn't quite
Starting point is 00:02:22 hit one episode a week, but we came pretty close and hopefully we'll do a little bit better next year with less moving and relocating and all that sort of stuff. I still think we did pretty good. At least you did until I started joining you with it.
Starting point is 00:02:39 Anyway, at the top of every episode, I like to read a piece of feedback. This week Steven sent us on Twitter a message saying, any chance we can get an interview with some of the Lumberyard devs on the show? And he had this link to Amazon Lumberyard, which is a game engine. It says it's a free AAA game game engine deeply integrated with amazon web services and twitch and it has a c++ sdk so i had not heard of this before but uh it definitely sounds like
Starting point is 00:03:14 an interesting project and i will have to see if we can find some of the developers for this and see if they're interested in coming on the show uh I'm not sure if there's any current games already using this project. It was just announced this month, and I don't really know the history of it. So I guess there's probably not any games out yet, or maybe they made a few sample ones. Yeah. You would think. As always, though, if any of our listeners know any of these Lumberyard devs,
Starting point is 00:03:44 send them our way. Yeah, definitely helps to have them come to us Always, though, if any of our listeners know any of these Lumberyard devs, send them our way. Yeah, definitely helps to have them come to us because it's not always easy to get in touch with developers at Amazon or Google or companies like that. So if you know someone, please send them our way. Anyway, we'd love to hear your thoughts about the show as well. You can always reach out to us on Facebook, Twitter, or email at feedback at cppcast.com. And don't forget to leave us reviews on iTunes. So joining us today is Paul Miller. Paul is a partner and lead engineer at Digital Film Tools Silhouette FX. He has been writing visual effects and image processing software for over 20 years and has been using C++ for most of that time. He started his love of graphics and digital music on the Amiga in 1986,
Starting point is 00:04:27 teaching himself C with KNR and the Amiga ROM kernel manuals. In 1992, he ended up in Wisconsin writing software for the relatively new digital post-production industry on silicone graphics workstations and has been writing widely used tools for the industry since. He uses Qt for cross-platform UI, Python, OpenGL, and OpenCL extensively. He holds a private pilot's license and enjoys going to movies and beer festivals. Paul, welcome to the show.
Starting point is 00:04:53 Thanks, guys. I'm happy to be here. So how did you end up using the Amiga ROM kernels to teach yourself C? Well, I was in high school at the time. And I mean, in high school back in the 80s, there were no computer science classes. I mean, they were teaching Pascal, sort of. But I heard about this Amiga that was coming out that had, you know, these great high resolution graphics and the sound engine and all that. And I didn't know C, but I had picked up enough information about it to know that it was all written in C. And I didn't have enough money to buy the Amiga yet. So I ordered a copy of the ROM kernel manuals and I just started
Starting point is 00:05:35 reading them all the time. And then I picked up a copy of the Cunningham and Ritchie C book, and I started learning C c on paper and then when i was able to afford the amiga i kind of knew what i was doing at the time so i kind of learned c first and then applied it to the computer when i got it so did the amiga come with a c compiler um no i think that was an extra okay yeah and i don, I mean, it's been a long time, but there was a company that made the compiler, and it wasn't that expensive.
Starting point is 00:06:12 And it ran off of one floppy, because this was before there were hard drives on this thing. Yeah, and then I set up, I had an external floppy drive, and made a dual floppy drive kind of setup
Starting point is 00:06:26 where I had the compiler on one and then I would compile the code down to the other floppy. But yeah, those were the days of, obviously, managing your own memory. I think that was probably the biggest, the most important lesson I ever learned as a new programmer was
Starting point is 00:06:44 if you leak memory, you don't get it back. This was in a Unix-based system, so if you leaked memory, it was gone forever until you rebooted. So I got in the habit of, yeah, there was a tool on there called AvailMem, and it would you'd type it in, and then it would tell you how much
Starting point is 00:06:59 memory was available, and then you'd run your program, and then when it exited, you'd run AvailMem again and find out how much memory you leaked, and then you'd run your program and then when when it exited you'd run avail mem again and find out how much memory you leaked and then you would start tracking down your memory leaks and so i i gained a a huge respect for memory management real very early on in my career as a programmer so c was the first language you learned uh actually i taught myself BASIC when I was in fifth grade on a TRS-80. I think it was a Model 2. And I was in fifth grade at this little school in Virginia, and they got one of these computers in the library.
Starting point is 00:07:38 And I was one of these kids that was always taking stuff apart, and the principal was giving me old tape recorders and things to take apart and stuff. And they had this computer sitting out there, and one day they were like, Paul, go figure that out. And they would send me out during the daytime during class to go just fiddle with this thing. And I taught myself BASIC on that. And then later I got an Atari 400, which was all, and it had another BASIC on it. So I taught myself BASIC through those systems.
Starting point is 00:08:04 Okay, I was going to say that if you learned C first, and it had another BASIC on it. So I taught myself BASIC through those systems. Okay. I was going to say that if you learned C first, then you would be, I think, the only programmer I knew who learned programming in the 80s and did not start with BASIC. Yeah, I think it was pretty hard. I mean, either you start with BASIC or Assembler if you really have a strong desire. I don't know.
Starting point is 00:08:24 I guess B was basically, basically was running on everything right back then. Yeah. Um, but, uh, yeah, I,
Starting point is 00:08:30 I, I learned a little bit of assembly and then luckily the Amiga came along and then just see, just kind of fell on my lap. I never even really kind of settled into learning Pascal at all. I kind of was able to skip that. Fortunately. Good for you.
Starting point is 00:08:43 Yeah. It was fortunate. Yeah. Okay. Well, we have a for you. Yeah, I was fortunate. Yeah. Okay, well, we have a couple news items we wanted to mention, and then we'll start talking to you, Paul. But feel free to chime in with any of these news articles. The first one is a new project called Java CPP. And, Jason, I thought you'd be interested in this one.
Starting point is 00:09:01 They're kind of saying it's an alternative to Swig, I guess, in that it'll generate C++ code for you or Glue code for you to go from Java to C++. Yeah, I read through this a little bit ago when I saw the announcement for it, too. And I got a little lost as to exactly how much work it does for you and how much boilerplate you have to do yourself. It sounds like it has a parser that can do some of it. But it does look like an interesting alternative to Swig anyhow. Because I know I've seen lots of complaints that the Swig bindings that are generated for various languages are not optimal.
Starting point is 00:09:38 I got the impression that you didn't have to do any sort of annotation. It just works. But I'm not sure how much i believe it yeah i like i said i got a little lost because if you look down into the examples on like using uh complex c++ types they show you doing a bunch of like annotations and stuff in your java i don't know if we have maybe we should get one of the developers of this project on here. Yeah, we'll have to reach out to them. It does sound like an interesting project. The next one is a post from Bjorn Stroustrup about the unified call proposal.
Starting point is 00:10:22 And I had not really read too much about unified call proposal, but it was an interesting bit of background for it, how basically you have two possible calling syntaxes, you have kind of the object oriented method or free functions and being able to use one in place of the other. It's very interesting. Did you have any thoughts on this, Paul? You know, I hadn't, I also hadn't't heard i don't really keep up with the standards um proposals um but i i read the article today and i it was interesting i i kind of have some to uh what i've been doing lately i've been kind of designing a new api uh for some of the systems
Starting point is 00:10:59 that i've been redesigning and i come across these exact same kind of things where i want to make it kind of you know reusable from a um you know like a a functional kind of aspect or versus you know having the object-oriented approach to it and i immediately saw the advantages to it for my own projects um and i can i can only imagine i mean they probably came right out of the whole you know stl and and boost and what they you know, the functional aspects that they want to kind of, and making it more extensible. I mean, I also read the comments, too. And, you know, they were talking about, you know, some were, you know, really opposed or really for it. And I thought they were pretty insightful.
Starting point is 00:11:40 I kind of, I think I could make use of it if they went forward with it, for sure. Yeah, I read some of the comments on Reddit as well, and it seems like originally they were thinking of having this unified call proposal kind of work both ways, where you could have the object-oriented notation work as a functional style or have a functional style work in the object-oriented fashion, but they decided we're not going to go from functional to object-oriented because... Yes. Yeah.
Starting point is 00:12:11 Because there were too many complaints about it. Because there were too many complaints about it, basically. But then they kind of put out there, you know, let's do it the other way around, which has arguably much less use. Right. Right. So that kind of confused me. But it sounded like it was harder to implement maybe the other way as well, but certainly that was most of the backlash that I saw was,
Starting point is 00:12:34 no, no, we want it the other way. We want the functional-to-object-oriented mapping. And I would agree. So some of the talks from C++ now from last year are finally getting posted to youtube and so soon hopefully there will be one that was called functions want to be free and it was by one of the guests that we have had on here previously and his whole point is that most of the stl can actually be implemented as free functions, and it greatly simplifies the container logic that has to be implemented. But to really pull that off, you would need the unified call syntax working the other way that is not being accepted in this proposal.
Starting point is 00:13:22 Okay. So I was really actually, I was personally hoping that the unified call syntax would work like everyone else was complaining. I was hoping it would work the other way. Yeah. One of the comments had a really great example of where standard string doesn't have a lowercase function and he wanted, he wanted to write,
Starting point is 00:13:39 add one and it would give you the ability to add that function to standard string and be able to call it as an inline sort of object-oriented function. Right, string. Yeah, exactly. And then be able to chain them together in a very elegant example. And he kind of nailed it. And, yeah, it seems kind of – I don't know where it is. Yeah. I don't know where it is. Yeah, well, one thing to point out is there's one of these comments in here from Bjorn himself saying,
Starting point is 00:14:08 it's not me you have to convince. He was a proponent for going from functional to object-oriented, but there is too much other barriers in his way, I guess, which is a shame. Okay, this next one is coming from the Visual C++ blog, and this one I was personally very excited about, where if you're using C++ CLI and debugging in that realm, the debugging support is going to get a lot better with update to Visual Studio 2015. Currently, there's no good way of looking at, you know,
Starting point is 00:14:48 kind of drilling down into variables and seeing what they look like. And that's going to be a lot better, including with complex objects like vectors. I'm not sure if you guys wanted to really bring up anything else here, but I know I'm very excited about this. I'm guessing you're the only one who uses this feature. I've been spending a lot only one who uses this feature. I've been spending a lot of time in CLI lately. I've never
Starting point is 00:15:10 used it. Anyone else who lives in that realm I'm guessing is going to be excited about this too. I'm excited for you. I have to debug cute objects in Xcode every now and then because which is a pain, because there's no way to visualize them there.
Starting point is 00:15:29 So I feel your pain. Do you use the available visualizers for Visual Studio when you're debugging Qt? Yes, and they're great. Yeah. Yeah, Visual Studio is pretty great with native C++, but for some reason, when you added CLI into it, you would think the debugging support would be even better, but for some reason when you added CLI into it, you would think the debugging support would be even better,
Starting point is 00:15:47 but for some reason it was worse, and it looks like they're finally fixing that. So, Paul, can you tell us a little bit about Silhouette, the project you're working on, to get us started? Sure. I have been doing visual effects software for a long time. I kind of got in on the ground floor when film production went from analog to digital. And this happened in the early 90s.
Starting point is 00:16:12 And back in the day, I wrote a software for doing morphing, which was called Elastic Reality. And this was used quite a bit for morphing. Part of the aspects of it was we had a very comprehensive roto-spline interface where you could use Bezier curves to draw very complex shapes, many of them,
Starting point is 00:16:38 and animate them and track objects in the film frames over time. We eventually found out that people were using it a lot because the spline interface was so good that we're using it to start to cut out shapes from film frames. And this is called rotoscoping, and it's used a lot in movies or TV shows
Starting point is 00:17:04 where they need to extract live-action elements from a background and then layer in 3D elements and things, which is used in practically every movie these days. So later on, 10, 15 years later, I got together with some guys in California that had a production company, and we decided to make our own rotoscoping package, and that turned into Silhouette. And we started that in 2004.
Starting point is 00:17:31 And so Silhouette is basically a fix-it-and-post kind of package. It does rotoscoping, which I've just talked about. It does paint. It does some compositing and keying. And it's one of these tools that is used quite a bit to clean up film frames or extract elements from the scenes, like I just mentioned, or for doing simple keys or extracting a key, like a blue screen or a green screen element. And it's just sort of like a little Swiss Army knife tool for things like that. Interesting. So when you talk about pulling out
Starting point is 00:18:03 things from film, you're talking about like if someone's kind of on flying on wires or something like that, you're taking the wires out of the shot, that sort of thing. Yeah. The paint, um, the paint component of it is used a lot for that, uh, taking wires out, removing rigs from shots where they've got, you know, cranes or booms or cameras or these days. Um, there's a lot, most most most things are shot on blue or green screen and there's tracking markers everywhere that they would you know do a tracking uh operation on which would figure out the camera move in three three-dimensional space and then after that's all done they needed someone has to go through and paint out all those little markers that they
Starting point is 00:18:40 paint everywhere um so that's used quite a bit in the paint part. The rotoscoping part is used more for, it's kind of hard to describe. You guys remember King Kong? Yeah. Okay, so Silhouette was first used in the movie King Kong back in 2005. And in that movie, there was a lot of actors running around in a jungle set with
Starting point is 00:19:10 practical trees and vines and grass and all sorts of things. And then they had to composite in behind these actors King Kong and giant lizards and all sorts of things. So they had to go through frame by frame and remove the actors and the grass and the trees and all that at various layers and isolate
Starting point is 00:19:31 them so they could put the 3D models and things behind them. And then they layer everything back together. So that's the process of rotoscoping. It's pretty time-consuming. We've got tools for doing interpolation across time. I've seen shots where their animators have literally rotoscoped individual blades of grass or snow. They shot, for whatever reason, scenes with falling snow, practical snow, and then they had to isolate everything so they could put computer-generated imagery behind the snow. Oh, wow. And just crazy stuff over the years.
Starting point is 00:20:12 And Silhouette has sort of gotten a reputation for being the go-to tool for that because it's pretty efficient and it's got good memory management and can handle large images pretty well and lots of them. And, yeah, it just kind of turned into one of those de facto industry tools memory management and can handle large images pretty well and lots of them and um yeah just kind of turned into like one of those de facto industry tools for doing things like that so this is not like a real time kind of thing right you like have to set it up and then do like a rendering or something yeah there is a rendering stage usually um uh it's certainly not real time there's a lot of tedious, labor-intensive work
Starting point is 00:20:45 that the artists have to do. And they go through frame by frame and have these shapes that kind of get animated over time. And then when they're all done, they do a rendering process that will go through and save everything usually into an EXR file format, which is a floating point file format. Okay.
Starting point is 00:21:06 I think I've heard EXR in the context of HDR? Yes, exactly. The native format of it is a half float, 16-bit float per channel. And Industrial Light & Magic sponsored that format several years ago, and it's sort of become an industry standard. So does each frame have to be saved off like that?
Starting point is 00:21:28 Yes. There's rarely any movie formats. We supported QuickTime for a while, but that's really fallen out of favor. Everything's going to dynamic range. So usually we're dealing with sequences of images coming in and sequences of images going out. Okay.
Starting point is 00:21:47 Maybe I'm going off into the weeds here, but now I'm thinking individual frames in HD or probably what, 4K or higher or something, right? In HDR. That's got to be really a lot of data. Oh, yeah. a lot of data oh yeah i mean you know the modern movie productions usually have terabytes of just assets you know in terms of all the shots and the the the scan if it's on if it's shot on film you know they have to scan it down to something which is usually dpx or cineon and now exr but a lot of the films are shot digitally now so they just start out with a you know a raw digital frame but then not to mention the you know all the 3D models and the textures and
Starting point is 00:22:25 things like that. But even for a non-visual effects heavy movie, all these shots go through the pipeline as these big frames. I mean, I think 4K EXRs are probably about 20 to
Starting point is 00:22:42 30 megabytes a piece. So per frame? 20, 30 megs? piece. So per frame? 20 to 30 megs? Yeah, per frame, yeah. Wow. And then we've got companies doing shots at 8K. And these IMAX shots, they start out at 16K. And then sometimes they'll scale them down to 8K to work on them.
Starting point is 00:23:02 But it wasn't even long ago when 2K was the standard. Now it's usually 5K. down to 8k to work on them but you know i mean it wasn't even long ago when 2k was the standard now it's usually 5k and then um if they're doing anything in imax is probably 8k so your product is part of the pipeline of processing these images so do you guys have to deal with the terabytes of data are you just dealing with one frame at a time well we we you know handles we bring sequences in and sequences out, but usually those sequences are living on a sand somewhere,
Starting point is 00:23:30 a high-speed network, gigabit network, and we just treat them as files. Okay. In our package, we don't really care too much about the size of them. Although there is the issue
Starting point is 00:23:44 for performance, you want to be able to read them and write them as fast as possible. Um, but sometimes the network is, is usually the, well, usually the network is the, is the bottleneck. Right. Interesting. Very interesting. So I guess you kind of already answered this question before we maybe go a little more technical, but it sounds like this product is used in plenty of movies that the audience would be familiar with. You already mentioned King Kong. Yeah, I mean, another thing that Silhouette's used for
Starting point is 00:24:12 is the whole subject of stereo conversion. I'm sure you've probably seen some movies that were converted to stereo from, you know, well, like Titanic. You mean like 3D stereo? 3d stereo sorry yes yeah so most movies it's very rare that a movie is shot in 3d um prometheus was shot in 3d the hobbit movies were shot in 3d um there's been several other that recently that i that i can't really remember off the top of my head but a lot of them are converted to 3d after the fact so they shoot it in 2d and then they have a they go to a company that goes through frame by frame and rotoscopes
Starting point is 00:24:51 every single object to pop it out yep and then puts it into its own layer and then there's a process that they can uh set up a virtual camera and then sort of it generates the pixels that would be where the, you know, between the two eyes and fabricates two frames out of each frame. And then they stitch everything back together. And then an artist will usually go in and paint back in anything that might have been missing. You know, if you have an object and you look behind, you know, if you look behind it with one eye and the other eye, each eye sees something different. And so at some point you've got to go back and then fill in what was there
Starting point is 00:25:28 so that each eye can actually see what was behind the object. And a lot of times that's just an artist literally painting that thing back in. And so we kind of got lucky because in the dawn of this sort of post-conversion to 3D, we came out with a whole set of features for doing 3D rotoscoping and 3D paint. So version 3 of Silhouette, which was about five years ago now, you could paint and rotoscope on both eyes simultaneously.
Starting point is 00:25:54 And that was sort of a new thing. And right then is when the 3D thing hit. So it kind of became one of those indispensable tools for that. So pretty much any 3D movie you've seen recently that was post-converted went through Silhouette at some point for that reason as well. It's very, very time-consuming.
Starting point is 00:26:14 But probably one of the most well-done movies that was post-converted that got huge accolades was when Titanic came out in 3D. I don't know if you saw that in 3D, but it was stunning what they did to it. It really looked like it was shot that way. And then you have hybrid movies too, like Gravity, where most of that movie was shot
Starting point is 00:26:35 with all CG. And so they literally generated both eyes at the same time for that, except for when they had live actors in it, and then they did post-conversion for those shots. So it's a combination of the two. That's pretty crazy. I never really thought about how that process worked,
Starting point is 00:26:54 although I did hear about it when they started doing the conversions. Yeah, and they started out really bad. I think there was a Clash of the Titans that got really bad. That kind of put 3D back several years because it was just poorly done, and there was just a lot of bad press about that. But then they started getting really, really good. And now it's sort of the preferred way of doing these things.
Starting point is 00:27:19 And there's companies that have been set up all over the world that just do this, and they do fantastic work. The new Star Wars movie, for instance, was converted to 3D in post, and they already signed agreements to do the remaining two as well, to post-convert them. It's just easier. When they did Hobbit, did you guys see the Hobbit trilogy at all? I saw the movies, but not in 3D.
Starting point is 00:27:44 I saw all three of them in the high frame rate 3D IMAX. Yeah, the high frame rate. So that was a really neat thing because they were shooting, they shot all of them in 5K. Well, they shot the first two in 4K and then they shot the last one in 5K. And at 48 frames per second in stereo. So you take a normal movie, which at the time was 2K, 24 frames per second, and now you've got probably 16 times now the amount of pixels that you have to manipulate for Hobbit. It was really crazy.
Starting point is 00:28:18 Yeah, so that was fun. But we were talking about resolution earlier. Because I remember when I saw my first 8K plates come in, images come in, we had some users. And this was, oh, I don't know, seven or eight years ago. I had some customers emailing and like, we're working on 8K images and it's slow. And I'm like, oh, well, yeah, that would be true. You know, you're lucky you have a graphics card that can do AK textures at the time because we use OpenGL for the display.
Starting point is 00:28:52 Everything gets loaded into a texture. And so, you know, I gave them some of my ideas about how to speed things up, and we used proxies and kind of work on things. And they kind of went away for a while. And then about a couple weeks later, I went to the theater and I saw, I think it was Batman Begins.
Starting point is 00:29:09 It was the one with the Joker. And they had three scenes in that movie that were in IMAX. And I'm like, oh, there you go. Those were the shots they were working on in IMAX. Because we started out at 16K and then down-res everything to 8K and then projected them at 8K. And that was sort of one of those little gotcha moments for me. We're like, oh, that's what they were working on
Starting point is 00:29:29 because it's the first time we've seen that done digitally like that. So I assume at this point you have no problem working with those 8K images. Now that you're used to it, that's a normal thing. Yeah, well, graphics cards over that time have just gotten so much better. Oh, okay. And, yeah, I mean, really there's a bandwidth issue, but, I mean, computers and, you know, hard drive speeds and network speeds and the graphic cards and the bus speeds,
Starting point is 00:29:57 they're all catching up or they're all going in lockstep with this, with this increase in resolution. And that's helped a lot. A lot of times we're slightly on the bleeding edge of what people want to do. And so we still have to think about performance and memory management because it's not always there, you know. And there's some fairly simple tricks and there's some more complicated tricks for dealing with it. But, I mean, with something like Silhouette, we brute force a lot of it
Starting point is 00:30:27 and just rely on having a lot of memory. Memory is so cheap, you know, just add another 32 gigs on or whatever. And they all do that. You know, it's not a big deal. And the graphics cards, you don't even really need a really good graphics card for what we're doing because it's just really more about the texture sizes and stuff. So you don't need a fancy Quadro card with 8 gigs or anything like it
Starting point is 00:30:49 in there. Gaming cards, people use those a lot. Hardware-wise, we're pretty good. We're not doing 3D rendering. That's the thing. Most of the performance problems are there where you've got render farms with thousands of cores the performance problems are there where you've got, you know, like, um,
Starting point is 00:31:06 a render farms with thousands of cores and that still take three days a frame, you know, to render these movies, just nuts. Um, let's see if we can get a little more technical. I know you mentioned open GL a minute ago.
Starting point is 00:31:19 Can you talk about what platforms silhouette runs on? Yeah. So, um, well we have to, the effects industry is kind of, it's mostly running on Linux. There's a lot of the bigger studios are running Linux. They've gotten their own file systems.
Starting point is 00:31:35 They have their own kernels. I mean, these guys go really nuts about having a stable platform that they can control every single, you know, aspect of. So we have to support Linux. But we also run on Windows and Mac. There's a pretty good number of studios running just Macs.
Starting point is 00:31:54 And then there's a lot of the big studios in India and China that are running Windows. So we're completely cross-platform. We use the Qt framework for the UI, which has been fantastic. Back when we first started writing Silhouette in about 2003, 2004, Qt was still kind of an unknown.
Starting point is 00:32:17 I really wasn't that aware of it. And so I actually started out in writing my own sort of UI abstraction toolkit, and then I discovered Qt, and I'm like, oh, this is so much better. So I saved a lot of time doing that. Yeah, did that answer your question there? Yeah. So in your bio, you mention OpenCL. Are you using OpenCL for processing of these images?
Starting point is 00:32:42 So we're not using OpenCL and Silhouette. Okay. We have another, the parent company of Silhouette is called Digital Film Tools, and we do, it's the same people. We work in different industries. But we do image processing software and plug-ins for editing systems and Photoshop. We do simulations of optical glass filters, like Tiffin ProMist filters and various grads and things like that for photographers, you know,
Starting point is 00:33:13 that are kind of heavily used by photographers. And so we've had a product for quite a while called DFX, which is a whole suite of like a couple thousand of these filters. And, um, I, in, in over the last two years, I've, I've ported all of that code, which used to be a, um, uh, template, very heavily templatized eight bit, 16 bit, um, CPU based code all over to open CL, um, and, uh, floating point, um, lambda functions in C++ as a software fallback with everything rendering in full float. And that was a really fun project, kind of taking that, because that code base was probably about 15 years old,
Starting point is 00:34:01 going back the furthest and getting a chance to sort of re-architect it and rethink it in sort of modern terms. And right around when I started working on it, C++ 11 came out, and I just kind of fell in love with the lambdas immediately and the new threading stuff, and it was just perfect timing, and that kind of facilitated being able to rethink about your image processing, you know, operating on pixels in a completely different space like OpenCL, where you need to think about it at the pixel level. You know, every pixel is its own little compute thing, you know, engine in OpenCL. So that was a lot of fun.
Starting point is 00:34:42 And then I actually got to the point where I was able to write a filter or rewrite a filter in OpenCL or in C++ and literally take the same piece of code and just move it between OpenCL and C++. Because the semantics are so similar, and I had class names in C++ that matched the OpenCL versions, and it just worked. It made it really nice. So is that code also multi-platform?
Starting point is 00:35:09 Yes, it's 100% pure C++ for the low-level image processing code. And OpenCL, of course, which is more or less portable. Yeah, that's what I was going to ask. Have you had any portability issues with OpenCL? Because I've never used that. Yeah, I mean, you're kind of limited. You know, you have to run on, there's a CPU driver on Intel and AMD. And then otherwise, if you're on NVIDIA or AMD card,
Starting point is 00:35:35 you can run on the GPU, as well as on Intel systems. We tend to, you know, kind of, we kind of target the NVIDIA and the AMD, AMD boards, um, cause they're a lot faster than Intel right now. Um, and then if, if we don't find those, we kind of fall back to the C to a,
Starting point is 00:35:54 to the C plus plus version, which is uses open MP for parallelism and also some of the new, uh, standard thread stuff in C plus plus 11 for parallelism. Um, but that's all extremely lightweight, straight-up C++ code. And all that stuff is really
Starting point is 00:36:11 heavily decoupled from the user interface because I didn't want the cute stuff to kind of leak into it. Because we never know where that code's going to end up. We built an iPhone application called PhotoFX, which literally uses the same C++ code from DFX in it,
Starting point is 00:36:32 because it uses Objective C++. I don't know if you guys use that to kind of bind Cocoa and C++. Yeah, I've done a little bit of Objective C++. Yeah. I have not. It's such a nice just seamless way of integrating C++ and Cocoa together.
Starting point is 00:36:52 And it just worked and it's really fast. It's a fairly old product now. It's not using the new OpenCL. It doesn't run on iOS yet. So it's still all on the CPU on the iPhone and the iPad, but it's really fast for what it does.
Starting point is 00:37:09 But, you know, having that pure C++ code made it easy just to get it running on there. You've mentioned threads and you've mentioned your love of lambdas. Are you just throwing
Starting point is 00:37:23 lambdas to std async and getting back futures and letting all the magic happen or what no i'm not i haven't gotten that advanced um i i try to keep it a little bit more lightweight than that i most of my threading for this kind of is i'm doing scanline level threading so if you're working on an image, I kind of just run like a... It's basically a fancy parallel four. And I noticed on Windows or on Visual Studio that OpenMP
Starting point is 00:37:53 was still a little bit slightly faster, so I wrote a wrapper for parallel four that basically falls back to an OpenMP pragma on Windows. But as of Xcode 6, they clang drop support for OpenMP pragma on Windows. But as of Xcode 6, they clang drop support for OpenMP, or Apple hasn't brought it back in yet,
Starting point is 00:38:10 so I was forced to kind of dump all my OpenMP pragmas and go to a parallel 4 kind of thing. But I kept the OpenMP stuff on Windows. But I did it in a way where it's just done with a helper function, you know, that doesn't, so it doesn't look like a, you know, there's no if-defs and things like that in there. My model for the new,
Starting point is 00:38:32 for when I did this big rewrite, was to make everything super, super clean and get rid of all the old crap and gunk that I had come up with 15 years ago. You know, how co-bases can get after a while. I wanted to interrupt this discussion for just a moment to bring you a word from our sponsors. You have an extensive test suite, right?
Starting point is 00:38:51 You're using TDD, continuous integration and other best practices, but do all your tests pass all the time? Getting to the bottom of intermittent and obscure test failures is crucial. If you want to get the full value from these practices and And Undo Software's live recorder technology allows you to easily fix the bugs that don't otherwise get fixed. Capture a recording of failing tests in your test suites and debug them offline so that you can collaborate with your development teams and customers. Get your software out of development and into production much more quickly, and be confident that it is of higher quality visit undo-software.com to see how they can help you find out exactly what your software really did as opposed to what you
Starting point is 00:39:31 expected it to do and fix your bugs in minutes not weeks so do you want to tell us a little bit about um you mentioned when we were talking over email before the show that uh you were embedding python into silhouette do you want to tell us a little bit about how that works oh yeah well You mentioned when we were talking over email before the show that you were embedding Python into Silhouette. Do you want to tell us a little bit about how that works? Oh, yeah. Well, first of all, I love Python. I kind of discovered it back in about 1996 when I was working on Silicon Graphics machines.
Starting point is 00:39:59 They actually added a Python interpreter to their debugger, SGI did, and that was their way of letting you write your own data formatters. And I thought that was amazing, and they showed examples of this language Python. So I've kind of had a love affair with Python ever since then, and any big enough application that offers any amount of extensibility that I write, I put a Python interpreter into it.
Starting point is 00:40:22 So Silhouette, the number one thing for Silhouette, it's used by these visual effects artists, and they love to customize everything. They want to remap every single button. They want to have a button do five things. They want to make buttons do things on Thursdays that don't happen on Tuesdays.
Starting point is 00:40:39 These guys want total control. Back in Silhouette version 1, back in 2004, I'm like, well, we need to have a way of binding keys. And I thought, let's just put a Python interpreter in here and make a keybinds file, and then let people, they can just go in there and just edit this file, and it's Python code, and it does all the bindings,
Starting point is 00:41:01 and you can bind a keypress to a function that calls 10 other functions or whatever they want to do. And so that was sort of the impetus for doing that originally. And then eventually I started building parts of Silhouette in Python and offering hooks and things like that. So if you do this thing in the user interface, it calls out to Python, and users can register hooks to do things. And they get pretty advanced things,
Starting point is 00:41:27 like if they start a render, and then for every frame that gets rendered, it can call a Python script that they've added in, which then can then take that frame after it's rendered and go put it into their asset management system. Or they can fire off a background render on a server, or they can do all these other things. And we've even had some studios set up complete environments where they have these shot management
Starting point is 00:41:48 systems where an artist might be sitting there and a shot might come in and he's got a little console and it's like, hey, you're supposed to do the shot. And he hits a button and it literally sets up the entire environment for them, creates a silhouette project, brings the media in, sets everything up where they can just start working and they don't have to go through the process of file new everything up where they can just start working. And they don't have to go through the process of, you know, file new, import media. You know, it just does it all for them.
Starting point is 00:42:13 And so having these open accessibility layers to it is really beneficial. And that's all done with Python. Kind of sounds like you have a fair bit of your GUI accessible to Python. Yes, well, so I'm working on the newest version now, and that's one of the goals there. We have a lot of people, you know, PySide, they want to actually add their own docs
Starting point is 00:42:34 and their own dialogues and start doing... And so now we're able to add that functionality on there. In the most recent version, in the previous version, I added a... We always had a little Python console, so the Silhouette console that comes up, right, you know, logs everything, and you can hit enter in there and get a Python prompt, and it's just start typing Python code in there.
Starting point is 00:42:56 And then I added a script editor, which is a doc that was just always there, and you could have a Python script and then execute it by hitting a button. So then I realized, oh, I can literally use my script editor to build new features for silhouette and then when they when i get them working i can just take them put them into a file and then have it you know create a menu item or something and so a lot of the functionality now is completely driven by python and the best part about that is it's wonderful documentation. I don't have to document
Starting point is 00:43:26 the API so much as be like, here's the rendering loop. If you need to go make a custom renderer, just go see how this works. And fortunately, in this world, these guys are used to that. They've got whole teams of people that can just, if they have
Starting point is 00:43:41 an idea, they can just start looking through your scripts and figure it out. It's great. So it's really kind of a joy. Every now and then we'll get a customer emailing like, I need to get access to this in Python. I'm like, oh, that's great. And I wire that up and send them something.
Starting point is 00:43:59 I love letting users be able to extend my application in ways I never thought of. It's awesome. Do you use any tools to help with that process of wiring new features up? So, yeah, I was wondering if you were going to ask that, because you mentioned Swig earlier. Yeah. And I'm...
Starting point is 00:44:17 I don't know, maybe this is a flaw or what, but I'm kind of an old-school programmer. I don't like magic happening. Like, I never liked interface builders. I don't like magic happening. I never liked interface builders. I don't like code generators. I like to know exactly what's going on. So I write all my Python bindings by hand. And I looked at Boost Python,
Starting point is 00:44:39 and it just seemed so big. I'm a big fan of lightweight. And so I wrote a couple of little simple wrapper objects that made it easier. And if I need to expose a new function, I just go pull up the appropriate file and implement the function and put it into the function map. And then I'm done. It's something that I've just gotten used to. And I'm sure a lot of people would be like, oh, you're an idiot for doing that. But it's a habit I've gotten into.
Starting point is 00:45:07 And it kind of lets me rethink. One of my philosophies is I don't necessarily want to expose exactly my internals to my users. I want to give them a nice, clean interface to what's going on under the covers. And so when I expose some of the object model to Python, I try to rethink it in terms of how, if I were a user writing something, how I'd want it to work, and not necessarily how it's implemented. And so I kind of rethink the interface,
Starting point is 00:45:35 and then it gives me the option of actually re-implementing that interface in a completely different way that may be a lot cleaner or more extensible or whatever. And I think with a binding generator, it would end up just making a lot of these bindings that I wouldn't necessarily want.
Starting point is 00:45:54 Yeah, it goes either way. Sometimes you end up writing your C++ code so that it generates the bindings you want. And that's not necessarily what you want to do either. Yeah, yeah. I guess that's one of the kind of cool things that Boost Python can do, right? You can kind of annotate your code at the C++ level, and then it can just make a dynamic binding for you.
Starting point is 00:46:17 And I think that's pretty cool. It's one of those things where the learning curve for it compared to what i had time to implement and since i already knew how to make python bindings i just i felt more comfortable with it it's always the case you know when we're a very small shop and it's every single decision we make it's about how long is this going to take to implement and sometimes it's like well i can get something working you know that isn't the most elegant way tomorrow. Or I can go research this thing and take a couple weeks and then maybe it'll work.
Starting point is 00:46:47 And a lot of times we just do the, well, let's just get it working. So correct me if I'm wrong, but the Python interpreter has a global static state, right? You do not pass around a state object when you make interpreter calls. Yeah, that's right. It's literally starting up the interpreter
Starting point is 00:47:09 with PyInitialize, and then it blocks. We don't have any threading being done in Python. It's almost all of it's triggered from the UI anyway. It's all in the main thread. And so that really hasn't been an issue. Yeah, that's what I was wondering. Threading, or if you had any other problems.
Starting point is 00:47:25 Yeah, I mean, that may be potentially a problem if we ever decide to do a custom node that can do rendering with Python. I was in a company back in the early 2000s called Profound Effects, and we did a plug-in for Adobe After Effects, which was called Useful Things, and it was a plug-in that was literally a C++-based graphics renderer. Scanline renderer could do text and fonts and 3D and all this stuff and particle systems. It was all written in C++.
Starting point is 00:47:53 But everything else was written in Python, including all the effects. And you could add effects to it at runtime. And, of course, we couldn't multi-thread because it was not really thread safe. And so we had to lock on every call. And that's just fortunately one of those issues that I haven't had to really deal with. And nowadays, everything is so multi-threaded, like all these effects systems.
Starting point is 00:48:18 And they're very thread happy. And some of them will render multiple frames at a time. Some of them will render a frames at a time. Some of them will render a single frame in chunks simultaneously. I break things down more granularly in doing scanline-based threading when I can.
Starting point is 00:48:35 I like the fact that I'm working on one frame at once. It's easier to keep in your head that way. But yeah, if you threw something like Python in there and had to do rendering at the time it would be kind of a pain to deal with so uh you run in the gui thread have you had to deal with the issue of you know one of your users like accidentally putting a infinite loop in your python script you know like um if if if one of them did it they had they didn't tell me i'm okay i'm sure
Starting point is 00:49:06 i'm sure that's probably a possibility that and i'll say it again the one nice things about our our users in the visual effects post-production world is they're really smart and they don't like they're not it's kind of interesting like Like, you know, we make iPhone software, and we make visual – you know, this high-end, you know, roto-paint stuff. And it's the iPhone software that's a dollar where we get all the people that have a problem, right? Right. And it's, you know, and we have these other guys who are like, you know, they buy this expensive software, and then you give them the Python reference guide and the SDK header, which is a bunch of headers with no example code. And you're like, have fun.
Starting point is 00:49:47 And they never talk to you again. And then later on, you get this great email that's like, well, we built this entire rendering asset management system on top of Silhouette, and could you add this one thing? And they send me screenshots. I'm like, oh my gosh, that's amazing. So it really is
Starting point is 00:50:03 a great, you know, I feel fortunate that we haven't had a whole lot of, of, of, of those kinds of complaints. That's awesome. Very cool. Um, do you want to mention the user group that you, uh, you attend or host? Oh yeah. So it's funny cause I, I, I i work at home i've been working at home since uh 2000 year 2000 and um i one day back in 2002 or three i had some kind of you know algorithm problem and i did i literally did not have anybody to go stand at the water cooler with or to call up and uh because i was the only programmer. And I'm like, you know what? I'm going to start a meetup group here so I can get together
Starting point is 00:50:49 and have a group of people that I can bounce ideas off of or if I have a problem, I can maybe meet for coffee and hash it out. So I started the, it's called the Madison Area Software Developers Meetup Group. And we started with about three members back in 2002, 2003, and now we've got about 1,100 members. Wow. And we meet once a month, and it's a very diverse group of developers.
Starting point is 00:51:14 There's a lot of web developers. There's a lot of embedded people. There's a lot of, like,.NET and C Sharp. I mean, people are just doing all sorts of stuff. It's kind of really hard to organize and find topics you know that it kind of applies to everybody there's only so much you can talk about you know get or
Starting point is 00:51:31 GitHub or you know source control or you know stuff like that but we do I mean it's also kind of a large social group too so there's a great number of also self-employed or people working remotely or just people that work at home and
Starting point is 00:51:47 kind of get together and talk shop. So I've actually met some pretty good C++ programmers here in Madison because of it. And so, I don't know, about 12 years after my first problem, I now have a group of peers that I can email
Starting point is 00:52:03 in the middle of the day and like, hey, want to get some coffee and talk over this hashing problem or whatever. So I kind of got finally what I wanted out of it. But now we've got a bunch of people in the meetup that want to even go further with it and start doing big panels and big events throughout the year. And so we've been working on that. Yeah, it's been pretty on that. Um, yeah, it's been, it's been pretty rewarding.
Starting point is 00:52:26 I've, I've met my, some of my, my best friends through this meetup now. Very cool. Very cool. Well, Paul,
Starting point is 00:52:34 I think we got a lot of information out of you. Is there anything else you wanted to leave us with before we let you go? Oh boy. That's a good question. I can't, I can't think of anything. I, I feel like I've just been rambling for the last hour and I hope, I hope that's a good question. I can't think of anything. I feel like I've just been rambling for the last hour,
Starting point is 00:52:46 and I hope that's not the case. Where can people find you online? Well, I don't really blog anything. You can find me on Twitter at fxtech underscore Paul. And right now that's pretty much it. Okay. And people can find Silhouette online if they're interested in that? Yep, silhouettefx.com.
Starting point is 00:53:08 And if they're interested in digital photography software, we have digitalfilmtools.com. Okay, very cool. Thank you so much for your time today, Paul. Thank you very much, guys. It was a pleasure. Thanks for joining us. Thanks so much for listening as we chat about C++. I'd love to hear what you think of the podcast.
Starting point is 00:53:24 Please let me know if we're discussing the stuff you're interested in or if you have a suggestion for a topic. Thanks so much for listening as we chat about C++. I'd love to hear what you think of the podcast. Please let me know if we're discussing the stuff you're interested in, or if you have a suggestion for a topic, I'd love to hear that also. You can email all your thoughts to feedback at cppcast.com. I'd also appreciate if you can follow CppCast on Twitter, and like CppCast on Facebook. And of course, you can find all that info and the show notes on the podcast website at cppcast.com. Theme music for this episode is provided by podcastthemes.com.
