The Offset Podcast - The Offset Podcast EP035: 10(ish) DaVinci Resolve Feature Requests

Episode Date: July 1, 2025

It’s widely known that the Blackmagic Design development team for Resolve is one of the best in the business, coming up with great new features and squashing bugs quickly. But that doesn’t mean there isn’t room for improvement! In this episode we take a list of 50+ feature requests and cull it down to 10 (ish), touching on things like groups, potential interface improvements, and what AI could (should) be doing in Resolve. Of course, any time a list of 50+ things is pared down considerably, there are good feature requests that get left behind, so we might have to do a part 2 on this later in the year.

Some of the feature requests/improvements we take a look at in this episode include:

- A complete revision of groups in Resolve
- More AI-driven utility tools: auto conform, dead pixel analysis, auto grouping, etc.
- UI improvements: UI to dock/save presets, settings, DCTLs, favorite effects, momentary full-screen panels
- Tracking & stabilization workflow improvements, including axis weighting, predictive tracking, and depth-based tracking
- Gestural support for pen and trackpad users
- Color management for saved/exported stills, plus contact sheet/webpage export
- Closed caption and Dolby Vision/Atmos support for DCPs; DCP validation
- Info palette improvements: shot stats, corrections list, more items available for smart filtering
- How to make good feature requests and bug reports

Be sure to visit offsetpodcast.com to check out our growing library of shows and to submit an idea for a future episode. If you liked this or other episodes, please consider buying us a cup of coffee to support the show: coff.ee/theoffsetpodcast

Thanks as always to our amazing sponsor @flandersscientific for the support!

Transcript
Starting point is 00:00:00 Hey there, welcome back to another episode of the Offset Podcast, and today we're talking about 10, probably a few more, to be honest, features that we'd love to see in DaVinci Resolve. Stay tuned. This podcast is sponsored by Flanders Scientific, leaders in color accurate display solutions for professional video. Whether you're a colorist, an editor, a DIT, or a broadcast engineer, Flanders Scientific has a professional display solution to meet your needs. Learn more at flandersscientific.com. Hey, everybody. Welcome back to another episode of The Offset Podcast. I'm one of your hosts, Robbie Carman.
Starting point is 00:00:41 With me as always is Joey D'Anna. Hey, Joey, how are you, man? Hey, Joey, so as we do, we try to think about some ideas and come up with some outlines and think about things. And it just so happened that this week I was whinging about something in Resolve. And you're like, oh, that's a good feature request. And it got me to thinking, you know, we've never done, in the 30-whatever episodes we've done now, we've never really focused in on a features episode.
Starting point is 00:01:10 For those of our audience who don't know this, one of Joey's favorite things to do annually is go out to NAB where he is a demo artist on the DaVinci Resolve booth. And he's hard to miss and harder to not hear, because he's in the middle of the booth with a big crowd around him talking about new features. And, you know, I thought, Joey, that with that experience, plus our discussion earlier this week about something I'd like to see, we should just do an episode on this. So how about it? Let's dive in and talk about some of the features that we want to see. Now, to be clear, this is not a complete list. We did try to kind of consolidate things down into some general categories of things. And of course, if you guys have some feedback or some comments, of course, tell Blackmagic about it. But also, you can jump over to theoffsetpodcast.com and we have a submission form. And tell us about things you'd like to see. Maybe, who knows, maybe we can turn it into a whole episode or a whole show idea.
Starting point is 00:02:05 Regardless, we always like to get suggestions from our audience about show ideas. So, first thing I'm going to say, Joey, before I talk about a new feature, is I just want to say that it is incredibly difficult and complex to make software these days, right? Yes. Especially a piece of software that has grown as big
Starting point is 00:02:26 and as wide as DaVinci Resolve has. And I just think before we dive into some of these feature requests that we'd like to see, I'd like to personally just give a big thank you to the development team at Blackmagic. I think that they do really, honestly, a tremendous job. I mean, they're on this yearly cycle of, you know, major new features, squashing bugs, et cetera. So if anything we say today sounds critical, and perhaps a few of the things are, in general the sentiment is: holy cow, these people do amazing work, they should be praised, and software development is really, really hard. Yeah, I can't emphasize that enough. I've worked a lot with the DaVinci Resolve development
Starting point is 00:03:09 team at NAB and other conferences and events I've been to and presented at. And I'm not exaggerating when I say that they're the best software development team in the business. And they really do care what their users think and what their users need, but they also have to weigh those requests with, you know, how is a feature going to fit in the context of this massive software, where every page and every button has little legs that reach everywhere across a whole project. So it's a huge balancing act for them, and for them to be able to consistently come out with, you know, new features that work really well, that are really fast and performant, and that just let us do our jobs more easily every time they do
Starting point is 00:03:56 a new release without really breaking the software, right? Even the betas these days are relatively stable, more stable than most betas should be. You know, so yeah, none of what we're saying here is officially endorsed. None of what we're saying here is anything but speculation about what we feel would be good for our workflows. And none of it's meant to be critical of the development team. And it's always just a good time to revisit the idea of, hey, if you're going to submit bugs or feature requests, it's a good idea to do those well. You know, we talked about this
Starting point is 00:04:33 on other episodes, right? Just being like, hey, this is broken, fix it. That's not a good bug report. Or, you know, a feature request that only affects you and your buddy is not a feature request worth submitting, because it doesn't affect the larger whole wide world, even if it seems important to you. Or it might break someone else's workflow. Right. It might seem super, super important to you. So I think as you go along in thinking about your own bug reports, your own feature requests, you always have to kind of put it through the lens of, okay, one, is this something where I can articulate and demonstratively show the problem or what I'm trying to do? And then two, when it comes to feature requests, think about, is this something
Starting point is 00:05:15 that I could get a whole bunch of people to endorse and like and need rather than just your particular niche case? But we've talked about that before. We don't need to go on about that. I want to start with my number one thing that's going to be no surprise to you because I gripe about this all the time. To be fair, I'm not the only one. I know. To be fair, I am a very heavy group user.
Starting point is 00:05:39 I love groups, whether it's a narrative, whether it's a, you know, a documentary with interviews or whatever. I use groups all the time, love them. Your mileage may vary. A lot of people approach it in a different way, but I love them. And groups, honestly, haven't really changed in forever, if ever, since they were introduced. I would love to see a complete redo of the group architecture. I'll just give you the highlights of what I would like to see.
Starting point is 00:06:09 That would be drag and drop ability into a group from the light box instead of having to select and right click, because how many times have you been going through the light box selecting and then you accidentally click somewhere off of a shot and it deselects all the shots? So drag and drop would be great. I would love the ability to rename groups right in the light box, you know, just by clicking on the name rather than having to do this right click. Because I name groups. And I know a lot of people name them like group one, two. I'm like: guy with the red shirt, woman with the blonde hair, whatever it may be. I'd like to be able to rename. I also, this one, I admit, is technically more difficult and complex. As you
Starting point is 00:06:48 described earlier, it reaches through more things. I'd love the ability for nested groups somehow. I'm not exactly sure how that architecture would work. I think they've already sort of started thinking along these lines with, like, the layer stacks and stuff like that. So it'd be interesting to see where they go. And then the other one, this is also very technical, I'm not sure if it would work, but I would love to be able to do node graph swapping depending on which part of the group
Starting point is 00:07:18 you're on. And what I mean by that is that sometimes I'm like, oh, crap, I'm on pre-clip where I wanted this particular thing on post-clip. User error, totally, right? I can redo it. I could save a still. I could reapply it, all that kind of stuff. I would just like to be able to take any node graph and just tag which part of the
Starting point is 00:07:38 group it is. So it's customizable. If I create a group and I'm like, oh, you know what? I want this now. I didn't originally have groups. I now want this to be my pre-clip. I just want to be able to go, yep, now this node graph is pre-clip, and it moves it over there. I know that's a little, again, a little technically challenging, but just in general: ground-up redo of groups, please.
Starting point is 00:07:58 Yeah, I'm imagining in the light box where you see all of your groups, one, a tree, almost like a bin structure, right? You can have subfolders of bins, so you could have one group that has a node tree pre and post for all of the subsequent groups, and to be able to drag and drop and reorder to do a custom order, and to be able to sort by name, sort by name inverted, or whatever. Just basically anything you can do in a bin, I'd like you to be able to do in the light box with shots and with groups. But I'm going to take this to our next feature request, because it's one thing that we've seen, honestly, Blackmagic do better than anybody else.
Starting point is 00:08:44 Yeah, I agree with that. Utility-based AI features. Local, and mostly local, to be honest. Exactly. You know, locally hosted models that are not doing generative stuff, they're doing utility-like stuff. Like the new kind of premixer thing, the music editor, stuff like that. Yep. Big feature request, top line: I want to see more and more utility AI stuff. What that has to do with groups: I'd like to see an AI thing that will automatically look at similar shots and come up with some group candidates for them. Yeah, and I can see that working with almost like a threshold, kind of like the old school scene detector used to work, right? You know, it's going to go through and go,
Starting point is 00:09:30 I think these are all the same shot. What would you like to do? Just adjust a little threshold slider of some sort to include or exclude. Yeah, that would work great. But there's a ton of other utility AI features that I think both of us have talked about that we would love to see. And most of them are related to boring stuff, like conforms. Tedious stuff.
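As an aside, the threshold idea described here is easy to picture in code. The sketch below is purely illustrative Python, nothing Resolve actually exposes; the per-shot "signature" vectors stand in for whatever features a real model would extract (color, faces, framing), and all names are made up:

```python
# Illustrative only: cluster shots into group candidates by comparing
# hypothetical per-shot feature vectors against a user-set threshold.

def group_shots(signatures, threshold):
    """signatures: list of (shot_name, feature_vector) tuples.
    Returns a list of group candidates, each a list of shot names."""
    groups = []  # each entry: (representative_vector, [shot_names])
    for name, vec in signatures:
        placed = False
        for rep, members in groups:
            # Euclidean distance between this shot and the group's first shot
            dist = sum((a - b) ** 2 for a, b in zip(vec, rep)) ** 0.5
            if dist < threshold:
                members.append(name)
                placed = True
                break
        if not placed:
            groups.append((vec, [name]))
    return [members for _, members in groups]

# A lower threshold splits shots apart; a higher one merges them,
# exactly the "little threshold slider" idea.
shots = [
    ("int_kitchen_01", (0.80, 0.20, 0.10)),
    ("int_kitchen_02", (0.82, 0.19, 0.12)),
    ("ext_street_01",  (0.10, 0.40, 0.90)),
]
print(group_shots(shots, threshold=0.1))
# [['int_kitchen_01', 'int_kitchen_02'], ['ext_street_01']]
```

The point is that one slider (the threshold) could be the entire user interface, just like the old scene detector.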
Starting point is 00:09:50 Yeah, sizing. I would love to be able to say: here's my timeline, here's a reference import. Match all the sizing. Or even on a shot by shot basis: match these two shots' sizing. Or look at a reference and look at a pile of raw rushes and figure it out. And put them together. Just figure it out. That is such a... I mean, I'm not saying it would be easy to implement or write, but it's low hanging fruit for what the kind of AI technology that's available today would be really, really good at. Well, and I think, so just to be clear, that top line thing for me in the AI category is the auto conform, however that would work.
Starting point is 00:10:30 And I think it's about time, because, you know, competitors are starting to go down that road. I know Baselight's been working pretty hard on that kind of thing. But also, as you said, some of the machine learning models for figuring out features in clips, you know, framing, that kind of stuff. Our buddy Brandon Thomas at TBD Post in Austin has actually been, you know, he's working on this whole conform tools project. You know, and he's kind of said, hey, like, this is an area that I think has legs
Starting point is 00:10:56 because, you know, these models and the tools are accessible. And, you know, I think a big development team could probably bang out those kinds of things faster than, you know, Brandon and his small team. But like, yeah: automatic conform, auto size matching. Speed matching, I think, would be a really complicated thing
Starting point is 00:11:22 in that process, so I'm not quite sure how that would work. I'll have a couple other things I'd add to that, too, in the AI category in general, though, besides the auto-conform: dead pixel and other QC analysis. So, like, the Dead Pixel Fixer, I don't know when you last used it, but I use it quite a bit for just hot pixels that happen, that kind of thing. And it works great, man. It really works great for a lot of things. But what I would love more is, even if it takes a while, it's a background process. You know, it takes 20 minutes to go through a timeline and analyze it. Fine.
Starting point is 00:11:46 I'm used to that anyway with, like, Dolby Vision analysis and that kind of stuff, right? Just to analyze it and then place all of those dead pixel fixers, you know, on those patches right there. Same kind of thing that I think could be done with blanking, right? Analyze this timeline for blanking, and maybe there's a fix it or don't fix it, flag it only, you know. Or, you know, we've seen a lot of, like, scripts that detect blanking by looking at, oh, is this sizing outside of the bounds of the raster? Yeah, that works for some stuff, but what if the blanking's baked into a shot?
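That geometry-only check (does the sizing leave part of the raster uncovered?) is simple enough to sketch. This is hypothetical Python with made-up parameter names, not Resolve's API, and as noted it can't catch blanking that's baked into the pixels:

```python
# Illustrative geometry check: does a scaled/offset clip fully cover the raster?
# Parameter names are invented; pan/tilt are offsets of the clip center from
# the raster center, measured in raster pixels.

def has_blanking(raster_w, raster_h, clip_w, clip_h, zoom=1.0, pan=0.0, tilt=0.0):
    half_w = clip_w * zoom / 2
    half_h = clip_h * zoom / 2
    cx = raster_w / 2 + pan
    cy = raster_h / 2 + tilt
    covers = (cx - half_w <= 0 and cx + half_w >= raster_w and
              cy - half_h <= 0 and cy + half_h >= raster_h)
    return not covers

# A UHD clip filling an HD raster is fine; zoomed out to 45% it isn't.
print(has_blanking(1920, 1080, 3840, 2160, zoom=1.0))   # False
print(has_blanking(1920, 1080, 3840, 2160, zoom=0.45))  # True
```

Catching blanking baked into the shot would instead mean scanning the edge pixels themselves, which is where the slower, background-analysis style of pass comes in.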
Starting point is 00:12:21 Totally. 100%. You know, I think that just in general, the QC side of things is ripe for AI assistance, right? And I mean, I'm sure people have a lot of ideas for that. You could do even things like, you know, title and action safe detection on titles, center crop problems, whatever it may be. Like, there's a lot of ways that the AI
Starting point is 00:12:37 stuff could go. But just in general, I'm stoked that they're focusing more on that kind of stuff. 100%. So the other thing I was thinking about, and I know what you're going to say when I say this, because, I mean, the one thing we can't fault you for is your consistency on this matter. So for those who don't know, Joey is a locked-down UI evangelist, right? He advocates passionately and strongly, for reasons I'm about to tell you in just a second, for AI lockdownness,
Starting point is 00:13:19 right? UI, that is. Not AI. I'm sorry, wrong "I". UI lockdown. Which, to a certain degree, I agree with, though I have some issues with it. My thing is that I think there's a happy medium to be had here.
Starting point is 00:13:32 And let me give you my use case. I, in the past year, recently moved to a dual monitor setup in Resolve, right? Because I was just, you know, single monitor forever. I was just like, let me give this a shot. And I've gotten used to it, honestly, because, again, I've got a big keyframe graph editor, a big media pool window, I can have scopes up, whatever.
Starting point is 00:13:54 I am not advocating floating windows or anything of that nature. I am advocating a system more like the fixed docking kind of system that OmniScope uses, right, where you have regions of the screen to dock things and place things, standard things that you can place in those preset locations. I think floating windows is a bad idea for a lot of reasons. I mean, personally, I've come around to that school of thought. But like, if I want to have, in my dual screen layout, my scopes at the top of
Starting point is 00:14:29 that second window versus the bottom of that second window. I feel like I should be able to just dock them up there and rearrange things within that frame. Does that make sense? So I will... acknowledge that that is a thing that many, many people want. And a lot of people want it to work like Premiere or After Effects works where it's completely dynamic.
Starting point is 00:14:51 acknowledge that that is a thing that many, many people want. And a lot of people want it to work like Premiere or After Effects works, where it's completely dynamic. And I am 100% against that. Anybody who's ever used Premiere knows how quickly and easily you can open up a tool and completely grenade your entire UI. And now your timeline is full screen height and, like, an inch wide and useless. I agree. So any kind of UI customization in Resolve, I think,
Starting point is 00:15:19 needs to have huge amounts of guardrails associated with it. And yeah, okay, I'll give you this. There's a couple times I would like to be able to say, okay, I want this panel here instead of here. But mostly I mitigate that with shortcut keys to turn panels on and off to get more or less real estate. And that's worked really well. The only UI thing that I would like to see, and it's something that has existed in legacy Fusion, it's existed in other systems I've used as well that I won't mention.
Starting point is 00:15:46 You have a DS. It is... I would like, and Premiere and After Effects have it too, a momentary maximize of any given panel. Oh, like the tilde key. I would like to be able to hit a hotkey, make the current panel full screen, ignoring everything else, and then hit that hotkey again and have it go back to where it was. Yeah, yeah, yeah, sure. That doesn't allow us to rework the whole UI and break everything,
Starting point is 00:16:10 but when we need a crap ton of real estate for a single task, boom, one button. That's my one UI customization thing that I would like. I still don't want the panels to be rearrangeable, but I get it. I get it. I just think if they go down that road, it's got to have guardrails to stop you from breaking it like you can. I agree with you. I agree with you.
Starting point is 00:16:32 And the guardrails I'm talking about are a docking system, kind of like the OmniScope thing, where, if you're going to have a window or a palette open, you have to put it somewhere within a preset framework, right? But even OmniScope doesn't have this. It doesn't have minimum sizes. So you can easily make things an aspect ratio or a size that's completely squished and useless, which is the problem you get with Premiere all the time. Yeah, and you're right.
Starting point is 00:16:59 They could put some guardrails on that too. But I will say the one... It's very nuanced. The one related thing, though, that I think I advocated for in the episode that we did about OFX and DCTLs, and this goes, again, contrary to what you were just saying, is that I would love a DCTL favorites palette. What I mean by that is somewhere that, if I use these three DCTLs all the time, right, instead of having to have them in the node tree or have them, you know, in a preset power grade or something like that, I would like to
Starting point is 00:17:37 just be like: nope, these are always active, always usable. The second I grab one of those controls, it's going to apply it to the node I'm on. Like, there's some complication about how it would get applied or whatever, but I would just like them there. And have those be the default, not behind seven different buttons on the panel; have those more forward somehow. I'm going to take this to another level, because I think you're not thinking
Starting point is 00:18:00 fourth-dimensionally. Yeah, I do. I think Resolve needs a complete rework of how it manages presets. And I don't mean like the gallery. I think that's great and works as it is. I want a separate preset management function. When I say preset, I mean basically a bin or a folder that's accessible everywhere in the software that can save any combination of settings.
Starting point is 00:18:29 Is it a stack of 10 edit page effects that you can then drag in and say, this is my base for something, and now you can grab that and put it back in? Is it an OFX plug-in? Is it a DCTL, like you said? But I want to be able to say: okay, take this node, and I've got a DCTL on it, I've got everything set in the controls a certain way. Save just that node. Put it in my preset folder.
Starting point is 00:18:53 Now I can just hit a key. Preset browser comes up as a momentary thing. Grab that node, throw it in my node tree. Hit the key again, preset browser goes away. This is something that was available... basically, DS worked like this and Quantel worked like this, in that you had an omnipresent bin. I'm calling it a bin. It's not really a bin.
Starting point is 00:19:18 It was just a window that you could save unlimited settings to, a setting for anything. It could be just a single DVE. It could be, like, scale at 50%. Just that one aspect. Drag it on. It applies just that. Or it could be a whole set where this node has 10 different things
Starting point is 00:19:39 already preset on it. Anything from one control to infinite controls, saved as a single preset that you can just get to in a project-wide or database-wide library. Yeah, and you should clarify, though. I think that's a little bit different than what you can do now, obviously, with, like, a power grade and saving a
Starting point is 00:19:55 single-node power grade. Like, similar results, I guess. Yeah, and you can kind of fake it with that. I would like something a little bit more deeply integrated in the software across all the pages. I agree. I agree. I guess the subtle difference, I think that's a great idea, but the subtle difference in what I'm asking for is, just like we have, you know, whatever, Magic Mask and the windows palette or whatever, I'm saying I feel like there should be a place for user-definable stuff that I can always have open and ready to go. You know, just like any other tool, if you made some adjustments in that, let's say it's a DCTL and you have it
Starting point is 00:20:41 docked, it would just automatically apply that DCTL to whatever node you're on, just like a window or a key or whatever does, right? So no, I think those are good ideas. The other one, this is a big... oh, God, I drive myself crazy with this. And I'm not sure how to solve this, to be honest with you, because I suspect it's a slightly more complicated thing that I'm complaining about. And that is, I cannot tell you how many times a day I'm like, oh, I'm going to go hit track, right, on my Stream Deck or my panel or whatever. And I'm like, oh, I was on stabilization, not on window.
Starting point is 00:21:15 Reset stabilization, go back over to the window tab. Now track the window, or vice versa, right? So I'm not sure how we could get better at this, but I'm just going to beg and plead: somehow make that switching back and forth a little bit more intuitive, aware. I don't know how to phrase it, right? But like, I've got to solve for that. And that is: just break the keyboard shortcut out into track stabilizer and track power window. That could certainly help.
Starting point is 00:21:48 So if I'm just hitting that track button, it's not just tracking no matter what. And you can keep the existing keyboard shortcut for overall track. So the current behavior could stay, and you could add additional shortcuts that are dedicated to the two different tracking modes. But, you know, another thing that I've heard a lot about and I think could be really, really improved a bit is... now, don't get me wrong. The tracker's amazing. Everything about the tracker is amazing. It's very easy to use. It's very fast.
Starting point is 00:22:19 One thing I would like to see the tracker do. And full disclosure, this was not my idea. Somebody came up to me on the show floor at NAB this year and explained that some other software, I forget what, could do this. And I was like, oh, that actually makes a lot of sense. When a window or a tracking point goes off screen, it should default to just following the inertia of the direction it was moving in, at the speed it was going. Can you imagine how many clicks that would eliminate? It's like a predictive movement? Once a window gets close enough to the threshold of the edge of the screen, just by default, keep going at that rate.
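The inertia behavior being described reduces to simple extrapolation: when the tracker loses the point near the frame edge, keep moving at the last observed velocity instead of parking the window. A toy sketch (hypothetical, and nothing like how Resolve's tracker is actually implemented):

```python
# Continue a track at constant velocity once the real tracker stops.

def extrapolate_track(track, n_frames):
    """track: list of (x, y) positions from the real tracker.
    Returns n_frames of additional positions continuing at the velocity
    observed between the last two tracked frames."""
    (x0, y0), (x1, y1) = track[-2], track[-1]
    vx, vy = x1 - x0, y1 - y0  # per-frame velocity
    return [(x1 + vx * i, y1 + vy * i) for i in range(1, n_frames + 1)]

# A window drifting right at 40 px/frame keeps sliding off-raster
# instead of stopping at the edge, so no manual keyframes are needed.
print(extrapolate_track([(1800, 500), (1840, 502)], 3))
# [(1880, 504), (1920, 506), (1960, 508)]
```

A real version would presumably average velocity over several frames rather than just the last two, but the principle is the same.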
Starting point is 00:23:00 So if it comes back on screen, it's already kind of moving with it? No, if it comes back on screen, you'd still have to go and readdress and keyframe it just like you normally would. Well, how many times have you had a power window where somebody's face goes off screen, it stops at kind of the edge of the screen, and then you manually have to go in and keyframe it to go off the rest of the screen? Totally. I would like to see it just keep that inertia going. And then if it's wrong, we can do keyframes just like we could. Or, I mean, I guess semi-related, right? It happens all the time where you have higher resolution footage than raster, right?
Starting point is 00:23:34 It seems like, you know, you could have an option that says, you know, track source, not track, you know, timeline. Oh, I think it actually already does that. If you're zoomed, like if you got input sizing turned on and you're zoomed past your raster, it will track past that if it has information. Oh, okay. I got you. That already works. Okay, that makes sense. One last tracking feature request, the new stabilizer, fantastic.
Starting point is 00:23:58 Love it. I say new. It was new in like version 14. But we lost one thing that the classic stabilizer had, which was individual axis enable and disable. Oh, right. Yep. I would like the new stabilizer to have an axis mixer. I would like to be able to say, for example, on pan tilt scale, just do your normal stabilizer. Rotate, lock that down.
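An axis mixer like the one Joey is asking for boils down to a per-axis blend between the raw camera motion and the fully smoothed motion. A hypothetical sketch, where the axis names and weight scheme are made up rather than actual Resolve controls:

```python
# Per-axis stabilization mix: weight 0.0 leaves an axis alone,
# weight 1.0 locks it to the fully smoothed path.

def stabilize(raw_frames, smooth_frames, weights):
    out = []
    for raw, smooth in zip(raw_frames, smooth_frames):
        frame = {axis: raw[axis] * (1 - w) + smooth[axis] * w
                 for axis, w in weights.items()}
        out.append(frame)
    return out

raw = [{"pan": 10.0, "tilt": 4.0, "zoom": 1.0, "rotate": 2.0}]
smooth = [{"pan": 0.0, "tilt": 0.0, "zoom": 1.0, "rotate": 0.0}]

# Keep the walking movement (light touch on pan/tilt), lock the horizon.
result = stabilize(raw, smooth, {"pan": 0.2, "tilt": 0.2, "zoom": 0.0, "rotate": 1.0})
print(result)  # [{'pan': 8.0, 'tilt': 3.2, 'zoom': 1.0, 'rotate': 0.0}]
```

That is essentially the render-in-place workaround described a moment later, expressed as one pass with four sliders.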
Starting point is 00:24:27 Lock the horizon. Or do a stronger stabilize on rotate versus pan and tilt, things like that, or do less of a stabilize. Oh, I see. Yeah. On the shot, right? So I've actually gotten around this a little bit in the past. In fact, it just came up on a show I'm working on right now. I had one shot where it was rotating really, really badly, but it was a walking shot.
Starting point is 00:24:49 So there was a lot of movement. So I couldn't just turn the new stabilizer on and say lock camera, because that would get rid of all the movement I liked. So I stabilized it position-only on the new stabilizer, rendered that in place, went to the old stabilizer, then did rotation only, and locked the rotation axis afterwards. A little bit of a workaround. I'd love to see axis-based controls on the stabilizer that I can dig into and customize my stabilization. I like that. And it just reminded me of one thing that is a little pie in the sky, because I think computationally it would be ridiculously slow and probably not worth it. But that would be, with all of the effort
Starting point is 00:25:32 placed into depth maps and Magic Mask, et cetera, it would be cool to have stabilization separated by depth as well, right? So, like, you could apply a little more stabilization to the person or whatever in the foreground,
Starting point is 00:25:48 right, and still have that background a little looser. Because, like, the problem that you run into all the time is that you can get something fairly stabilized, but then you're like, oh, I noticed some weird, you know, warping glitchiness going on in the background or something like that. It'd be cool if you could separate the shot by depth and then stabilize at different levels based on that depth map.
Starting point is 00:26:10 Again, computationally, I think that would be horrendous. There's no way that's going to work in real time. But, like, it might be a cool idea. Yeah. Now, this next one, though, is something that I think you have been asking for since... I mean, the 15 years that I've known you. Joey and I are both big pen users. Wacom, Xencelabs, whatever it is, right? We are on those.
Starting point is 00:26:39 I have a little less mouse hatred than Joey does, but once you get used to the pen, it's fantastic. And I actually just got the new versions of the Intuos Pros. I had been on the Xencelabs ones for a while. Went to the Intuos Pros. I think they're great. But you've said this forever:
Starting point is 00:27:02 we need to have more gesture-based options. And I was thinking about that. Yes, that's true for pen, but that could also be true for laptop users, right? Where, you know, various three-finger pinches, whatever, you could do gestures, you know. And I'm recalling Paintbox, Henry, you know, the good old days of tablets, where, you know, you make a swipe to the right, it plays. You do, like, another gesture, it does something else. Like, whatever it may be, I think for people who are really heavy
Starting point is 00:27:22 You know, you do like a J-Sigma, it does something else. Like, whatever it may be, I think for people who are really heavy, pen users, that ability to do gestures would be huge. Yeah, I can obviously go into a little bit more detail on that because I am the biggest advocate of the pen ever. I don't own a mouse anywhere. I'm looking at my desk here.
Starting point is 00:27:42 There's no mouse on it. My assist station desk, there's no mouse on it. I don't have it. There's not a mouse anywhere in my office. Never has been, never will be. They are useless devices, and all they do is bring pain and heartache. And I hate them. The pen is the best
Starting point is 00:28:00 human-to-computer pointing interface that has been come up with so far. Maybe we'll have Neuralink someday and we won't need the pen. The pen is what we have. And Resolve in general works very well with the pen, but there's inconsistencies. The first thing I would like to see addressed is: I want middle click to drag around.
Starting point is 00:28:22 You know you can middle click and grab your timeline and scroll around with the pen. Middle click the light box. You can middle click bins and do that. I want that everywhere in the software without exception. And also, not just middle click: if you do a middle click and press down on the pen, that'll turn it into an interactive pen-based zoom. That works in Fusion.
Starting point is 00:28:44 It works in the color viewer. For some reason, it doesn't work in the timeline. It doesn't work in a bin. I would love to see that, especially now that, as of Resolve 20, we have these really cool new kind of freeform bin arrangements where you can put clips anywhere in the bin; we're already getting more pen-friendly there. Gestures: I would like to see them work better for numeric entry fields.
Starting point is 00:29:09 And what I mean by that is, if you're in the inspector and you drag a sizing and you get to the edge of the screen, you're done. You've got to pick up your pen and move back. Whereas with a mouse, you can kind of keep holding the button down and moving it, and you can kind of keep dialing it in. The way... it's on my mind because, full disclosure,
Starting point is 00:29:30 He's about to say the way it worked in the DS, the way it worked in the DS, right? This is, yeah, almost. We should have like a drinking game about this. I am a collector of vintage computers, systems, etc. And I just added to my vintage collection this week a big Quantel Pablo Neo.
Starting point is 00:29:49 And insert picture here of the Pablo Neo in Joey's basement, right? The way Pablo worked with a pen with numeric fields is you could just kind of twirl them. Yeah. You just give a little twirl, and that means you can go infinitely right or infinitely left, no matter where your cursor was. I'm not saying they need to do exactly that. I'm just saying I'd like to be able to drag the size number in the Inspector without moving an inch off the screen and then being out of real estate. Yeah, I think in general, though,
Starting point is 00:30:19 I agree about numeric entries. I am constantly frustrated by the numerical entry. Like, even with the pen, just getting into them sometimes, you need a whole bunch of extra clicks to kind of get at it. I think that behavior in general could improve, for sure. All right. Another little pain point, one that I think is on a lot of people's minds, is, right,
Starting point is 00:30:42 in the day and age where we live, where everything is asynchronous and you're posting, you know, videos and stills or whatever for clients, the color management aspect of saved stills, I think, could be an important one too. Right now, there's nothing that happens; they just get exported as, you know, whatever, sRGB or whatever. This in particular is not so much a problem for SDR stuff; there are some ways of working around it if you process them in Photoshop or whatever. HDR stuff, it becomes almost
Starting point is 00:31:10 impossible, right? You can't really show somebody like that. So I think just some sort of option in that export stills dialogue: okay, I'm going to tag the still this way, or I'm going to embed this color space, a la, you know, how Photoshop or other graphic tools like that kind of work. Some sort of implementation like that, I think, would go a long way. Yeah, you know, we've talked about this in every episode on HDR that we've ever done. Apple has done photos in HDR on all their devices, and it's really helped the penetration of HDR across the consumer landscape.
Starting point is 00:31:43 That would be a great thing, to be able to export HDR stills, send them to a client on an iPad, on an iPhone, and have them come up in a pretty color-accurate HDR way. Totally. Yeah, totally. Um, the other thing I would love to see, and this seems, again, a little bit of a weird request, but it's something I think could save a few steps. If you're doing, you know, a look creation thing where you're trying to give people a lot of options: just the ability to very quickly, like, if I select, you know, 12 stills in my gallery, right, I would love to just be able to put those into, like, a contact sheet, or, you know, some sort of, like, grid or something like that that I can export.
Starting point is 00:32:29 You know, as a bonus in that, another feature that's seen in a lot of other tools is, okay, fine, I do a contact sheet, I'm just going to make that an HTML page too, right? I know I'm getting a little out there. But, you know, hey, that way I can just push that HTML to my website, and bam, my gallery for the client to choose some looks is done. I think that kind of thing would be relatively low-hanging fruit,
Starting point is 00:32:50 not super complex: color management, and just a little bit better handling of multiple stills. I also, and this is just a really big bugaboo of mine: the ability to turn off the DRX part of the export, right? If I export 50 stills, I'm exporting 50 stills for a reason other than trying to move them to another system, right? I don't need the DRX.
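Both of those stills asks (the contact-sheet/HTML gallery, and losing the .drx sidecars) can be approximated today with a small post-export script outside of Resolve. Here's a minimal, hypothetical Python sketch; the folder layout, filenames, and the decision to just delete the sidecars are all assumptions about a typical stills export, not anything Resolve actually does:

```python
import tempfile
from pathlib import Path

def clean_and_build_gallery(export_dir, title="Look Options", columns=4):
    """Delete the .drx sidecars written next to exported stills, then write a
    static HTML contact sheet for the remaining images. Returns the HTML."""
    export_dir = Path(export_dir)
    for sidecar in export_dir.glob("*.drx"):
        sidecar.unlink()  # grade data isn't needed for client review
    stills = sorted(p for p in export_dir.iterdir()
                    if p.suffix.lower() in (".jpg", ".png", ".tif"))
    cells = "\n".join(
        f'<figure><img src="{p.name}"><figcaption>{p.stem}</figcaption></figure>'
        for p in stills
    )
    html = f"""<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<title>{title}</title>
<style>
.sheet {{ display: grid; grid-template-columns: repeat({columns}, 1fr); gap: 8px; }}
img {{ width: 100%; }}
</style>
</head>
<body>
<h1>{title}</h1>
<div class="sheet">
{cells}
</div>
</body>
</html>
"""
    (export_dir / "index.html").write_text(html)
    return html

# Demo on a throwaway directory standing in for a stills export folder.
folder = Path(tempfile.mkdtemp())
for name in ("look_01.jpg", "look_01.drx", "look_02.jpg", "look_02.drx"):
    (folder / name).touch()
page = clean_and_build_gallery(folder)
```

Color-managing or tagging the stills themselves, the bigger ask above, would need an imaging library rather than the standard library, so it's left out of the sketch.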
Starting point is 00:33:15 Just don't do that, or give me the option to turn that off. It drives me crazy. All right, moving along. Another thing, and this one's come up a lot recently, right? I think Resolve, for the normal operator, the independent filmmaker, the, you know, small production company, they've changed the game on a lot of levels. But one of the things that's become a lot more streamlined is DCP creation, right? Instead of having to go and, you know, render out a lot of intermediate files or whatever, and get super complicated, and buy another whole set of software,
Starting point is 00:33:48 you can render out DCPs. But a couple problems with that. One, closed caption and open caption support in DCPs is almost becoming a requirement these days from film festivals, for accessibility reasons, et cetera. Closed captions right now? Can't do it. Can't do it in a DCP in Resolve. So I would like to see the ability to simply do DCPs with closed caption support. Also, I don't think Dolby Vision DCPs work in Resolve either.
Starting point is 00:34:19 I still think you have to go out to a third-party tool to do that. So I think just... Although that one's pretty niche. I think Atmos is part of the SMPTE standard. It should be in there. I agree with that one. The other one that would be great, awesome, but I don't think they'll, you know, they might depend on third parties or other vendor-friendly companies to do this, is just some level of, like, file verification on export.
Starting point is 00:34:44 And what I mean by that, it could actually, again, tap the AI part of this, right? It could even be things like glitch detection, artifact detection. Like, I could take a regular old ProRes and run it through, you know, Resolve's AI quality inspector just to make sure. Because how many times are people like,
Starting point is 00:35:05 well, I rendered something and there was a decoding glitch or whatever, right? That could work with normal files or for things like DCP. Okay. I have one more niche one, and then I'm sure you have a couple other ones too, but this one is really, super dorky. But I want to explain in full why I think this is needed.
Starting point is 00:35:25 If I say, and this is just inside baseball, but I'll say to Joey all the time, how many hours did you have in on that? I think it was like two, right? In reality, in the back of my head, I know that Joey has been working on this for seven hours or something like that, right? I would love the ability to have shot stats. And what I mean by that is a bit of a summary: you have been working on this shot for X amount of time, or even information about the shot, right? Instead of having to go hover over every single node, oh crap, there's some compound nodes in this node tree or whatever, I would just be like, here are all of the color correction effects and OFX, et cetera, used on this clip, just in a list, and be able to see it. And I could see, like, we sort of have the Info palette now.
Starting point is 00:36:20 I guess what I'm asking for is that, like, that Info palette just be souped up a little bit with some additional information. Yeah, no, I'm with that. Any kind of added metadata, I think, is useful. But the can of worms that opens up, the next thing I would say is, well, if we have all this information, I want to be able to smart filter with it. Oh, totally. And smart filters have gotten better.
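To make the smart filtering ask concrete: if per-clip stats like node counts, OFX usage, or hours worked were exposed as plain metadata, filtering on them would need very little logic. A pure-Python sketch, where every field name is hypothetical and none of this is a real Resolve API:

```python
def smart_filter(clips, **criteria):
    """Return clips whose metadata matches every criterion. A criterion value
    can be a plain value (tested for equality) or a predicate function."""
    def matches(clip):
        for key, want in criteria.items():
            have = clip.get(key)
            if callable(want):
                if not want(have):
                    return False
            elif have != want:
                return False
        return True
    return [c for c in clips if matches(c)]

# Hypothetical per-clip stats of the kind a souped-up Info palette could expose.
clips = [
    {"name": "A001_C003", "node_count": 12, "uses_ofx": True,  "hours_worked": 7.0},
    {"name": "A001_C007", "node_count": 3,  "uses_ofx": False, "hours_worked": 0.5},
    {"name": "B002_C001", "node_count": 9,  "uses_ofx": True,  "hours_worked": 2.2},
]

# "Show me every clip that uses OFX and has a heavy node tree."
heavy = smart_filter(clips, uses_ofx=True, node_count=lambda n: n >= 9)
```

The real ask in the episode is that Resolve expose stats like these to its existing smart filter UI; the sketch just shows how little the filtering itself needs once the metadata is tagged.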
Starting point is 00:36:47 But there's a lot of things that you can't smart filter on. And so I think that's just, you know, hey, we've got to tag those in the API and open those up for a lot of people to use. That's fine. One other thing I did think of, again along those same lines of information, is, you know, with more and more tools becoming color space aware, right? You know, the default behavior is to operate in your timeline color space, your working space, for those tools.
Starting point is 00:37:17 But sometimes, for various reasons, you might switch; you know, there's that crowd of people who like to work linear, or whatever the case may be, right? It would be great if color space-aware tools advertised a little bit more what they're doing, in terms of, hey, okay, right now I'm in,
Starting point is 00:37:36 this is in 709, or this is in P3, or whatever, something like that, to advertise what it's working in. And my last one, and I'll see if you have any other ones, is I think there's a lot of garbage that could be stripped out of Resolve. Is that too harsh? Right? They've done a good job over the years of deprecating some things, hiding some things,
Starting point is 00:37:58 et cetera. I think the application has got to this point now where it used to be I could tell you, oh, that's on this page of preferences at the bottom, and it's this checkbox, right? Now, a lot of times, I just can't tell you off the top of my head where something is, because it's in this contextual menu, that contextual menu, whatever,
Starting point is 00:38:22 right? I just think that the devs could do a you know legacy cleanup every three, four, five versions. I get the reasons not to take things out. I get it, right? There's always going to be that one person that's like I'm opening up this project from
Starting point is 00:38:38 seven years ago, and I need to maintain compatibility with, you know, DaVinci ACES from 2010 or whatever, right? Like, I get it. But at some point in time, things become bloated. And what I just do not want to have happen in Resolve is I don't want it to become like, you know, the old jokes about the Avid, where everything was a preference, right? And there were, you know, 400 of them. Like, I don't want that.
Starting point is 00:39:01 You need to take away things at the same time. It's tough. I think if you ask the developers, they would all have a list of things they would love to remove from the software as well, but it's a balancing act. And in general, they do a good job of it. You know, the place it's most apparent is in the ResolveFX and the OpenFX and the tools, because all of the legacy versions of every tool
Starting point is 00:39:26 are still in the software, available for when you open an old project. Because you don't want to open up an old project and have your film grain or your noise reduction look magically different, even if it looks better, right? You still need to be able to render out that project and have it match. So you might see a "legacy" on your OFX or something like that. This actually brings up an important point, and I think this is what I'm going to close on. This is a tip for people that don't know that this is how Resolve works.
Starting point is 00:39:59 Every major release of Resolve, you should take all your templates, your node trees, your presets, throw them in the trash, and rebuild them from scratch, because they need to be backwards compatible on all of the effects, for consistency reasons when opening old projects. So what happens if, you know, in one version, they make film grain twice as fast, performance-wise? If you use the same preset from two years ago, you'll never see those performance gains. So rebuild your presets to get the latest version of every effect. Yeah, or like maybe some sort of performance improvements.
Starting point is 00:40:38 Yeah, or maybe some sort of, like, switch on that, you know? So if it's a preset that you load, it's going to load with the version that you used at that time, but there's a little switch to, like, promote it to the latest and greatest or something like that. You know, that could definitely be helpful. Yeah, man, I mean, I think these are all pretty decent. Obviously, we're not exhaustive here; I think we could go on pretty much forever talking about things that we'd like to see, and who knows, maybe they'll implement a few of these.
Starting point is 00:41:12 But yeah, I think these are good. I like all the ones that you had. I think, you know, again, for our viewers out there, if you have some, please add them to the comments wherever you're checking this out, or drop us a line at the Offset Podcast. I do think, again, it would be really, really important to just say that when you have a feature request, okay, or a bug report, and we said this at the top of this episode, you know, just making, like, a demand
Starting point is 00:41:39 is not the way to do it, right? Like, you idiots, you need to make this feature because I want it. Like, that doesn't get things done. Now, I think we've talked about this before, but why don't we just recap and end this episode on: what's the best way to do that, in your opinion? Because I know that you have some pretty good, you know, you've done this a lot for a lot of different companies.
Starting point is 00:41:58 What advice would you give to submit good bug reports and good feature requests? This is something, honestly, you know, I'm kind of proud of myself for. I take pride in this. I have a very good record, both in Resolve and other software, of getting my feature requests implemented by the developers. And that's not by accident. I think part of it is because I do have some software development background, so I know what they're looking for. But when you talk about a bug report, you don't want to just be like,
Starting point is 00:42:31 oh, have you fixed X, Y bug yet, that everybody knows about, and you're stupid because you haven't fixed the bug? No. There are no bugs in Resolve that are just, like, everybody knows this is a bug and they're just not fixing it to be mean, right? If you have a bug report, the first thing you need to do is be able to clearly describe it. And then you need to be able to recreate it, step by step. Click on this, then this, then this, and this happens, when the behavior I expect is actually that. And what you'll find, nine times out of ten, when you go through the process
Starting point is 00:43:11 of making the step by step instructions to recreate a bug, that's when you figure out where your operator error was and you realize it wasn't actually a bug. I've done it a hundred times. 100%. And also... And if you don't have that step by step,
Starting point is 00:43:27 it's not actionable for the developers. Yeah, and I think along those same lines, with a feature request, one of the things I've always admired about people who are kind of straddling that world of product engineering, product design, and, you know, actual development is the ability, okay, if you have a feature request, not only should you be able to describe it in prose, but if it's a
Starting point is 00:43:51 UI-motivated thing or a feature, wireframing it to a certain degree can go a long way. If you're like, hey, look, this is my vision of how this would operate, including that step by step: I would click on this button to open up this palette, and then I would drag from here over to here, right? The more information like that, the better, because then the developers can go, oh, I see where they're going with this. Or, actually, we already have... because one of the problems that always happens, right, is that there's a lot of things in development that have long pipelines.
Starting point is 00:44:21 And what I mean by that is, they might be setting the groundwork for something that is coming in two or three or four versions, but they need to build the framework, and it's going to take some time, right? So, you know, they might see your feature request and go, oh, we didn't think about that. We're already working on something similar, but this is a good suggestion, and thanks for giving us how you think the UI would work or whatever. We can implement that back into the UI with the plans we already have. So the more information like that, the better.
Starting point is 00:44:51 And the last thing I'll say is when you are doing feature requests, you've got to think holistically about other workflows. The developers of any software don't want to hear, I want this one particular thing that makes my particular file do this, right? Because that's putting a lot of development effort into something for one use case. So you need to be able to kind of think, okay, what could I ask for
Starting point is 00:45:17 that will encompass my use case, but also lots of other use cases and won't break other people's workflows? You got to realize that a lot of people use this software for a lot of different workflows and use it in different ways. So they can't just implement everything that everybody asks. So if you're going to ask for something, think it through, explain the motivation,
Starting point is 00:45:43 explain the how, explain the expectations, and the why. And I can promise you, the devs are listening, and they do respond to good, detailed bug reports and feature requests. And you build up momentum, that kind of thing, right? If they know that, oh, this person is always consistently giving us good, actionable bug reports, good, actionable feature requests, there's some trust that builds up with that, too. So I agree.
Starting point is 00:46:16 All right, well, Joey, I think this is great. I mean, again, we could have gone on for hours here about things that we'd like to see. If you have your own ideas, again, please go on to the Blackmagic website and their excellent forums; there are developers there. I'm sure developers are also, you know, from time to time viewing the various Facebook and Discord groups, whatever. Those are good places, but also just put some thought into it, as we've detailed here, about how those features and stuff would work.
Starting point is 00:46:40 If you have things that you want to add to this conversation, you can always go over to offsetpodcast.com, where we have show notes, and you can comment there. You can find the show on YouTube and all major podcast platforms, like Spotify, Apple Podcasts, etc. So for the Offset Podcast, I'm Robbie Carman. And I'm Joey D'Anna. Thanks for watching.
