
My Post-Production Workflow

Over the past three-ish years of working for myself, I've experimented with a number of permutations of my post-production workflow in an attempt to find the smoothest, most flexible path from dailies to delivery. And, while I've been able to settle on a consistent workflow over the past year or so, I would never describe it as smooth or flexible. Each step has technical and creative frustrations that keep me from being satisfied. Still, it's the best frustrating workflow I've put together so far.

Before we get started, I'd like to make it clear that this post is in no way meant to be a "how-to" guide for others to follow. My intention here is to illustrate the absurdly complex method by which I turn ideas into videos, while also holding on to the faint hope that publicly highlighting these pain-points may lead to potential solutions.

The Players

Currently, the high-level flow of applications I use in post-production looks like this:

  1. DaVinci Resolve v12.5 – Dailies
  2. Avid Media Composer v7.0.3 – Offline
  3. Apple Compressor v3.5.3 – Encodes
  4. Nuke Studio / NukeX 10.0v3 – Picture Conform, Online, VFX
  5. DaVinci Resolve v12.5 – Color Correction
  6. Nuke Studio 10.0v3 or After Effects CC 2015 – Motion Graphics
  7. Final Cut Pro v7.0.3 – Audio Conform
  8. Soundtrack Pro v3.0.1 – Audio Editing and Mixing
  9. Final Cut Pro v7.0.3 – Final Output
  10. Apple Compressor v3.5.3 – Final Encodes

Digital Negative

The beginning of the post-production pipeline is the camera and format selected for the project. When I have a choice, my preferred camera is the Alexa Mini, shooting UHD (3840x2160) ProRes 4444 XQ LogC. At a data rate of 1591 Mb/s, we'll suck up ~716 GB for every hour of footage we shoot.
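That ~716 GB figure is nothing fancier than the data rate multiplied out. A quick sanity check:

    rate_mbps = 1591                             # ProRes 4444 XQ at UHD, in megabits per second
    gb_per_hour = rate_mbps * 3600 / 8 / 1000.0  # megabits -> megabytes -> gigabytes
    print(round(gb_per_hour))                    # ~716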

QuickTime support is more-or-less ubiquitous these days and, while LogC support is somewhat hit-or-miss (more on that later), it's possible to take QuickTime Alexa footage through a post pipeline without ever applying a LUT. And, rather than wasting time extolling the benefits of shooting Log, I'll simply suggest you go read Stu's excellent post over at Prolost full of pragmatic wisdom like:

Log, in its many flavors, is a smart, flexible, and powerful way of storing high dynamic range digital cinema imagery. It’s closer to raw than you might think, and often much easier to work with for results of the same or better quality.

Dailies

As a sensible, rational filmmaker, I create dailies for my offline edit. I do not edit with native camera negative files. While it is fast to AMA-import native files into Avid and just start editing, the time gained is quickly lost, many times over, by the slowdown caused by the massive file sizes of most modern cameras.

When I sit down at my desk after a shoot, I have 2 identical hard drives containing my camera negatives. One drive is transferred to a Drobo for archival, and the other drive is dumped into DaVinci Resolve.

In Resolve, all clips are dropped onto a timeline and, in the case of shooting with Alexa, the built-in Alexa LogC LUT is applied. Jumping straight to the Delivery tab of the application, I load a preset to create 1080p DNxHD 36 MXF files of each clip, careful to maintain the clip's original name.

The resulting MXF files are moved into /Avid MediaFiles/MXF/1 via the Finder. When Avid is launched, the drive is re-indexed and, once in my project, I locate the clips with the Media Tool, and sort them into bins.
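In script form, that shuffle amounts to something like the sketch below; the paths are placeholders, so substitute your own dailies folder and media drive:

    import glob
    import shutil

    # Move each dailies render into the Avid MediaFiles folder so Media
    # Composer re-indexes it on launch.
    for mxf in glob.glob("/path/to/Dailies/*.mxf"):
        shutil.move(mxf, "/Volumes/Media/Avid MediaFiles/MXF/1/")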

Back when MXF support was less widespread and I thought flexibility was more important than hard disk space, I used to create DNxHD 36 QuickTime files as my dailies, rather than MXF files. While that's a perfectly reasonable option with a comparatively quick import time into Avid, it creates 2 copies of my dailies (the MXFs in the Avid MediaFiles folder and the QT files that were imported), and it adds complexity and confusion to conforming and relinking sequences later.

And, before you say "how is it confusing to have just one more copy of your footage?", let me remind you that you may be relinking your sequences next week or next year. The fewer potential hurdles you put in your own way, the fewer curse words you'll shout at your former self. If I had the QT dailies as well as the MXF files, which would I archive? Both? Just one? Which one? Skip the headache and render MXF dailies for direct import into Avid.

One "issue" with Resolve's rendering that's worth mentioning. I don't know what I did to the program to make it angry, but when I click "Render", it throws out my timeline pixel aspect ratio settings and starts rendering with the Cinemascope ratio. I have to immediately cancel the render, delete the files it began to create in the Finder, go back to the Edit page, open my project settings, and set the pixel aspect ratio back to square, where it was before I pressed render, then press Render again, and everything will export correctly. This happens every single time I render anything, regardless of project or clip settings.

Update - DaVinci Resolve 12.5.1

During the writing of this post, Blackmagic Design released DaVinci Resolve 12.5.1 which offers a solution to my pixel aspect ratio woes. On the Delivery page, in the Advanced Settings section of the Video output settings, Resolve now has a control for pixel aspect ratio.

Strangely, for me, the pixel aspect ratio still defaults to Cinemascope, despite the project being set to Square, but changing this new setting to Square prior to rendering will actually render files with a Square pixel aspect ratio. I'm no longer required to cancel the render, delete files in the Finder, change the setting on the Edit page, and render again.

Offline Edit

I currently edit my projects in Avid Media Composer 7.0.3. While I also have the option to edit in Premiere Pro CC, Final Cut Pro 7, and DaVinci Resolve, there's still no better editing tool than Media Composer. It's fast, efficient, and accurate.

In fact, the core editing tools in Media Composer, which have remained largely unchanged for decades, are so good that I find very little incentive to upgrade to the current version of the application. Since the introduction of the Smart Tool and the addition of tabbed bins in version 7, I find most new features are nice-to-haves.

I am occasionally envious of some of the newer features in other NLEs but, when it comes to the task of editing, Media Composer just can't be beat. I honestly wish I liked editing in Premiere. The timeline integration with After Effects is a fantastic feature I'd love to use but, unfortunately, I find Premiere slow and frustrating to use.

Oddly, Final Cut Pro 7 is still an essential part of my post workflow, just not in the editing phase. We'll get to that.

Reviews

As a brief aside, I will mention one tool I use at many different stages of post-production that you may find useful. Oftentimes, while working on a video, it may be necessary to create a QuickTime file of the work-in-progress.

Long ago, I settled on the encoding settings that I considered the appropriate balance of file size and image quality. Those settings were saved as a Compressor Template.

Remember Compressor? That app that comes with Final Cut Studio for making encodes of things? As it turns out, Compressor can be used from the Command Line in Terminal. Which is, frankly, a terrible idea unless you also use something like Hazel to run the commands in the background while you continue to work on more important things.

I have a handful of folders set up on my computer, each with their own Hazel rule for creating a predetermined file type. When I need, for example, a 1080p H264 file, I export a Same As Source QuickTime file from Avid into the 1080 folder, and Hazel does the rest for me. When the encode is finished, Hazel opens a Finder window, showing me the final file.

Here's the gist of what the 1080 Hazel folder rule does:
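To be clear, this is a sketch rather than my actual rule: Hazel hands the matched file to an external script as its first argument, and the setting name, folder paths, and flags here are assumptions based on Compressor 3.5's command-line options, so verify them against your own install.

    #!/usr/bin/env python
    # Hazel passes the matched file to this script as its first argument.
    # The script submits it to Compressor with a saved setting and drops
    # the finished encode in ~/Dropbox/Renders.
    import os
    import subprocess
    import sys

    source = sys.argv[1]
    subprocess.call([
        "/Applications/Compressor.app/Contents/MacOS/Compressor",
        "-batchname", "1080 H264",
        "-jobpath", source,
        "-settingpath", os.path.expanduser(
            "~/Library/Application Support/Compressor/1080p H264.setting"),
        "-destinationpath", os.path.expanduser("~/Dropbox/Renders/"),
    ])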

The 1080 folder containing the Same As Source file from Avid is in my Home directory, and the Renders folder for the finished QuickTime file is in my Dropbox folder, making it quick to generate a shareable link to send to whomever.

I have not automated the deletion of the Same As Source file because I may want to create another encode from it (maybe half-rez) or, if it happens to be the final version of the video, I'll want to save the Same As Source file as part of the final project archive.

I do things this way for a couple of reasons. The number-one reason is that exporting a Same As Source QuickTime file from Avid is an order of magnitude faster than exporting an H264 file from Avid. Probably 2 orders of magnitude. Which only matters when you remember that, when Avid is importing and exporting files, the rest of the application is inaccessible. With Compressor and Hazel I can export my file quickly, let it convert in the background, and get back to work while I wait to be presented with that Finder window.

Another major benefit of making encodes this way is that it works with any source application, not just Avid. Just dump a file into the folder and away it goes.

Conforming

Prep

Here's where things start to get ugly. I need to turn my un-color-corrected rough edit full of slap-comps and minimally-viable audio into a final, polished product.

Before leaving Avid, I need to do a few things to prep the sequence for Conform and Online. Step 1 is, of course, duplicating the locked edit and starting with a fresh copy of the timeline. If I haven't already, I create a 1080p H264 QT "reference" file of the locked edit to match against the conformed timeline in Nuke Studio.

Next, I replace any audio that came in attached to the Dailies with the original WAV files from the Sound Recordist. I do this manually, on a new track, so I can verify that the timecode sync sent from the smart slate is identical to the timecode the camera recorded. It's usually off by a frame or two, so I line the originals up via waveform and double check it by listening to playback. Avid's timecode displays on the Viewer windows make it easy to find the starting timecode of a source clip on the timeline and the corresponding in-point on the WAV file. Since I primarily work on sequences under 2 minutes in length, this process only takes 5 or 10 minutes.

The last thing I need is an AAF of my sequence. It includes all video and audio tracks and is named to match the sequence from which it was created.

Note: if the Avid timeline has an abundance of effects applied to clips, oftentimes they will need to be removed before creating the AAF. The AAF format is capable of transferring some basic effects to Nuke Studio, like Transforms, but if any of the clips are stacked or collapsed into a Submaster, the AAF will not transfer the layers within the effect. You must manually pull out each layer onto the main video tracks in order to have them all included in the AAF.

Picture Conform

The freshly minted AAF file is imported into a new project in Nuke Studio. Project settings match the final output; typically 1080p 23.976.

As expected, the sequence shows up with all clips offline. Annoyingly, none of the audio tracks are imported because Nuke Studio does not support importing audio via AAF. It does support the manual addition of audio tracks and basic audio editing tools (similar to its video editing tools) so I have no idea why audio is excluded. It is maddening and, frankly, unacceptable for an application that costs nearly $10,000.

The reference QuickTime file is imported below the other video tracks, then the sequence is relinked to the original UHD camera negative files, adjusted visually using Nuke Studio's wipe and difference viewer tools for any sync issues that may have occurred, and scaled and/or cropped to fit the appropriate raster and letterbox sizes.

VFX

To save us all a lot of time, I'll leave out the if/then/while loop of steps one has to go through when creating the complex export structures required to send individual shots to NukeX for VFX work and have them return to the timeline layer above their original plate (The Hiero portion of Nuke Studio). A task for which Nuke Studio is specifically designed and a task at which it fails to perform correctly 3 out of 5 times.

The shots that need VFX work are selected and exported into Nuke scripts with 12 frame handles. The work is completed, the shots are rendered as EXR sequences, and the renders are imported back into the timeline on a new video track, either automatically or via Nuke Studio's Build Track tool. Since most projects are shot on Alexa, and not all shots require VFX work, the EXR sequences are rendered with Alexa LogC colorspace so everything in the final timeline uses the same settings.

Rendering the EXR sequences with the Alexa LUT is important for more than just consistency. Alexa LogC footage requires a LUT be applied to linearize the footage for VFX work, and a second LUT to preview the linearized footage in a "monitor-space" environment. Which is to say, even after you've removed the Log gamma curve from the footage, you need to tone-map the footage to bring the white-point down to 1.0.

This 2-step process in Nuke is often a 1-step process in other applications like, say, Resolve. Resolve uses a single 3D LUT to linearize and preview LogC footage so, if your EXR sequences are rendered with a linear gamma (the default for EXR sequences), they will look very, very wrong with the Resolve Alexa LUT applied.
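For concreteness, rendering those LogC EXRs out of Nuke looks something like this in its Python API; the render path and shot name are placeholders:

    import nuke

    # A Write node for a VFX shot headed back to the timeline. Tagging
    # the render as LogC keeps it consistent with the untouched camera
    # clips, instead of the linear default Nuke assumes for EXR.
    write = nuke.createNode("Write")
    write["file"].setValue("/renders/shot_010/shot_010_comp.%04d.exr")
    write["file_type"].setValue("exr")
    write["colorspace"].setValue("AlexaV3LogC")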

It's also worth noting that, while Nuke comes with the "linearizing" LUT for Alexa footage pre-installed, users need to download the viewer LUT from Arri's website and manually install it in their menu.py file with Python scripting.
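For reference, that menu.py registration looks something like the snippet below; the .cube filename is whatever Arri's LUT generator handed you, and the path is wherever you saved it:

    # In ~/.nuke/menu.py. Registers Arri's LogC-to-Rec709 cube as a
    # Viewer Process, adding it to the Node Graph viewer's LUT dropdown.
    import nuke

    nuke.ViewerProcess.register(
        "Alexa LogC -> Rec709",
        nuke.createNode,
        ("Vectorfield",
         "vfield_file /path/to/AlexaV3_K1S1_LogC2Video_Rec709_EE_nuke3d.cube "
         "colorspaceIn AlexaV3LogC"))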

Additionally, since Nuke Studio is the awkward love-child of Hiero and Nuke, it contains two separate viewers: one for the Timeline and one for the Node Graph. So, installing the Alexa viewer LUT for use in the Node Graph viewer does not install the LUT in the Timeline viewer.

The Timeline Viewer gamma controls.

The Node Graph Viewer gamma controls.

Personally, I've only installed the LUT for the Node Graph viewer because:

  1. Compositing is the most important place to maintain proper color management.
  2. I know how to install it in the Node Graph viewer because it's the "Nuke" half of the application and Hiero is more difficult to customize.
  3. It was a pain in the ass to do it once and I don't feel like doing it a second time.

The unfortunate result is that my footage looks different in the Node Graph viewer and the Timeline viewer. Which sucks, but since I'm not doing any color work in the Timeline, and I made sure my footage was properly color-managed when I was doing the compositing work in the Node Graph, I know it'll look right when I move everything to Resolve.

Color Correction

When the VFX work is complete and the sequence is ready to be color corrected, I export an XML of the timeline from Nuke Studio. Why an XML instead of an AAF like the one I exported from Avid? Because, while Nuke Studio is capable of importing AAF, XML, and EDL files, it's only able to create XML and EDL files.

No, I don't know why.

And, since Avid can only create AAF and EDL files, I have to use 2 separate "professional file interchange formats" in my workflow. Makes total sense.

EDL is out of the question because it's the oldest of all exchange formats, with the fewest features, and a separate EDL has to be created for every video track in a given timeline. No thank you.

Resolve typically imports and relinks the XML without any issues. It does, however, fail to recognize that the EXR sequences have handles, so the VFX shots need to be slipped 12 frames in the timeline before proceeding to color correction.

A Question for the Audience

Being that all of our footage is Alexa LogC footage, at some point, as part of the color correction process, we should be using Resolve's built-in 3D Alexa LUT to linearize the footage and convert it to monitor-space. But, if you recall from earlier, Resolve doesn't separate the linearization LUT from the viewer LUT like Nuke does. So the question is: do we apply the LUT to the clips in the Media page before color correcting, or as a Node on the Color page? Should it be the first or last Node on a shot? Or added to the Timeline so all clips can be corrected with one global Node? Some of those options affect preview thumbnails in the app. Does that bother you?

Clearly, I'm not sure of the correct answer. I've found that, no matter which option I choose, the results are questionable.

Ideally, we would linearize the footage prior to performing any color transformations so our math is correct, then use a viewer LUT to view the tone-mapped image in monitor-space. Just like Nuke does. But, again, we don't have those kinds of tools in Resolve (by default). This is not simply an issue of semantics. While it's possible to arrive at the same final image regardless of the order in which you add the LUT, getting it backwards will adversely affect the experience of using the color correction tools.

Adding the LUT before color correcting makes the Color Wheel tools in Resolve sensitive to the point that small color changes are near impossible. Which is understandable when you think about it. The wheels are expecting an input image with linear values across a certain range. When the values are compressed with an inverse-Log gamma curve, they will not change as you'd expect them to when you move the color wheel.

And this goes for almost any color space in any color correction tool. If you've ever had the experience of dragging a color wheel or slider and seen the image change much more dramatically than you expected, chances are the color tool was expecting a different gamma or color space than the image being fed into it.
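Here's a rough demonstration of why, using the commonly published Alexa LogC (V3, EI 800) constants. The same modest gain lands far harder on log-encoded values than on linearized ones:

    import math

    # Arri LogC V3 encode/decode for EI 800. Constants are the commonly
    # published ones; check them against Arri's white paper before
    # trusting the numbers.
    CUT, A, B = 0.010591, 5.555556, 0.052272
    C, D, E, F = 0.247190, 0.385537, 5.367655, 0.092809

    def logc_encode(x):
        return C * math.log10(A * x + B) + D if x > CUT else E * x + F

    def logc_decode(t):
        return (10 ** ((t - D) / C) - B) / A if t > E * CUT + F else (t - F) / E

    grey, gain = 0.18, 1.2  # scene-linear mid grey; a small gain adjustment

    print(grey * gain)                            # 0.216 -- about a quarter stop up
    print(logc_decode(logc_encode(grey) * gain))  # ~0.383 -- over a full stop up

Same slider, wildly different result, entirely because of what the values looked like when the tool received them.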

Which is why, while a pain in the ass to use, the 2-step process of linearizing Alexa footage in Nuke is necessary to apply mathematical operations correctly. It's too bad Nuke's color correction tools are a bigger disaster than Resolve's LUT issues.

Regardless, this issue is the reason I was hesitant for a long time to make Resolve my primary color correction application. It has a (mostly) fixed UI and I found the small size of the color wheels and sliders frustrating to manage with a mouse or Wacom pen. But my frustration was due almost entirely to the sensitivity created by using incorrect LUT settings on my clips. When clips are correctly color managed, Resolve's color tools are much easier to use. Not great, but easier.

This is why, up until this year, I color corrected my projects in After Effects with Red Giant Colorista. I love Colorista. Love it. Its color wheels are dampened for smooth adjustments and include fast, easy-to-use tools for HSL adjustments that are a dream to work with. The reasons I went in search of a dedicated color correction application like Resolve had nothing to do with Colorista's toolset, and everything to do with my frustrations with After Effects.

Back to Color Correction

Assuming we've struggled our way through the LUT and interface difficulties and created a color corrected image we're happy with, the next step is to export the clips.

Depending on the project, this step will vary. If all work on the "picture" portion of the video is now complete, a single 1080p ProRes 4444 QuickTime file will be rendered of the entire sequence.

If the project requires Motion Graphics work that I either couldn't, or didn't want to, create during the VFX stage, each shot will be rendered as a separate ProRes 4444 QuickTime file.

The clips are rendered into their own subfolder, with their original names plus some sort of modifier appended to the filename, indicating they are the color corrected versions of the clip (typically _CC). As long as the names are consistent, they're easily relinked to an XML with Nuke Studio's bevy of conforming tools.

Motion Graphics

Assuming this is a project that needs Motion Graphics work, the individually-rendered shots are re-conformed into a sequence in one of two possible applications.

If I can do the work in Nuke, I will. Nuke isn't necessarily built for motion graphics, but it is my app of choice for most tasks. And, with the addition of my Node Sets gizmos, it's not as difficult to coordinate complex animations as it once was. The color-corrected QuickTimes are conformed back into Nuke Studio with the Build Track tool, and work begins, similar to the VFX phase of post.

There are, however, certain motion graphics tasks that are better suited to being completed in After Effects (read: anything to do with text). In which case the XML that was previously imported into Resolve is imported into After Effects using the Pro Import After Effects option, formerly known as Automatic Duck. The sequence comes in offline, and each clip is manually relinked to the corresponding color-corrected file.

Yes, I could conform the color-corrected plates back into Nuke Studio and generate a new XML that references the color-corrected plates in order to save myself the hassle of manually relinking incorrectly named clips in AE but, more often than not, that method fails to reconnect all clips and manual relinking is needed anyway so I save myself some time and go straight to manually relinking.

I've also experimented with importing and relinking the XML in Premiere, thinking the NLE would have better luck relinking the clips than the compositing application, in which case I could use Send to After Effects to get it into AE. More complexity, more idiosyncrasies, more failures, more time wasted.

Once motion graphics work is completed, our picture should be locked. A single ProRes 4444 QuickTime file is rendered of the final timeline. This is one instance where I prefer to be working in After Effects. Though AE's renderer can be slow and is not without the occasional glitch, its stability and speed are miles ahead of Nuke Studio's, especially with regard to QuickTime files.

By my estimation, the number of successful QuickTime renders I've created with Nuke Studio is likely a single digit percentage. And the time taken to perform the render is somewhere between 2x and 10x the amount of time of other applications.

I recently tried to use Nuke Studio to create dailies instead of using Resolve. The estimated time to create ~20 minutes of dailies was 12 hours and the render failed on the first clip. Resolve knocked out the render on the first try (after correcting the pixel aspect ratio) in less than 30 minutes.

Have I mentioned that Resolve is free and Nuke Studio costs $10,000? I think it's worth mentioning again. Resolve is free and Nuke Studio costs $10,000.

Sound

With picture locked and rendered, now it's time to work on sound. Our original AAF file from Avid is imported into Final Cut Pro 7 with Automatic Duck. Yes, just like FCP 7 itself, the Automatic Duck plugin still works.

The sequence that comes in is our offline edit, but with the original WAV files I manually conformed in Avid prior to the AAF export. The ProRes 4444 file of our finished picture is imported and lined up with the offline timeline. Once aligned, all the offline video tracks are deleted, leaving just the final picture file and the offline audio. Using Reconnect Media, all audio clips are relinked to their original files.

A Quick Sidebar

One hiccup I frequently run into is the naming convention used by whatever audio recorder my sound recordist uses on set. The CF card he hands me on set is full of WAV files with names like 12T03. That is, Scene 12, Take 3. For some reason, the metadata for that file is named 12/03. I assume the / isn't in the file name because whatever file system the recorder is using doesn't play nicely with having a forward-slash in the name.

While I'm sure there's probably a way to have the recorder use the same file name in both places, what this issue requires of me in post is that I batch rename a copy of the original sound files, replacing the T with the / that FCP is searching for. For this task I use Name Mangler, but it could just as easily be done with OS X's batch renaming tools.
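The rename itself is trivial. Here's roughly what the Name Mangler step does, sketched with a placeholder path (and run on a copy of the card, never the original):

    import glob
    import os

    # At the POSIX level, macOS stores a Finder "/" as ":", so writing
    # "12:03.WAV" here produces the "12/03.WAV" name FCP is searching for.
    for wav in glob.glob("/path/to/SoundCopy/*.WAV"):
        folder, name = os.path.split(wav)
        os.rename(wav, os.path.join(folder, name.replace("T", ":", 1)))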

Once the files are renamed, Reconnect Media in FCP 7 will now find and relink my files. Next, all audio clips are selected and sent to Soundtrack Pro with the Send to Soundtrack Pro Multitrack Project option.

Once in Soundtrack Pro, I edit, mix, and adjust my audio. I've been using Soundtrack Pro for years. I use it every time I edit an episode of Defocused, so I'm very comfortable and quick with its tools. Some day, when Soundtrack Pro inevitably breaks due to an OS update, I'll likely move to Logic Pro. But, as it stands, this old app does everything I need it to do and it does it quickly.

When work is completed, I export an AIFF file of the timeline and import it back into my FCP 7 project. I duplicate the sequence, keep the final picture, delete all the offline audio, and replace it with my final AIFF audio.

I now have a timeline with a single video track containing the finished picture, and a single audio track containing the finished sound. In some instances I'll end up with 2 audio tracks: one for music, one for everything else. But it's rare.

Final Exports

From the final timeline, a Same As Source QT file is exported to my 1080 folder and the final client encode is created. The Same As Source (ProRes 4444) file and the H264 file are saved together in the project directory.

Revisions

Sometimes, despite our best efforts, revisions need to be made after a "final" file has been delivered. In those instances, I will back up to the part of the process that needs updating, and the individual shot will be taken through the remaining steps by itself.

Once that shot has been updated, it will be placed on a second video track in the final FCP 7 timeline, above the previous clip. From there a new "final" encode will be made.

Since this step is almost always reserved for the last 1 or 2 shots that just need a small tweak, I've never found that the "final" timeline gets too cluttered with single shot revisions. If more than a couple shots need adjusting, the whole timeline goes back through the process with a new version number.

There is Too Much, Let Me Sum Up

Now that we're past the "how it's done" portion of this blog post, let's delve into some opinions.

Avid Media Composer

As I mentioned above, I still think Avid Media Composer is the best tool for editing. Every time I attempt to stray from it, trying something new, I always find myself rushing back to that ugly, antiquated interface that just gets the job done better than the competition.

That said, there are many parts of Media Composer that suffer badly from Avid's "if it ain't broke don't fix it" approach to software evolution. The effects system is stuck in the 1990s and complex animation is best left to other applications. The color correction tools are laughable. The lack of sub-frame audio adjustment confounds me. And things like the 0-255 or 16-235 color ranges, and crop/pad/resize import options, make the application feel like it was built for technicians, not artists.

I could go on, but since most of my complaints are the same complaints we've all had for many years, I'm sure you've heard them all before.

Nuke Studio

Nuke Studio's relinking and conforming tools are easily some of the best and most powerful tools in any application I've used. The timecode and metadata adjustments that can be made in the Spreadsheet tool are basically magic.

The problem with Nuke Studio is really that it's an application full of incredible tools that don't inter-operate with each other in a coherent or successful manner. The current paradigm of the separate Hiero Timeline and Nuke Node Graph is so confusing and broken that I wonder how anyone who wasn't previously using the individual applications could ever understand this impossibly complex, monstrous application.

Even when using the application exactly as intended, I often run into strange edge-cases that create unexpected results. And, for an application that creates and manages numerous, connected files on your hard disk, with the idea that they'll be distributed to artists on a team, oftentimes backing up and trying again when you encounter an issue is more of a hassle than just pressing on with whatever unusual Nuke script was created when you clicked "Create Comp".

Recently on a project, each of my Nuke scripts had 30 crop nodes after each Read node (one for each clip in the timeline). Only 1 was turned on and active, and the output was correct, so why bother figuring out what caused it? Just shake your head at the expensive application and move on.

Still, NukeX remains the best compositing platform on the market. And, in spite of its issues, Nuke Studio does things that no other application I'm aware of can do. The conforming tools, the versioning tools, the roundtripping through VFX back to the timeline. All incredible features that are so great to have at your disposal if/when they work correctly.

AAF/XML/EDL Support and Interoperability

The appeal of Nuke Studio is the promise of seamlessly connecting an editorial timeline to powerful visual effects and color correction tools and applications. But we had successful post-production pipelines before the creation of Nuke Studio, made possible by the interchange of data via EDL, XML, and AAF files. These files are intended to be an open, universal language understood by video and audio applications.

In practice, support for these formats is frequently incomplete. I mentioned Nuke Studio's confounding lack of audio import support, but the one that really gets me is the exchange of basic effects. At some point, support for recognizing effects in AAF files was added to Nuke Studio.

Transforms usually come in correctly. Dissolves are usually deleted in favor of creating them again in NukeX because the Nuke Studio timeline doesn't honor clip transparencies by default, creating confusing results when dissolves are added to video tracks above V1. Nearly all other effects in the AAF file are ignored.

Specifics aside, there's no real way to know which pieces will and won't be recognized by a given application in your pipeline. And most applications won't let you know there were additional effects in the file they were unable to interpret. Which is why that QuickTime reference export of the locked offline edit is so important. All we can really count on is that a basic sequence of clips will move from one application's timeline to another.

While I don't expect complete compatibility of files between competing applications, I find the current state of exchange tools and formats hugely disappointing.

GPU Acceleration and Other Ways to Ruin Your Day

Hands down, my most frustrating daily obstacle is the GPU in my Early 2013 Retina MacBook Pro. That's right, I do all of this work on a laptop connected to 2 additional external monitors. I love the portability and flexibility of this setup.

But with every software update of Nuke or After Effects or Resolve, more tools within the applications are being "accelerated" by offloading their processing to the GPU. In Nuke, I have the ability to override that acceleration and tell the application to process the effects on the CPU. In Resolve, I do not.

And the result is, depending on the type of footage I'm working with, I'll launch Resolve, open my project, and immediately be presented with a dialog box telling me my GPU memory is full. If I dismiss the message and attempt to do any work, even something as simple as scrubbing the editorial timeline, my computer will instantly lock up and kernel panic.

The only solution, when presented with this dialog, is to immediately quit the application and perform a full reboot of the computer to purge the GPU memory. Not much of a solution. And even then, medium-sized Resolve projects using inter-frame compressed video formats are able to max out the GPU with zero other applications competing for memory.

In all seriousness, the best solution to this problem is to buy a bigger, more expensive computer.

And Finally

At the end of the day, I don't feel great about my post-production workflow. I spend way too much time thinking to myself "there's got to be a better way to do this". And I'll continue to spend too much time thinking that until I find a new, less bad solution.

Or until my frustrations grow large enough to make me start my own software company and build the tools I've been desperately searching for. There's a reason so much of this site is dedicated to custom Gizmos and Python scripts. I continue to be unsatisfied with the tools I use to do my job.

Though, knowing the person I am, I'm not sure I'll ever entirely rid myself of that feeling.