Is anyone using Draft these days? Or rolling their own with ffmpeg?
My first try with Draft is already throwing up issues: the result doesn’t match what I would get out of Nuke with a simple h264 encode.
Draft appears to be applying the rec601 YCbCr matrix instead of the default rec709. It’s not a huge colour shift, but enough for me not to trust this in production.
There doesn’t appear to be any way in Draft to configure this aspect of the transcode.
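For reference, if I end up rolling my own, I assume the ffmpeg side would need something like this to force the rec709 matrix and tag the stream accordingly (untested sketch; file names, frame rate and codec settings are placeholders):
ffmpeg -framerate 24 -i shot.%04d.png -vf "scale=out_color_matrix=bt709:out_range=tv" -colorspace bt709 -color_primaries bt709 -color_trc bt709 -c:v libx264 -pix_fmt yuv420p out.mp4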
Hi @Patrick!
We moved away from Draft some time ago and now use a combination of oiiotool and ffmpeg. Personally I’m very happy with it and haven’t looked back since. The idea and setup of Draft aren’t bad; it just hasn’t been pushed forward in years. There were/are a couple of deal-breakers for me that make it unusable:
- no ProRes support
- no way to apply an OCIO look or view
- I think OCIO is also still on 1.0 (could be wrong here)
- DNxHD but no DNxHR support
- no noticeable development in all the years that I used it
So my recommendation goes against it, but I’m curious about other opinions!
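To give a concrete idea, what we run now boils down to something like this (file names, frame rate and the OCIO display/view names are placeholders, so treat it as a rough sketch rather than our actual setup):
# per-frame OCIO display/view transform with oiiotool (run per frame, or use its frame-sequence wildcards)
oiiotool shot.1001.exr --ociodisplay sRGB Film -d uint16 -o tmp/shot.1001.tif
# then a ProRes 422 HQ encode of the converted sequence
ffmpeg -framerate 24 -start_number 1001 -i tmp/shot.%04d.tif -c:v prores_ks -profile:v 3 -pix_fmt yuv422p10le dailies.mov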
Cheers
Hi Patrick
We were using Draft 6+ years ago until we switched to a fully custom solution using OpenImageIO and FFmpeg. My starting point was this repo: GitHub - jedypod/generate-dailies (a tool to transcode scene-linear OpenEXR images into display-referred QuickTime movies).
The interesting part of the solution is that there’s no intermediate step converting to jpegs before feeding them to ffmpeg. A Python script processes the input EXRs using the OpenImageIO Python bindings, and the result is piped directly to ffmpeg. OpenImageIO has all the features I need (OCIO and adding slate information come to mind, but you can also do some comping with it).
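A stripped-down sketch of the piping idea (not our actual script; resolution, frame rate, colour-space names and codec settings are just placeholders):

import subprocess
import OpenImageIO as oiio

frames = ["renders/shot.%04d.exr" % f for f in range(1001, 1101)]
width, height, fps = 1920, 1080, 24

# ffmpeg consumes raw 8-bit RGB frames on stdin, so nothing touches disk in between
ffmpeg = subprocess.Popen(
    ["ffmpeg", "-y",
     "-f", "rawvideo", "-pix_fmt", "rgb24",
     "-s", "%dx%d" % (width, height), "-r", str(fps), "-i", "-",
     "-c:v", "libx264", "-pix_fmt", "yuv420p",
     "-colorspace", "bt709", "-color_primaries", "bt709", "-color_trc", "bt709",
     "dailies.mov"],
    stdin=subprocess.PIPE)

for path in frames:
    buf = oiio.ImageBuf(path)
    # scene-linear -> display via OCIO; the space names depend on your config
    buf = oiio.ImageBufAlgo.colorconvert(buf, "linear", "sRGB")
    # quantise to 8-bit and hand the raw bytes straight to ffmpeg
    pixels = buf.get_pixels(oiio.UINT8)  # numpy array, height x width x channels
    ffmpeg.stdin.write(pixels[:, :, :3].tobytes())

ffmpeg.stdin.close()
ffmpeg.wait()

Slates and burn-ins would slot in between the colour conversion and the write, which is where ImageBufAlgo’s text and compositing functions come in.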
My only issue with our solution is encoding speed. I don’t get more than about 6 frames per second, I think. It’s very variable and there are a lot of variables in play, but it is still very slow compared with encoding from DaVinci Resolve, for instance (or from Adobe Media Encoder).
Regards
Thanks for the replies! I’ll take a look at that dailies repo, it looks like a nice starting point.
We still use Draft, but want to move away from it. OIIO + FFmpeg is the logical choice, just need to get around to doing that.
Edit: to apply a LUT in Draft, you need to create a LUT object:
from Draft import LUT

# point Draft at the OCIO config, then build a processor between the two colour spaces
LUT.SetOCIOConfig(ocio_config)
lut = LUT.CreateOCIOProcessor(input_color_space, output_color_space)
...
lut.Apply(bgFrame)
Yeah, it doesn’t support looks out of the box; they would need to be defined as colorspaces too.
We parallelize the encode in chunks and then have a final job that stitches all the chunks together. Since the final job only copies the chunk streams into a single file without re-encoding, it is quite fast with FFmpeg.
Out of curiosity, are you encoding to mp4? Just wondering how you can stitch chunks together if you have B-frames.
They probably encode to ProRes or DNxHD/HR, which I believe can be stitched.
We use WebM. We found we could get smaller files at higher quality than h264. Also, Firefox natively supports WebM.
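For anyone curious, a VP9/WebM encode looks roughly like this (quality settings are just an example, tune to taste):
ffmpeg -framerate 24 -i shot.%04d.png -c:v libvpx-vp9 -crf 30 -b:v 0 -pix_fmt yuv420p dailies.webm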
Use concat in ffmpeg to string the chunks together. It should work for h264 (as well as WebM).
https://trac.ffmpeg.org/wiki/Concatenate
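For example, with the concat demuxer and stream copy (file names are placeholders):
# chunks.txt lists the pieces in order:
#   file 'chunk_000.webm'
#   file 'chunk_001.webm'
ffmpeg -f concat -safe 0 -i chunks.txt -c copy stitched.webm
Since each chunk is encoded independently it starts on a keyframe, so B-frames don’t reference across chunk boundaries; as long as all chunks share the same encoding settings the stream copy should hold up.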