I recently tried figuring out how to build ffmpeg with Nvidia codecs. I'm very new to ffmpeg and codec terminology. How is multithreading in the CLI different from multithreading in the codecs?
My understanding is that, before this change, the different parts of the pipeline had to run one after another even though each part was itself multithreaded; now the different parts can all run in parallel. A rough mental model is sketched below.
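This is only a conceptual sketch, not FFmpeg's actual code, and all names here are made up for illustration: a "decoder" thread and an "encoder" thread connected by a bounded queue, so each stage keeps working while the other is busy, instead of one loop stepping through decode-then-encode for every frame.

```c
/* Conceptual sketch (not FFmpeg's real implementation): each pipeline
 * stage runs on its own thread and passes "frames" through a bounded
 * queue, so a slow encoder no longer stalls the decoder the way a
 * single sequential loop would.  Build with: cc -pthread pipeline.c */
#include <pthread.h>
#include <stdbool.h>
#include <stdio.h>

#define QUEUE_CAP  8
#define NUM_FRAMES 32

typedef struct {
    int items[QUEUE_CAP];
    int head, tail, count;
    bool done;                       /* producer finished, no more frames */
    pthread_mutex_t lock;
    pthread_cond_t not_empty, not_full;
} frame_queue;

static void queue_init(frame_queue *q) {
    q->head = q->tail = q->count = 0;
    q->done = false;
    pthread_mutex_init(&q->lock, NULL);
    pthread_cond_init(&q->not_empty, NULL);
    pthread_cond_init(&q->not_full, NULL);
}

static void queue_push(frame_queue *q, int frame) {
    pthread_mutex_lock(&q->lock);
    while (q->count == QUEUE_CAP)    /* back-pressure: wait for free space */
        pthread_cond_wait(&q->not_full, &q->lock);
    q->items[q->tail] = frame;
    q->tail = (q->tail + 1) % QUEUE_CAP;
    q->count++;
    pthread_cond_signal(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
}

/* Returns false once the producer is done and the queue has drained. */
static bool queue_pop(frame_queue *q, int *frame) {
    pthread_mutex_lock(&q->lock);
    while (q->count == 0 && !q->done)
        pthread_cond_wait(&q->not_empty, &q->lock);
    if (q->count == 0) {
        pthread_mutex_unlock(&q->lock);
        return false;
    }
    *frame = q->items[q->head];
    q->head = (q->head + 1) % QUEUE_CAP;
    q->count--;
    pthread_cond_signal(&q->not_full);
    pthread_mutex_unlock(&q->lock);
    return true;
}

static void queue_finish(frame_queue *q) {
    pthread_mutex_lock(&q->lock);
    q->done = true;
    pthread_cond_broadcast(&q->not_empty);
    pthread_mutex_unlock(&q->lock);
}

/* "Decoder" stage: produces frames as fast as it can. */
static void *decode_stage(void *arg) {
    frame_queue *q = arg;
    for (int i = 0; i < NUM_FRAMES; i++)
        queue_push(q, i);
    queue_finish(q);
    return NULL;
}

/* "Encoder" stage: consumes frames as they become available. */
static void *encode_stage(void *arg) {
    frame_queue *q = arg;
    int frame;
    while (queue_pop(q, &frame))
        printf("encoded frame %d\n", frame);
    return NULL;
}

int main(void) {
    frame_queue q;
    pthread_t dec, enc;
    queue_init(&q);
    pthread_create(&dec, NULL, decode_stage, &q);
    pthread_create(&enc, NULL, encode_stage, &q);
    pthread_join(dec, NULL);
    pthread_join(enc, NULL);
    return 0;
}
```

The bounded queue is the key design point: it lets both stages run at the same time while the back-pressure keeps a fast decoder from buffering unlimited frames ahead of a slow encoder. The codec-internal multithreading (e.g. slice or frame threads inside a decoder) still happens within each stage; this change is about the stages themselves no longer taking turns.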
The long-in-development work for a fully-functional multi-threaded FFmpeg command line has been merged!
FFmpeg is widely used across many industries for video transcoding, and in today's many-core world this is a terrific improvement for this key open-source project.
The patches include adding the thread-aware transcode scheduling infrastructure, moving encoding to a separate thread, and various other low-level changes.
The main loop and every component (demuxers, decoders, filters, encoders, muxers) are changed to use the previously added transcode scheduler.
There's a recent presentation on this work by developer Anton Khirnov.
It's terrific seeing this merged, and it will be interesting to see the performance impact in practice.