GNOME Bugzilla – Bug 663617
[gstffmpegdec] Don't use frame-based Multi-threading when upstream is live
Last modified: 2013-04-05 16:36:09 UTC
When using decoders with live streams, we shouldn't use frame-based multi-threading (FF_THREAD_FRAME), since it introduces a latency of one frame per thread. When receiving the first buffer, we should query upstream whether it is a live stream, and if so, set context->thread_type to FF_THREAD_SLICE only.
I'm not sure what the link is between zero-latency mode and a live stream. I think it's perfectly acceptable to add some latency to a live stream in some cases. The threading mode should probably be a property on the elements.
(In reply to comment #1)
> I'm not sure what the link is between zero latency mode and a live stream?

I don't think I mentioned zero-latency (which is a codec feature). Some people like having low latency when doing real-time communication. Even with a zero-latency (no B-frames) stream, if you have a 6-core machine and FF_THREAD_FRAME, you'll end up adding six frame-durations of latency (i.e. the decoded frame will come *out* of the decoder that much later). At 10 fps, that's 0.6 s ... not cool.

> I think its perfectly acceptable to add some latency in a live stream in some
> cases.

Care to expand on that?

> The threading mode should probably be a property on the elements.

We could expose it, sure.
This is really the same as bug #696501. Marking this as a duplicate of the other one, since the situation is now inverted (we now use FF_THREAD_SLICE and want to use FF_THREAD_FRAME whenever possible, for better performance). *** This bug has been marked as a duplicate of bug 696501 ***