GNOME Bugzilla – Bug 740419
decoders don't report latency
Last modified: 2015-06-01 16:44:54 UTC
None of the gst-vaapi decoder implementations appear to set any min/max decoder latency, causing problems in live playback scenarios. Is there a way to get that info from the vaapi implementation?

Decoders should call gst_video_decoder_set_latency() to configure the latency, and the base class will use it to answer latency queries.
I think we could safely assume realtime performance for the HW decoder side, once submitted to the HW. However, for parsing/preparation purposes we'd need at least 1 frame of latency in general, with perfectly known unit boundaries (NALU, AU), and up to 2 frames when we need to wait for the start of the second frame to determine that the first frame is complete, i.e. scan for a start code and check that it is actually the start of the next frame. WDYT? Or do I misunderstand the expected concepts?
(In reply to comment #0)
> None of the gst-vaapi decoder implementations appear to set any min/max
> decoder latency, causing problems in live playback scenarios.
>
> Is there a way to get that info from the vaapi implementation?
>
> Decoders should call gst_video_decoder_set_latency() to configure the
> latency, and the base class will use it to answer latency queries.

For some codecs, there is a VA API to determine the number of MBs (macroblocks) that can be processed per second. However, to be honest, this is mostly unimplemented right now (VA-driver-wise), and IIRC that API is not even mainlined yet; it's in the "staging" branch for now.
Assuming realtime decode performance is almost certainly acceptable. Your understanding is correct - see http://cgit.freedesktop.org/gstreamer/gst-plugins-ugly/tree/ext/mpeg2dec/gstmpeg2dec.c#n772 for example.
Created attachment 303774 [details] [review]
vaapidecode: calculate decoding latency

This is a naïve approach to the calculation of the VA-API decoding latency. It also takes into account the case where the frame rate has some insane value.
Seems OK :)
(In reply to Jan Schmidt from comment #5)
> Seems OK :)

There's something that worries me:

    GstClockTime latency = gst_util_uint64_scale (2, fps_d, fps_n);

always returns 0. Is that value useful as latency (no latency)? But if we do

    GstClockTime latency = gst_util_uint64_scale (2 * GST_SECOND, fps_d, fps_n);

we usually get a positive number that, to me, makes sense, since, if I understand correctly, the decoder will produce the first frame after the duration of two frames.

    #include <gst/gst.h>

    void
    tururu (gint fps_n, gint fps_d)
    {
      GstClockTime l1, l2;

      l1 = gst_util_uint64_scale (2, fps_d, fps_n);
      l2 = gst_util_uint64_scale (2 * GST_SECOND, fps_d, fps_n);
      /* GstClockTime is a guint64, so use the portable format macro */
      g_print ("l1 = %" G_GUINT64_FORMAT " / l2 = %" G_GUINT64_FORMAT "\n",
          l1, l2);
    }

    int
    main (int argc, char **argv)
    {
      tururu (24, 1);
      tururu (5000000, 208333);
      tururu (24000, 1001);
      return 0;
    }
Oops - I should have checked the logic more carefully. Yes, you need to multiply by GST_SECOND.

So, in the case where packet boundaries are known, the latency can be set to

    GstClockTime latency = gst_util_uint64_scale (GST_SECOND, fps_d, fps_n);

and when the decoder needs to collect a ref frame and then a 2nd frame to find the end-of-packet boundary, it needs to be

    GstClockTime latency = gst_util_uint64_scale (2 * GST_SECOND, fps_d, fps_n);

but it's OK to use the larger value for both - it'll just add a fraction of a second to playback delay in live streaming.
(In reply to Jan Schmidt from comment #7)
> Oops - I should have checked the logic more carefully. Yes, you need to
> multiply by GST_SECOND

I said it because you don't multiply in gstmpeg2dec ;)

http://cgit.freedesktop.org/gstreamer/gst-plugins-ugly/tree/ext/mpeg2dec/gstmpeg2dec.c#n821

> So, in the case where packet boundaries are known, the latency can be set to
>
> GstClockTime latency = gst_util_uint64_scale (GST_SECOND, fps_d, fps_n);
>
> and when the decoder needs to collect a ref frame and then a 2nd frame to
> find the end-of-packet boundary, it needs to be
>
> GstClockTime latency = gst_util_uint64_scale (2 * GST_SECOND, fps_d, fps_n);
>
> but it's OK to use the larger value for both - it'll just add a fraction of
> a second to playback delay in live streaming.

Thanks again!
Attachment 303774 [details] pushed as 26cedfa - vaapidecode: calculate decoding latency