GNOME Bugzilla – Bug 341977
[asfdemux] wrong framerate/timestamps with geeks.wmv
Last modified: 2009-03-06 09:24:34 UTC
This clip is played back at the wrong framerate: http://unrule.info/files/geeks.wmv (Video: WMV7, Audio: WMA8)
The timestamps asfdemux sets on the video stream aren't exactly confidence-inspiring: one or two reasonable-looking timestamps at the beginning, then the rest are GST_CLOCK_TIME_NONE (with a few that look like (GstClockTime) <= -2) ... The decoder (ffdec_*) doesn't seem to fill in the missing timestamps with estimates based on the framerate either. (Curiously, mplayer reports a framerate of 1000.0 fps, whereas asfdemux decides on 50 fps, and 5176 frames over 2:53 minutes suggests something more along the lines of 30 fps.)
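As a quick sanity check on that last bit of arithmetic, here is a minimal standalone calculation; the frame count and duration are taken from the report above, not measured from the file itself:

  /* Back-of-the-envelope framerate implied by the reported numbers:
   * 5176 frames over 2:53 (values from this bug report, assumed correct). */
  #include <stdio.h>

  int main (void)
  {
    double frames   = 5176.0;
    double duration = 2 * 60 + 53;      /* 2:53 -> 173 seconds */
    double fps      = frames / duration;

    printf ("implied framerate: %.2f fps\n", fps);   /* prints ~29.92 fps */
    return 0;
  }

Which is indeed much closer to 30 fps than to either 50 or 1000.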
Is this bug still affecting people? I get a similar kind of problem, but with a very low framerate, using this command: gst-launch playbin uri=http://mpegmedia.abc.net.au/tv/enoughrope/vodcast/enoughrope_seinfeld1.wmv I don't seem to be able to play the above file. Also, if you download the file and play it locally, it seems to work fine; it is only streaming that seems to be affected for me.
The problems seem to be caused by this segment of code in the gst_asf_demux_process_chunk function:

  ...
  if (!stream->fps_known) {
    if (!stream->cache) {
      stream->cache = stream->payload;
    } else {
      gdouble fps;
      gint64 diff;
      gint num, denom;

      /* why is all this needed anyway? (tpm) */
      diff = GST_BUFFER_TIMESTAMP (stream->payload) -
          GST_BUFFER_TIMESTAMP (stream->cache);
      fps = (gdouble) GST_SECOND / diff;

      /* artificial cap */
      if (fps >= 50.0) {
        num = 50;
        denom = 1;
      } else if (fps <= 5.0) {
        num = 5;
        denom = 1;
      } else {
        /* crack alert */
        num = (gint) GST_SECOND;
        while (diff > G_MAXINT) {
          num = num >> 1;
          diff = diff >> 1;
        }
        denom = (gint) diff;
      }

      stream->fps_known = TRUE;
      stream->caps = gst_caps_make_writable (stream->caps);
      gst_caps_set_simple (stream->caps, "framerate",
          GST_TYPE_FRACTION, num, denom, NULL);
      GST_DEBUG ("set up stream with fps %d/%d", num, denom);

      gst_pad_use_fixed_caps (stream->pad);
      gst_pad_set_caps (stream->pad, stream->caps);

      ret = gst_asf_demux_push_buffer (demux, stream, stream->cache);
      stream->cache = NULL;
      ret = gst_asf_demux_push_buffer (demux, stream, stream->payload);
      stream->payload = NULL;
    }
  ...

I have run this through GDB, and "stream" is optimised out in my running version. However, the value of "diff" is 279000000. That would give a value for "fps" (which is also optimised out) of about 3.58, which is < 5, so the framerate gets capped at 5. I do not understand why the FPS should be calculated by comparing timestamps between buffers like this. Does anyone know why things are done this way?
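For reference, here is a minimal standalone sketch of the clamping logic quoted above, fed with the diff value observed in GDB. NSEC_PER_SEC stands in for GST_SECOND and INT_MAX for G_MAXINT; the harness itself is illustrative and not part of the asfdemux code:

  #include <limits.h>
  #include <stdio.h>

  #define NSEC_PER_SEC 1000000000LL   /* stands in for GST_SECOND */

  int main (void)
  {
    long long diff = 279000000LL;     /* payload timestamp delta seen in GDB */
    double fps = (double) NSEC_PER_SEC / (double) diff;
    int num, denom;

    if (fps >= 50.0) {                /* artificial upper cap */
      num = 50;
      denom = 1;
    } else if (fps <= 5.0) {          /* artificial lower cap */
      num = 5;
      denom = 1;
    } else {
      /* express the interval as a fraction, shifting until it fits an int */
      long long d = diff;
      num = (int) NSEC_PER_SEC;
      while (d > INT_MAX) {
        num >>= 1;
        d >>= 1;
      }
      denom = (int) d;
    }

    printf ("raw fps = %.2f, clamped framerate = %d/%d\n", fps, num, denom);
    /* with diff = 279000000 ns this prints: raw fps = 3.58, 5/1 */
    return 0;
  }

So a 279 ms gap between the first two payloads is enough to drive the estimate below the lower cap, and the stream gets advertised as 5/1 regardless of its real framerate.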
OK. This seems to be a duplicate of bug 560348. The code I pointed to has been completely rewritten (since Nov 08); it just hasn't propagated to the distro I use yet. I will try to see whether this is fixed with the new code.
I can't get the file from the initial report, but the command line in comment #2 works fine. It's indeed a duplicate of bug 560348.

*** This bug has been marked as a duplicate of 560348 ***