Bug 590681 - audio/video synchronization problem

Status: RESOLVED OBSOLETE
Product: GStreamer
Classification: Platform
Component: don't know
Version: 0.10.22
Hardware: Other
OS: All
Priority: Normal
Severity: normal
Target Milestone: git master
Assigned To: GStreamer Maintainers
Depends on:
Blocks:

Reported: 2009-08-03 20:52 UTC by Luc
Modified: 2011-05-20 07:21 UTC
See Also:
GNOME target: ---
GNOME version: 2.25/2.26



Description Luc 2009-08-03 20:52:41 UTC
Please describe the problem:
Unable to keep video and audio synchronized while recording video from rtspsrc and audio from alsasrc.

Steps to reproduce:
1. gst-launch \
	alsasrc device="hw:0,0" ! queue ! wavenc ! mux.\
	rtspsrc latency=0 location=rtsp://cam ! queue ! rtpjpegdepay ! mux. \
	...mux name=mux ! filesink location=outfile

2. gst-launch \
	alsasrc device="hw:0,0" ! queue ! wavenc ! filesink location=out.wav \
	rtspsrc latency=0 location=rtsp://cam ! queue ! rtpjpegdepay ! filesink location=out.mjpeg  &
 


Actual results:
When using an internal ICH8 sound card with (1) or (2), I got many "audio frames dropped" messages in the console (sometimes immediately, sometimes after up to 20 minutes), so audio/video synchronization is lost.

When using an external USB sound card with (2), there are no more "audio frames dropped" messages, but the framerate reported in the output file differs from the framerate reported in the SDP attributes, and even when muxing with the framerate reported in the SDP, synchronization is lost.

The best solution I've found is to stretch or shrink the audio track so that it matches the computed video duration (= number_of_video_frames / framerate_stored_in_video_file).

Expected results:
I'd like to be able to mux while I'm recording, and to avoid the audio shrink/stretch step.

Does this happen every time?
yes

Other information:
It seems that the framerate reported in the SDP header is not reliable; maybe it should be computed dynamically.

Let me explain my synchronization problems:

I had to save the audio and the video separately because I got many "audio frames dropped" messages in the console when trying to mux at the same time.

When I used the following chain, it seemed to work, but after some time "audio frames dropped" still happened (the audio interface was an Intel ICH8 82801H rev 03):

gst-launch \
	alsasrc device="hw:0,0" ! queue ! wavenc ! filesink location=out.wav \
	rtspsrc latency=0 location=rtsp://cam ! queue ! rtpjpegdepay ! filesink location=out.mjpeg  &

Using an external USB audio card solved the "audio frames dropped" problem, but then the audio track duration did not match the duration of the video track: the framerate reported when playing out.mjpeg was an integer value (25), different from the framerate reported in the SDP attributes (a real number).

When trying to mux with mencoder, or just to preview with:
mplayer out.mjpeg -fps [fps announced in sdp] -audiofile out.wav
or when using videorate, the synchronization is not perfect.

To have video and audio synchronized perfectly, I must stretch or shrink the sound file (using rubberband) so that it matches the video duration computed by dividing the number of video frames by the integer framerate of the output video file.

Then I can encode in two passes with "mplayer -vo yuv4mpeg:file=fifo.y4m" and ffmpeg -vcodec libx264.

You understand it would be nice to avoid the "rubberband" step. :)
Comment 1 Sebastian Dröge (slomo) 2011-05-20 07:21:34 UTC
This should work nowadays. The muxers now use the running time to synchronize, which was probably the cause of the broken a/v sync you saw before.
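For reference, a single-pipeline capture along the lines of the reporter's step 1 might look like this on a modern GStreamer 1.x install. This is an untested sketch: matroskamux as the container, vorbisenc for the audio, and the rtsp://cam URL are assumptions, and the exact pipeline depends on the camera and sound card:

```shell
# Sketch, untested: mux RTSP MJPEG video and ALSA audio into one file.
# matroskamux synchronizes its input streams on running time, which is the
# behavior the resolving comment refers to.
gst-launch-1.0 -e \
	alsasrc device="hw:0,0" ! queue ! audioconvert ! vorbisenc ! mux. \
	rtspsrc latency=0 location=rtsp://cam ! queue ! rtpjpegdepay ! mux. \
	matroskamux name=mux ! filesink location=out.mkv
```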