GNOME Bugzilla – Bug 608770
Unable to see rtp stream (udpsrc)
Last modified: 2010-02-09 09:53:49 UTC
I have an RTP stream that GStreamer is unable to handle; a network dump is attached.
Created attachment 152808 [details] network dump
what is it supposed to be? where does it come from? how did you try to play it in gstreamer? so many questions...
It is an MPEG-4 Simple Profile stream coming from a Verint Nextiva S1970e encoder (http://verint.com/video_solutions/microsite.cfm?article_level2a_id=318). I tried:

gst-launch-0.10 -v udpsrc port=2546 caps="application/x-rtp, media=(string)video, payload=(int)121, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink
Created attachment 152828 [details] sample video with the pipeline in previous post
Here is the output from the posted gst-launch:

New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpMP4VDepay:rtpmp4vdepay0.GstPad:src: caps = video/mpeg, mpegversion=(int)4, systemstream=(boolean)false
/GstPipeline:pipeline0/GstRtpMP4VDepay:rtpmp4vdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)121, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1
/GstPipeline:pipeline0/ffdec_mpeg4:ffdec_mpeg40.GstPad:sink: caps = video/mpeg, mpegversion=(int)4, systemstream=(boolean)false
/GstPipeline:pipeline0/ffdec_mpeg4:ffdec_mpeg40.GstPad:src: caps = video/x-raw-yuv, width=(int)0, height=(int)0, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false
/GstPipeline:pipeline0/ffdec_mpeg4:ffdec_mpeg40.GstPad:src: caps = video/x-raw-yuv, width=(int)704, height=(int)576, framerate=(fraction)30000/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)12/11
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0/GstXvImageSink:autovideosink0-actual-sink-xvimage.GstPad:sink: caps = video/x-raw-yuv, width=(int)704, height=(int)576, framerate=(fraction)30000/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)12/11
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink: caps = video/x-raw-yuv, width=(int)704, height=(int)576, framerate=(fraction)30000/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)12/11
/GstPipeline:pipeline0/GstAutoVideoSink:autovideosink0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-raw-yuv, width=(int)704, height=(int)576, framerate=(fraction)30000/1, format=(fourcc)I420, interlaced=(boolean)false, pixel-aspect-ratio=(fraction)12/11
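Side note, purely as a sketch: the same receive path can also be run through gstrtpbin, which puts an RTP jitter buffer in front of the depayloader. The port and caps below are simply copied from the pipeline above; this is not expected to change the size/interlacing behaviour, only to rule out packet reordering as a factor:

# no RTCP ports are wired up here, only the RTP receive path
gst-launch-0.10 -v gstrtpbin name=rtpbin \
  udpsrc port=2546 caps="application/x-rtp, media=(string)video, payload=(int)121, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" ! rtpbin.recv_rtp_sink_0 \
  rtpbin. ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink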
This seems like interlaced content with both fields after each other in the buffer.
Is it something GStreamer can handle? Do you need more info?
I don't see any indication (neither in the RFC nor in the ffmpeg code) that the video is laid out like this, so I don't know how to detect or decode it properly in an automatic way.
The same encoder can also send JPEG over RTP, but this seems problematic too. I tried:

gst-launch-0.10 -v udpsrc port=2546 caps="application/x-rtp, media=(string)video, payload=(int)26, clock-rate=(int)90000, encoding-name=(string)JPEG" ! rtpjpegdepay ! ffdec_mjpeg ! autovideosink

and got the following error:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)26, clock-rate=(int)90000, encoding-name=(string)JPEG
ERROR: from element /GstPipeline:pipeline0/ffdec_mjpeg:ffdec_mjpeg0: Internal GStreamer error: negotiation problem. Please file a bug at http://bugzilla.gnome.org/enter_bug.cgi?product=GStreamer.
Additional debug info:
gstffmpegdec.c(2638): gst_ffmpegdec_chain (): /GstPipeline:pipeline0/ffdec_mjpeg:ffdec_mjpeg0:
ffdec_mjpeg: input format was not set before data start
Execution ended after 8117389081 ns.
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = NULL
/GstPipeline:pipeline0/GstUDPSrc:udpsrc0.GstPad:src: caps = NULL
Setting pipeline to NULL ...
Freeing pipeline ...
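Side note, untested against this stream: rtpjpegdepay already outputs parsed image/jpeg frames, so jpegdec from gst-plugins-good might be worth trying in place of ffdec_mjpeg. This is only a sketch and will not help if the stream itself is not standard RTP/JPEG:

gst-launch-0.10 -v udpsrc port=2546 caps="application/x-rtp, media=(string)video, payload=(int)26, clock-rate=(int)90000, encoding-name=(string)JPEG" ! rtpjpegdepay ! jpegdec ! ffmpegcolorspace ! autovideosink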
Created attachment 152831 [details] jpeg over rtp
There was a setting in the encoder, Frame Format: Field over Field. I changed it to DeInterlaced, but nothing seems to have changed yet; ffdec_mpeg4 still doesn't recognize the frame size (width and height should be 720x288, not 0x0):

/GstPipeline:pipeline0/ffdec_mpeg4:ffdec_mpeg40.GstPad:src: caps = video/x-raw-yuv, width=(int)0, height=(int)0, framerate=(fraction)25/1, format=(fourcc)I420, interlaced=(boolean)false
This looks like a not-so-clever method of forcing an interlaced stream into a format that doesn't handle interlaced (i.e., MPEG-4 SP). If you have the option of using ASP or another MPEG-4 profile, choose it.
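If switching profile is not an option and the two fields really are stacked vertically in the decoded 704x576 picture, a crude workaround (just a sketch, untested against this stream; videocrop/videoscale come from gst-plugins-good/-base) would be to crop away the bottom field and scale the remaining one back up, at the cost of half the vertical detail:

gst-launch-0.10 -v udpsrc port=2546 caps="application/x-rtp, media=(string)video, payload=(int)121, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" ! rtpmp4vdepay ! ffdec_mpeg4 ! videocrop bottom=288 ! videoscale ! video/x-raw-yuv,width=704,height=576 ! autovideosink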
No, I can only choose between MPEG-4 SP and MJPEG, and for both I can set the frame format to Field over Field, Interlaced Frame, or DeInterlaced Frame.
The dump in comment 10 does not seem to contain RFC 2435 compliant JPEG in the RTP data. Can you make a dump of the MPEG-4 stream with the frame format set to DeInterlaced?
Created attachment 152920 [details] dump for MPEG-4 Simple Profile, resolution 2CIFH (single field) (704x288), Frame Format DeInterlaced Frame
The dump from comment 15 seems to work for me. The decoder initially receives bogus data but then figures out the real size.
Which pipeline? This one doesn't work for me:

gst-launch-0.10 -v udpsrc port=2546 caps="application/x-rtp, media=(string)video, payload=(int)121, clock-rate=(int)90000, encoding-name=(string)MP4V-ES, profile-level-id=(string)1" ! rtpmp4vdepay ! ffdec_mpeg4 ! autovideosink

Nicola
My fault, it works now, thanks.