GNOME Bugzilla – Bug 796598
x264enc packaged with Ubuntu 18.04 bionic does not work
Last modified: 2018-06-16 22:32:36 UTC
I'm running a freshly installed copy of Ubuntu 18.04 with the pre-packaged GStreamer 1.14.0. On executing the pipeline:

gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc ! h264parse ! rtph264pay ! udpsink

...the output is the following before crashing:

Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.000707635
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

I have verified that /dev/video0 is online and supports 480p raw video. The same pipeline works correctly on my second laptop running Ubuntu 16.04. Lastly, other pipelines that do not use x264enc, such as gst-launch-1.0 v4l2src ! jpegdec ! xvimagesink, work fine.
Could you attach the dbg.log.xz file created by

$ GST_DEBUG=*:6 gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc ! h264parse ! rtph264pay ! udpsink 2>dbg.log
$ xz -9 dbg.log

please?
And does it work if you add a videoconvert element between v4l2src and x264enc?
Created attachment 372693 [details]
Log file for the pipeline

Created using:
GST_DEBUG=*:6 gst-launch-1.0 v4l2src ! video/x-raw,width=640,height=480 ! x264enc ! h264parse ! rtph264pay ! udpsink 2>dbg.log
(In reply to Tim-Philipp Müller from comment #2)
> And does it work if you add a videoconvert element between v4l2src and
> x264enc?

Yes! It does start sending UDP packets when I add a videoconvert element just before x264enc. However, I also get the message "baseline profile doesn't support 4:2:2" in my C pipeline, so there may still be an error in the pipeline.
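For reference, a sketch of the pipeline with both fixes applied: videoconvert handles the YUY2 conversion, and a capsfilter forcing I420 (a 4:2:0 format) should make the "baseline profile doesn't support 4:2:2" warning go away, since videoconvert otherwise may pick a 4:2:2 format. The udpsink host/port values are placeholders; this is untested here as it needs a real camera.

```shell
gst-launch-1.0 v4l2src \
    ! video/x-raw,width=640,height=480 \
    ! videoconvert \
    ! video/x-raw,format=I420 \
    ! x264enc \
    ! h264parse \
    ! rtph264pay \
    ! udpsink host=127.0.0.1 port=5000
```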
Probed caps according to the log (one entry per line):

video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1;
video/x-raw, format=(string)YUY2, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)15/1;
video/x-raw, format=(string)YUY2, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 15/1, 10/1 };
video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
video/x-raw, format=(string)YUY2, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
video/x-raw, format=(string)YUY2, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
video/x-raw, format=(string)YUY2, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)848, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)640, height=(int)360, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)424, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 };
image/jpeg, width=(int)320, height=(int)180, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 30/1, 15/1 }

Note: all entries are format=YUY2 (or image/jpeg). According to gst-inspect-1.0 x264enc, x264enc accepts the following:

format: { (string)Y444, (string)Y42B, (string)I420, (string)YV12, (string)NV12, (string)Y444_10LE, (string)I422_10LE, (string)I420_10LE }

which, as you can see, does not include YUY2. This means that a videoconvert element will be needed here. There could be various reasons why it worked before, but the most likely one is that v4l2src used to be built against libv4l2 (which did conversions/decoding internally in some cases), but no longer is. So currently this doesn't look like a bug.
Is there any way I can restore the older functionality in Bionic, then, by recompiling v4l2src with libv4l2? Setting the environment variable GST_V4L2_USE_LIBV4L2=1 doesn't fix it either, and the pipeline crashes when encoding 1280x720/30fps with videoconvert. This used to work before.
Crashes how? Do you have a stack trace?
I'm not able to reproduce it at the moment, but it crashes when changing the capsfilter from 480p to 720p in my adaptive-streaming C++ code. Is there any way I can add libv4l2 back without breaking other dependencies?
You can probably recompile the package against libv4l2. Or ask distro packagers if they changed it on purpose.
(In reply to Tim-Philipp Müller from comment #9)
> You can probably recompile the package against libv4l2. Or ask distro
> packagers if they changed it on purpose.

The v4l2src GStreamer page says they stopped building against libv4l2 after 1.14. Also, I managed to create a log for the above issue on changing the capsfilter. It was too big to attach here, so I uploaded it to Dropbox: https://www.dropbox.com/s/xshuy4n5fwdpami/adapt.log.xz?dl=0
To help others with similar issues, I would like to conclude this by saying that I used a Docker image of Ubuntu 16.04 to restore the older versions of GStreamer without affecting my current installation.
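In case it helps, a rough sketch of that container approach. The package names are assumed to be the standard Ubuntu 16.04 GStreamer packages (not verified here), and --device passes the webcam through to the container:

```shell
# Start a 16.04 container with access to the webcam
docker run --rm -it --device /dev/video0 ubuntu:16.04 bash

# Inside the container: install the older GStreamer (1.8.x) stack.
# x264enc lives in -ugly, h264parse in -bad, v4l2src/rtph264pay in -good.
apt-get update
apt-get install -y gstreamer1.0-tools gstreamer1.0-plugins-good \
    gstreamer1.0-plugins-bad gstreamer1.0-plugins-ugly
```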
I think your problem is that the new caps you are setting are no longer supported by v4l2src-without-libv4l2. Run

$ gst-device-monitor-1.0 Video/Source

to see the supported caps (or see the probed caps in the previous comment). The problem is that the USB bandwidth doesn't allow raw video at higher resolutions and framerates, so you will have to request JPEG and decode it (you may just want to do that unconditionally, actually).
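Based on the probed caps above, at 720p only the image/jpeg caps offer 30 fps, so a pipeline along these lines should work (untested here; needs the camera, and the udpsink host/port are placeholders):

```shell
# Request JPEG from the camera (raw YUY2 at 1280x720 only reaches
# 10 fps over USB), decode it, then convert before re-encoding.
gst-launch-1.0 v4l2src \
    ! image/jpeg,width=1280,height=720,framerate=30/1 \
    ! jpegdec \
    ! videoconvert \
    ! x264enc \
    ! h264parse \
    ! rtph264pay \
    ! udpsink host=127.0.0.1 port=5000
```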
Yes, this isn't a bug; the GStreamer video converter does a better job than the libv4l one. libv4l2 has several bugs that prevented progress, so we decided to run without it for the time being. Unless disabled by the distro, you can re-enable libv4l2 using the GST_V4L2_USE_LIBV4L2=1 environment variable. But I strongly encourage you to add a video converter in your pipelines; implicit colour conversions are unlikely to come back, to be honest.
Nicolas: IMHO the main problem from a usability point of view is that libv4l used to do JPEG-to-raw conversion under the hood, and now that this is no longer done it basically breaks all decent-resolution pipelines and forces users to deal with JPEG decoding in their pipelines. A bit suboptimal, and I have to admit I didn't think of that when we decided to drop libv4l by default.
True, though applications should not rely on it IMHO: it's slow, tends to crash, and does not use hardware acceleration. I've started looking into adding proper JPEG support in Cheese (the only OSS app I know of using this), but didn't finish. PipeWire isn't using it either, and won't be decoding. To me this pain is somewhat necessary to move forward. In parallel, it seems there is now some possible engagement to revamp and repair libv4l2, so maybe this is just temporary.