Bug 725374 - Uvch264src fails with multiple cameras
Status: RESOLVED INCOMPLETE
Product: GStreamer
Classification: Platform
Component: gst-plugins-bad
Version: 1.2.3
OS: Other Linux
Priority: Normal  Severity: normal
Target Milestone: git master
Assigned To: GStreamer Maintainers
Reported: 2014-02-28 06:37 UTC by Mark Scudder
Modified: 2018-05-07 15:43 UTC
Description Mark Scudder 2014-02-28 06:37:08 UTC
I have GStreamer 1.2.3 built and working on a Linux Mint 16 system following Alex Csete's instructions on his wiki (so it's built in my home directory, ~/gst/runtime).

I have three Logitech C920s I would like to record from. Since the camera has a built-in H.264 encoder, my Linux box has enough power to record multiple cameras at once: it isn't doing three realtime H.264 encodes on the CPU, just writing 3 Mbit/s streams to disk. I did some experimenting with v4l2src and verified I could access and write the H.264 stream from all three C920s at once, but some issues with v4l2src led me to want to try uvch264src instead.

Following the examples on Alex's blog, I open a terminal window, run the set_env.sh script, and issue the following command (based on his example but using a filesink instead of videosink, to save the video):

~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src auto-start=true src.vfsrc ! queue ! video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 ! xvimagesink sync=false src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 ! filesink location=/home/mark/test1.mp4

This works. A small preview window is displayed on the screen, and the H.264 stream is recorded to test1.mp4, though it has to be run through ffmpeg -i test1.mp4 -vcodec copy fixed.mp4 before it's usable in non-GStreamer-based players. (I don't mind that, if there's no way to properly mux it into a compatible MP4 file in the pipeline.)

I then open another terminal window, run the set_env.sh script, and change the command to connect to the next camera, and write a different file:

~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video1 name=src auto-start=true src.vfsrc ! queue ! video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 ! xvimagesink sync=false src.vidsrc ! queue ! video/x-h264,width=1280,height=720,framerate=30/1 ! filesink location=/home/mark/test2.mp4

And it doesn't work. GStreamer displays the following in the terminal window:

Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter2: caps = video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstUvcH264Src:src: ready-for-capture = false
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: num-buffers = -1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: device = /dev/video1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter3: caps = video/x-h264, width=(int)1280, height=(int)720, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter4: caps = video/x-raw, format=(string)YUY2, width=(int)320, height=(int)240, framerate=(fraction)10/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0: num-clock-samples = 0
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0: device-fd = 12
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5: caps = image/jpeg, width=(int)320, height=(int)240, framerate=(fraction)30/1
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0.GstPad:src: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5.GstPad:src: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:jpeg: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstUvcH264MjpgDemux:uvch264mjpgdemux0.GstPad:sink: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstUvcH264Src:src/GstCapsFilter:capsfilter5.GstPad:sink: caps = image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
ERROR: from element /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstUvcH264Src:src/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
EOS on shutdown enabled -- waiting for EOS after Error
Waiting for EOS…

It then does nothing until I Ctrl-C it.

If the first pipeline is not running, this pipeline runs just fine, but they won't run in parallel. I don't feel GStreamer is being clear about what or where the problem is, but I understand that since it's in development I have to dig and experiment. I also don't see a reason it wouldn't work: Linux seems to enumerate the cameras as separate devices, and any binary should be able to run multiple times on the same box if it's opening different devices. I just don't know where to start. Can someone help me get these working in parallel? As I said, even though C920 support isn't perfect in v4l2src, I was able to record three cameras in parallel with it.
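In case it helps with diagnosis, here is how I'd expect to double-check that the cameras really enumerate independently (a sketch, assuming the v4l-utils package is installed; the device nodes are from my setup):

```shell
# List all V4L2 devices, showing which /dev/video* node belongs to which camera
v4l2-ctl --list-devices

# Query the formats each camera claims to support, one device at a time
v4l2-ctl -d /dev/video0 --list-formats-ext
v4l2-ctl -d /dev/video1 --list-formats-ext
```

Both cameras do show up separately here, which is why I expected two independent pipelines to work.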
Comment 1 Nicolas Dufresne (ndufresne) 2014-02-28 14:09:32 UTC
(In reply to comment #0)
> ~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video1 name=src
> auto-start=true src.vfsrc ! queue !
> video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 !
> xvimagesink sync=false src.vidsrc ! queue !
> video/x-h264,width=1280,height=720,framerate=30/1 ! filesink
> location=/home/mark/test2.mp4

You forgot to mux the H.264 stream; you need to add a muxer, e.g. mp4mux, to your pipeline.
Comment 2 Mark Scudder 2014-02-28 15:06:13 UTC
I changed the pipelines, respectively, to:

~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src
auto-start=true src.vfsrc ! queue !
video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 !
xvimagesink sync=false src.vidsrc ! queue !
video/x-h264,width=1280,height=720,framerate=30/1 ! mp4mux ! filesink
location=/home/mark/test1.mp4

and 

~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video1 name=src
auto-start=true src.vfsrc ! queue !
video/x-raw,format=\(string\)YUY2,width=320,height=240,framerate=10/1 !
xvimagesink sync=false src.vidsrc ! queue !
video/x-h264,width=1280,height=720,framerate=30/1 ! mp4mux ! filesink
location=/home/mark/test2.mp4

Now, both viewfinders appear in xvimagesink windows, but they only show the first frame and do not update. The test files are 0 bytes when the pipelines are killed.

I then added "h264parse" before "mp4mux" in each pipeline, and it went back to behaving exactly as it did without h264parse and mp4mux in the pipeline: the first one to be run works, the second one fails.
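For reference, the first-camera variant with the parser looks like this (a sketch of my command, reflowed for readability; the second camera's is identical except for device and location):

```shell
# uvch264src exposes two pads: vfsrc (raw viewfinder) and vidsrc (H.264).
# h264parse prepares the stream for mp4mux, which writes a proper MP4.
~/gst/runtime/bin/gst-launch-1.0 -v -e uvch264src device=/dev/video0 name=src \
    auto-start=true \
    src.vfsrc ! queue \
        ! video/x-raw,format=YUY2,width=320,height=240,framerate=10/1 \
        ! xvimagesink sync=false \
    src.vidsrc ! queue \
        ! video/x-h264,width=1280,height=720,framerate=30/1 \
        ! h264parse ! mp4mux ! filesink location=/home/mark/test1.mp4
```
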

Please set the status to "unresolved" again.
Comment 3 Jan Schmidt 2017-05-21 13:26:19 UTC
More than 3 years late - but did this problem ever get resolved? I don't have 2 cameras to test against, so it's hard to do any checking. It seems like this must have (somehow) been a kernel problem? The kernel is the only place where the 2 separate GStreamer processes should interact.
Comment 4 Sebastian Dröge (slomo) 2018-05-07 15:43:14 UTC
Closing this bug report as no further information has been provided. Please feel free to reopen this bug report if you can provide the information that was asked for in a previous comment.
Thanks!