GNOME Bugzilla – Bug 564957
Cheese is not able to do live theora encoding
Last modified: 2012-01-23 20:07:27 UTC
Please describe the problem: I'm using the camera built into the MSI Wind U100 laptop. lsusb identifies my camera as:

Bus 001 Device 002: ID 5986:0203 Acer, Inc

This is a BisonCam, NB Pro according to my research. When I open Cheese, I see myself in the preview window. I can even take pictures. Life is good. If I click on the Video button, I can still see myself in the preview window. When I click on the Start Recording button the preview window turns black. I assume it's recording, but I'm not sure. I let it record for about 10 seconds and click Stop Recording. There is a delay of a few seconds where it does not appear to be doing anything, but then one still frame of myself does appear in the preview window after I've clicked Stop Recording. The entire Cheese window then becomes greyed out, as if the program has stopped responding. I was tempted to kill the process, but after waiting longer the window went back to normal and I was visible in the preview window again. There is a video recording shown in the bottom part of the window that holds pictures, etc. Double-clicking on this file opens Totem, which reports "An error occurred: Stream contains no data." Examining the file shows that it is zero bytes in size.

Steps to reproduce:
1. Open Cheese
2. Click on Video
3. Click Start Recording
4. Record for a few seconds
5. Click Stop Recording

Actual results: A zero-byte video file is produced

Expected results: A working video file is produced

Does this happen every time? Yes

Other information:
Hi Jason, thank you for reporting this issue. I have a similar one with my Dell Mini 9 netbook. I really believe it's just that the Atom N270 cannot do live Ogg encoding at too high a resolution; it's just too much for an underpowered processor like this. I can record videos at low resolutions like 160x120 and 320x240, but it's nearly impossible to get a non-empty video at 640x480. As far as I can tell your webcam supports 1280x1024 too, so it's even harder for the processor to cope with such a load. Could you please try to record at lower resolutions to confirm it? Anyway, Cheese probably shouldn't fail this way; at the very least it should be able to record at a lower resolution than the displayed one, or schedule encoding for idle time instead of doing it live.
Thank you. I changed the resolution to 160x120 (the smallest I can select) and then opened System Monitor to watch the CPU load while recording; it varied from 60% to 80%. After I clicked on the Stop Recording button the same behavior continued and the video file was zero bytes long. It appears that reducing the resolution had no impact.
Created attachment 130671 [details] Output of cheese -v <metoo/> This is with a Logitech E3100 (it's UVC, and I see a number of reports of dodgy UVC drivers on this bugzilla) on Fedora 10. I jumped the capture res. down to minimum, but it makes no difference: the video's fine while monitoring (and photos are also fine), but trying to record video just gives a black window and empty video file. No crashes, though. It has worked, but I think that was on Fedora 8 before I upgraded; don't remember the versions of anything from then. Recording was between the 'v4l2src name=video_source ...' and 'Cannot cleanly shutdown ...' lines, so no actual debug output whilst trying to record.
*** Bug 578469 has been marked as a duplicate of this bug. ***
(In reply to comment #3)
> Created an attachment (id=130671) [edit]
> Output of cheese -v
>
> <metoo/>
> This is with a Logitech E3100 (it's UVC, and I see a number of reports of dodgy
> UVC drivers on this bugzilla) on Fedora 10.

Just to give due credit to the uvcvideo developers: there was a "highly popular" bug with the uvcvideo driver and an old version of libv4l. It was found pretty quickly by me and fixed in a couple of days, working in conjunction with the uvcvideo and libv4l developers. Then Ubuntu took something like 4-5 months to actually make the fix available through their repository, so we've had tons of duplicates from Ubuntu users. So if something is dodgy here, it's Ubuntu's bug fixing (well, Ubuntu's "committing upstream fixes").

> I jumped the capture res. down to minimum, but it makes no difference: the
> video's fine while monitoring (and photos are also fine), but trying to record
> video just gives a black window and empty video file. No crashes, though.

Are you sure this is the same bug? What the reporters above have in common is an underpowered processor that seems to have issues with live Theora encoding.
*** Bug 581243 has been marked as a duplicate of this bug. ***
If it may help, cheese -v returned the following:

(cheese:6489): Gtk-CRITICAL **: gtk_list_store_set_valist: assertion `VALID_ITER (iter, list_store)' failed

System: ASUS Eee PC 701
Release: 8.10 (intrepid)
Kernel: Linux 2.6.27-8-eeepc
GNOME: 2.24.1
Cheese: 2.24.2
*** Bug 591942 has been marked as a duplicate of this bug. ***
Hello Filippo, I assumed that my Atom N270 processor was too wimpy to handle video in the Cheese application (laptop: HP 110 Mini netbook running Ubuntu 9.10rc with an Atom N270 processor, 1 GB of RAM and a built-in USB webcam). Then I decided to reboot into Windows XP and try to record webcam video with the bundled Windows Movie Maker. It worked like a charm: smooth video and sound. I will say that it looked as if Movie Maker was buffering the signal first and then doing the WMV conversion, whereas Cheese looked like it was trying to do simultaneous capture and Ogg encoding. Maybe that's the resource issue. I am not too concerned about this bug, since as a netbook user I don't really intend to use my USB camera for video. The capability to filter still photos was a lot of fun for making avatars and just mailing silly pictures to my kids. However, I *am* slightly offended that a Windows XP application has a lighter resource footprint than a Linux application. Maybe this is enough to motivate you and the video codec guys?
It seems that perhaps Cheese should be able to buffer the data and encode it later, instead of doing it live. Perhaps a preference item or maybe it can dynamically choose to do that based on available system resources.
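To make the buffer-then-encode idea concrete, here is a rough sketch of how a two-stage flow could be expressed as gst-launch-style pipeline strings. This is not Cheese code: the element names are standard GStreamer 0.10 ones, while the helper functions and file paths are hypothetical, and a real implementation would also need to preserve the raw caps alongside the dump.

```python
# Hypothetical sketch of a two-stage "record raw now, encode on idle" flow.
# Element names are standard GStreamer 0.10 ones; functions and file
# names are made up for illustration.

def capture_pipeline(raw_path, width=640, height=480):
    """Stage 1: dump raw frames to disk; almost no CPU cost beyond I/O."""
    return (
        "v4l2src ! "
        f"video/x-raw-yuv,width={width},height={height} ! "
        f"filesink location={raw_path}"
    )

def encode_pipeline(raw_path, ogg_path, width=640, height=480, fps=15):
    """Stage 2: run later (e.g. on idle) to compress the raw dump."""
    return (
        f"filesrc location={raw_path} ! "
        f"video/x-raw-yuv,width={width},height={height},framerate={fps}/1 ! "
        "ffmpegcolorspace ! theoraenc ! oggmux ! "
        f"filesink location={ogg_path}"
    )

print(capture_pipeline("/tmp/capture.yuv"))
print(encode_pipeline("/tmp/capture.yuv", "/tmp/video.ogg"))
```

The trade-off, as noted later in this thread, is disk space: raw capture needs far more of it than live encoding.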
(In reply to comment #9)
> Hello Filippo,
>
> I assumed that my Atom N270 processor was too wimply to handle video using the
> Cheese appliction (laptop: HP 110 Mini netbook running Ubuntu 9.10rc with Atom
> N270 processor and 1GB of RAM and built-in USB webcam). Then I decided to
> reboot into WindowsXP and try to record webcam video with the bundled Wndows
> Movie Maker. It worked like a charm -- smooth video and sound. I will say that
> it looked as if Movie Maker was buffering the signal first, and then doing the
> wmv converson; whereas Cheese looked like it was trying to do simultaneous
> capture and ogg encoding. Maybe that's the resource issue.

I don't know anything about the Windows solution. I can only guess that it doesn't do *live* encoding or, if it does, it could well be that it has a better QoS system that drops frames when the CPU cannot keep up with the load. Anyway, you're comparing two completely different cases: different OS, different encoder, different application purpose. I'm pretty sure netbooks can easily perform live raw recording and encode everything later, if you give them the time they need and enough disk space. We could probably implement something like that in the future, but I don't see it happening soon given our current lack of manpower.
I sympathize with your lack of manpower. I think Cheese is a very nice piece of work. My comments are more about the Gnu/Linux model than a direct criticism of Cheese. ***begin quote: Anyway you're comparing two completely different cases, different os, different encoder, different application purpose. ***end quote I disagree that these are different application purposes. I approached this as a naive desktop user (even though I'm not). I tried to use the out-of-the-box Ubuntu application for my webcam. It didn't work with video. Then I used the out-of-the-box Windows application for my webcam (which worked). Same application purpose, different results. Perhaps this topic belongs on Ubuntu's bug tracker (Launchpad). They are promoting Ubuntu as a Windows desktop replacement. Whether that's true really depends on your expectations. Ubuntu (and gnu/Linux in general) is a great desktop for an old guy like me who is comfortable with the command line and just needs OpenOffice, Firefox, and e-mail; but I know that the multi-media and social-media applications (such as capturing and posting webcam videos on YouTube) are much more important to capturing the attention of middle-school and high-school users.
There is an issue with KMS and Xv on some Intel cards under Ubuntu 9.10. It causes cheese to not display any video from the webcam and also to record nothing. Could this be the cause of your problems? http://www.ubuntu.com/getubuntu/releasenotes/910#No%20Xv%20support%20for%20Intel%2082852/855GM%20video%20chips%20with%20KMS
I don't think this is the issue. MoviePlayer (i.e. totem) plays videos well. Quicktime and Adobe play videos too. Cheese will record and play a "video," but the frame-rate is maybe 1 or 2 fps. So clearly it's a resource issue in how Cheese or the ogg codec behave when capturing webcam video, not an issue in playing videos through the chipset. Thanks for directing me to the link. It was an interesting read. I imagine the issue described with Xv will be a post-release fix.
Created attachment 154990 [details] [review] cheese-warn-on-video-encode-overrun.patch We have been getting quite a few complaints about this issue in SUSE. Here is the start of my implementation to fix it against 2.28.1. If you like it I will forward port it to HEAD. The next patch I will write up will give a GUI option to export to AVI and then have Cheese work after the recording stops to convert to ogg.
Created attachment 154993 [details] gstreamer-ogg-vs-avi.sh Is there a huge win for using Theora vs mjpeg? Just curious since I was starting to set a baseline using this script and it doesn't look like it makes a huge difference. For a 60 second video I am seeing: 75M video.avi 73M video.ogg Maybe I am missing something.
(In reply to comment #16)
> Created an attachment (id=154993) [details]
> gstreamer-ogg-vs-avi.sh
>
> Is there a huge win for using Theora vs mjpeg? Just curious since I was
> starting to set a baseline using this script and it doesn't look like it makes
> a huge difference. For a 60 second video I am seeing:
>
> 75M video.avi
> 73M video.ogg
>
> Maybe I am missing something.

I think we don't only care about file size; mostly we care about the "quality" (smooth or not) of the captured video and the resource usage (i.e. %CPU) during capture.
(In reply to comment #17)
> I think we don't only care about file size, but mostly we care about the
> "quality" (smooth or not) of captured video and resource occupation(i.e. %CPU)
> during capturing.

My point is that the AVI is higher quality (no compression), less CPU intensive, and only slightly larger on disk. So why wasn't it used?
Thanks Brandon for working on this. It would be great to have your underrun warning info bar included in Cheese. It's a bit late for 2.29 unfortunately, but we could ask for a freeze break, or add it in a branch and tell distributors to include the patch if they want. So would you please port it to current git master?

About offering different encoders, the best thing would be to use GStreamer encodebin and encoding profiles so that users can choose for themselves what to encode to. By the way, Zhao Halley from Intel is also looking at this; maybe you want to work together, or at least share your results? Please take a look at the following thread on cheese-list: http://mail.gnome.org/archives/cheese-list/2010-January/msg00000.html (it spans into the February archives too)

(In reply to comment #18)
> My point is that the avi is higher quality (no compression), less CPU intensive
> and only slightly larger on disk. So, why wasn't it used.

The reason was basically "Theora is free". AVI, by the way, doesn't mean anything: it's just a container, like Ogg. The pipeline you posted in your script encodes in JPEG (so there is compression; by default it encodes with quality 85) + raw audio (uncompressed). Here (Core 2 Duo 1.6 GHz), with the jpegenc one the CPU load goes to 105% while the theora+vorbis one stays under 50%. The AVI file is twice as big as the Ogg one.
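As a rough illustration of the "let the user choose the encoder" idea, the format choices discussed in this thread could be tabulated like this. Note this is just a plain lookup table for illustration, not the GStreamer encodebin/encoding-profile API; the element names are the GStreamer 0.10 ones mentioned in the thread, and the helper function is hypothetical.

```python
# Hypothetical table of user-selectable recording formats, mapping each
# format to a (video encoder, muxer, file extension) triple. Element
# names are GStreamer 0.10 ones discussed in this bug; this is not the
# real encodebin/encoding-profiles mechanism.

PROFILES = {
    "ogg":   ("theoraenc", "oggmux",  ".ogg"),
    "mjpeg": ("jpegenc",   "avimux",  ".avi"),
    "webm":  ("vp8enc",    "webmmux", ".webm"),
}

def recording_bin(name):
    """Build a simplified video-branch pipeline string for a profile."""
    venc, mux, ext = PROFILES[name]
    return f"{venc} ! {mux} ! filesink location=video{ext}"

print(recording_bin("ogg"))
```

The real benefit of encoding profiles is that new formats can be added without touching application code; the table above only mimics that at the pipeline-string level.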
Does gstreamer have some "QoS" or congestion-control mechanism? That is to say, to sense and pass congestion information from back (sink) to front (source) across the pipeline, then middle elements have a chance to dynamically control the quality, e.g. dropping frames, lowering rate or reducing resolution, etc.
I have 2.29.91 checked out from git to try and do the port but I am hitting errors. Any suggestions?

(cheese:6152): Gtk-WARNING **: Attempting to add a widget with type GtkVBox to a container of type CheeseWindow, but the widget is already inside a container of type GtkWindow, the GTK+ FAQ at http://library.gnome.org/devel/gtk-faq/stable/ explains how to reparent a widget.
(cheese:6152): Gtk-CRITICAL **: gtk_notebook_append_page: assertion `GTK_IS_WIDGET (child)' failed
(cheese:6152): Gtk-WARNING **: SendTo: missing action SendTo
(cheese:6152): Gtk-WARNING **: SendByMail: missing action SendByMail
(cheese:6152): Gtk-WARNING **: SetAsAccountPhoto: missing action SetAsAccountPhoto
(cheese:6152): Gtk-WARNING **: ExportToFSpot: missing action ExportToFSpot
(cheese:6152): Gtk-WARNING **: ExportToFlickr: missing action ExportToFlickr
(cheese:6152): Gtk-CRITICAL **: gtk_container_add: assertion `GTK_IS_CONTAINER (container)' failed
(cheese:6152): Gtk-CRITICAL **: gtk_container_add: assertion `GTK_IS_CONTAINER (container)' failed
(cheese:6152): Gtk-CRITICAL **: gtk_widget_show_all: assertion `GTK_IS_WIDGET (widget)' failed
(cheese:6152): Gtk-CRITICAL **: gtk_widget_realize: assertion `GTK_WIDGET_ANCHORED (widget) || GTK_IS_INVISIBLE (widget)' failed
(cheese:6152): Gtk-CRITICAL **: gtk_window_resize: assertion `width > 0' failed
(cheese:6152): Gtk-CRITICAL **: gtk_window_resize: assertion `width > 0' failed
(In reply to comment #20) > Does gstreamer have some "QoS" or congestion-control mechanism? That is to say, > to sense and pass congestion information from back (sink) to front (source) > across the pipeline, then middle elements have a chance to dynamically control > the quality, e.g. dropping frames, lowering rate or reducing resolution, etc. Yes, the gstreamer queue has the ability to signal when it has buffers that are x number of nanoseconds old. I use the feature in my patch in Comment #15. See the max-size-* properties: http://gstreamer.freedesktop.org/data/doc/gstreamer/0.10.1/gstreamer-plugins/html/gstreamer-plugins-queue.html The other solution is frame dropping via the leaky property but that seems too hacky to me. HTH
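The bounded-queue idea above can be paraphrased in a standalone sketch: a queue with a max-size-time limit sits between the camera source and the encoder, and when buffered data reaches that limit the encoder is falling behind, which is the moment to warn the user. This is not the actual C patch; the threshold and the 30-in/5-out rates are illustrative numbers.

```python
# Standalone paraphrase of the queue-overrun detection idea: if the
# encoder drains frames slower than the camera produces them, the
# bounded queue eventually fills to its time limit. Illustrative
# values only; not the actual Cheese/GStreamer C code.

ONE_SECOND_NS = 1_000_000_000
MAX_QUEUE_TIME_NS = 10 * ONE_SECOND_NS  # like queue's max-size-time property

def queue_overrun(buffered_ns, limit_ns=MAX_QUEUE_TIME_NS):
    """True when buffered data reaches the configured limit,
    i.e. the encoder is falling behind the capture rate."""
    return buffered_ns >= limit_ns

# A camera delivering 30 fps while the encoder drains only 5 fps
# accumulates 25 frames' worth of backlog per second of wall time:
backlog_ns = 0
per_frame_ns = ONE_SECOND_NS // 30
for second in range(20):
    backlog_ns += 25 * per_frame_ns  # 30 frames in, 5 frames out
    if queue_overrun(backlog_ns):
        print(f"overrun after ~{second + 1}s; warn the user")
        break
```

The leaky-queue alternative mentioned above would instead silently drop the oldest buffers once the limit is hit, which avoids the stall but hides the problem from the user.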
(In reply to comment #20) > Does gstreamer have some "QoS" or congestion-control mechanism? That is to say, > to sense and pass congestion information from back (sink) to front (source) > across the pipeline, then middle elements have a chance to dynamically control > the quality, e.g. dropping frames, lowering rate or reducing resolution, etc. I think there is some infrastructure for it but encoders don't support it yet. Look at this thread in gst-devel[1], particularly David Schleef message[2]. 1. http://marc.info/?t=124687098700003&r=1&w=2 2. http://marc.info/?l=gstreamer-devel&m=124692152420180&w=2 (In reply to comment #21) > I have 2.29.91 checked out from git to try and do the port but I am hitting > errors. Any suggestions? We refactored quite some bit of cheese-window.c in this release cycle. You are probably using cheese with the old cheese.ui and cheese-ui.xml. A make install should fix it, or just copying those files in $PREFIX/share/cheese/data should be enough.
Created attachment 155608 [details] [review] cheese-warn-on-video-encode-overrun.patch

This is the initial proposed patch to add the functionality of Attachment #154990 [details] to Cheese 2.29.91. BUT! I am having problems with the overrun callback not working... printing out the address of camera_signals[OVERRUN] from the class_init and then the callback, I get two different addresses, which is bad :)

** (cheese:13073): WARNING **: camera_signals[OVERRUN]@0x806ea1c = 226
** (cheese:13073): WARNING **: camera_signals[OVERRUN]@0xb77165e0 = 0

But the ASM looks right, which is further confusing me.

From cheese_camera_class_init:

camera_signals[OVERRUN] = g_signal_new ("overrun", G_OBJECT_CLASS_TYPE (klass),
 1a7: a3 10 00 00 00        mov %eax,0x10
      1a8: R_386_32 .bss

From the callback:

/home/philips/trees/cheese/libcheese/cheese-camera.c:295
g_signal_emit (camera, camera_signals[OVERRUN], 0);
 449: a1 10 00 00 00        mov 0x10,%eax
      44a: R_386_32 .bss
 44e: 89 1c 24              mov %ebx,(%esp)
 451: c7 44 24 08 00 00 00  movl $0x0,0x8(%esp)
 458: 00
 459: 89 44 24 04           mov %eax,0x4(%esp)
 45d: e8 fc ff ff ff        call 45e <cheese_queue_overrun_cb+0x4e>
      45e: R_386_PC32 g_signal_emit

Perhaps the linker is messing something up? Very confused here. Ideas?

gcc (SUSE Linux) 4.5.0 20100302 (experimental) [trunk revision 157166]
Could someone test my patch on something other than openSUSE to figure this out? It should apply to 2.29.91.
(In reply to comment #25)
> Could someone test my patch on something other than openSUSE to figure this
> out? It should apply to 2.29.91.

Hi Brandon, thanks for the patch. I don't have time to properly review it now, but it looks nice and I'm really looking forward to including it in Cheese. As a side note, I'd like to use the info bar in the future as some kind of event notification system (e.g. to display non-fatal errors like this underrun one, or generic information like "new device attached, use it? yes, no, don't bug me again"). Want to work on this?

Regarding the signal address thing, I guess it has something to do with the way we link cheese, the libcheese-common convenience library and cheese-gtk. In particular, it's missing some symbol: if I remove the -export-symbols line from libcheese/Makefile.am (i.e. export everything), it works with no issue. At the moment I don't have any clue, but I'll try to find out what's happening. Do you have any idea?
(In reply to comment #26)
> At the moment I don't have any clue, but I'll try to find what's happening, do
> you have any idea?

Little update. The real question is: why doesn't this happen for the other signals? The answer is simple: you're emitting the camera overrun signal from the queue overrun callback, i.e. you're emitting the signal from another thread, and apparently camera_signals[] behaves like a local variable of the camera thread. I guess removing "static" from the camera_signals declaration should fix this. On the other hand, I'm still not sure I clearly understand what's happening.
Created attachment 160945 [details] [review] cheese-warn-on-video-encode-overrun-v2.patch Version 2 --------- - Move signals field into CheeseCameraPrivate to avoid having a file global variable in a library.
Filippo- I think this is a reasonable solution to the problem and it works pretty well. The only TODO is to make sure that the video file doesn't get created in this case. Let me know what you think.
*** Bug 619409 has been marked as a duplicate of this bug. ***
Created attachment 167391 [details] testcase for ogg, avi and webm

I just updated your script to encode to WebM too. However, I must say that at least on my computer the AVI encoding has the best performance. The GStreamer implementation (0.10.23) is still very glitchy and is not able to produce a valid video file.
*** Bug 621598 has been marked as a duplicate of this bug. ***
*** Bug 587726 has been marked as a duplicate of this bug. ***
*** Bug 584075 has been marked as a duplicate of this bug. ***
*** Bug 585761 has been marked as a duplicate of this bug. ***
*** Bug 625188 has been marked as a duplicate of this bug. ***
*** Bug 613661 has been marked as a duplicate of this bug. ***
*** Bug 623115 has been marked as a duplicate of this bug. ***
*** Bug 584429 has been marked as a duplicate of this bug. ***
*** Bug 619406 has been marked as a duplicate of this bug. ***
*** Bug 608864 has been marked as a duplicate of this bug. ***
I have both a Microsoft LifeCam Cinema and an ATI HDTV Wonder in my system. As a result, which is video0 and which is video1 is random. A side effect of this led to the discovery that Cheese will record video (but not sound) if System->Preferences->Sound has the ATI HDTV Wonder selected for audio input, and will hang if the Microsoft LifeCam Cinema is selected for audio input when I attempt to record video. Both sound and video work (when properly selected) in Skype. Perhaps this bug is related to sound, since still photos always work and they don't attempt to record sound. BTW this is on an i7 920 system, so I don't think lack of CPU can be blamed.
Douglas, please open another bug; as you said, yours has nothing to do with an underpowered CPU. The issue you're pointing out seems quite odd, as the audio and video devices (and the audio and video recording pipelines) are completely separate. You can record from any microphone + webcam combination and shouldn't have any issue; the two streams are multiplexed just before the video file is saved.
Hello all, dgsiegel pointed me to this bug. I have the same sort of problem, and I see that "underpowered CPU" is only a question of definition. I have different PCs at home, all with different CPUs, but even a workstation with an Intel Core 2 Duo E8400 @ 3.00 GHz can have problems encoding video in real time. I see the problem with the default settings, and it is not easy to solve. For example: my webcam provides a resolution of 1600x1200. If I take a photo it's fine, but encoding video does not go so smoothly. This CPU can handle simple encoding at 960x720, but can't do it with some effects enabled; only at 320x240 does it work OK with effects. I see it is not really easy to make the right choice of which resolution to use. What do you think about this? Is it possible to run some benchmark on first start and write the result to the config? Or use a CPU profile based on speed and instruction set? Any other ideas? It makes sense to separate the photo and video resolutions. I will dig into the code now and see what else I can find. Regards, Alexey
Alexey, there is no good way to guess CPU capabilities programmatically. The only sane solution to this bug is the one Brandon proposed:
- add support for video recording settings (resolution, framerate, quality, encoding presets?)
- check if the encoder is too slow by looking at the underrun signal
- warn the user about possible issues and ask them to change settings until recording works again

It probably still sucks to rely on user action, but I think it's the only viable way. So, given that the plan is settled and patches are already there, this bug is just waiting for someone (either me sooner or later, or anyone who steps in) to port them to the current codebase and implement some good user experience around it.
Before reducing the resolution, I tried to find other ways to get the best result at a big resolution. With:

perf record cheese

I see libtheora uses the most power, no wonder :D Here are the results of the first tests.

Command used:

time gst-launch-0.10 filesrc location=foo.avi ! decodebin ! ffmpegcolorspace ! theoraenc speed-level=1 ! oggmux ! filesink location=foo.ogg

...! theoraenc speed-level=1 !... <<<<-- default option
real 0m8.884s
user 0m8.740s
sys 0m0.150s
-rw-r--r-- 1 lex lex 4349520 2010-08-14 17:01 foo.ogg

...! theoraenc speed-level=0 !...
real 0m9.873s
user 0m9.690s
sys 0m0.080s
-rw-r--r-- 1 lex lex 4399365 2010-08-14 17:09 foo.ogg

...! theoraenc speed-level=2 !...
real 0m5.173s
user 0m5.000s
sys 0m0.130s
-rw-r--r-- 1 lex lex 4549929 2010-08-14 17:10 foo.ogg

So with "theoraenc speed-level=2" we get better speed and not a very big difference in file size.
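For context, the measurements above can be restated as throughput and size figures. The 9-second clip duration comes from the sample described in a later comment; the arithmetic below only restates the reported numbers (and for this particular clip the size delta of speed-level=2 over the default works out to about 4.6%).

```python
# Restating the theoraenc speed-level measurements above as
# realtime-throughput and file-size figures. Times are the "real"
# values and sizes the foo.ogg byte counts reported above; the clip
# is the 9-second sample described later in the thread.

CLIP_SECONDS = 9.0

results = {
    # speed-level: (encode wall time in s, output size in bytes)
    0: (9.873, 4_399_365),
    1: (8.884, 4_349_520),
    2: (5.173, 4_549_929),
}

for level, (wall, size) in sorted(results.items()):
    factor = CLIP_SECONDS / wall  # > 1.0 means faster than realtime
    print(f"speed-level={level}: {factor:.2f}x realtime, {size / 1e6:.2f} MB")

# Size penalty of speed-level=2 relative to the default speed-level=1:
penalty = results[2][1] / results[1][1] - 1
print(f"speed-level=2 files are {penalty:.1%} larger")
```

On this machine, only speed-level=2 comfortably clears the 1.0x realtime bar, which is what matters for live recording.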
@Filippo, OK. Should I post my performance proposal here or on the mailing list?
(In reply to comment #47) > @Fillipo, ok. Should i do my performance proposal here or in mailing list? Nope, feel free to add your tests results here if you think they may be valuable!
(In reply to comment #46) Would you mind re-doing the tests with the webm and mpeg encoders too?
A short note on theoraenc: Cheese uses the "keyframe-force=1" option, and even this makes a difference.

...! theoraenc speed-level=2 keyframe-force=1 !...
real 0m6.290s
user 0m6.020s
sys 0m0.180s

Using the default keyframe setting gives:

...! theoraenc speed-level=2 !...
real 0m5.350s
user 0m4.920s
sys 0m0.230s
Results for WebM, using libvpx version 0.9.1-1. Same sample video as above (duration 9 seconds, 5 fps, 1600x1200).

Command line:

time gst-launch-0.10 filesrc location=foo.avi ! decodebin ! ffmpegcolorspace ! vp8enc ! webmmux ! filesink location=foo.webm

Default settings:
real 0m39.511s
user 0m39.250s
sys 0m0.170s

For real time this is really bad.

...! vp8enc speed=1 !...
real 0m12.502s
user 0m12.300s
sys 0m0.150s

...! vp8enc speed=2 !...
real 0m12.461s
user 0m12.200s
sys 0m0.140s

There is no difference between speed 2 and speed 1.

@daniel what mpeg encoder do you mean?
Sorry, I meant avi with jpegenc (see the attached shell script).
time gst-launch-0.10 filesrc location=foo.avi ! decodebin ! ffmpegcolorspace ! jpegenc ! avimux ! filesink location=foo_2.avi

real 0m3.087s
user 0m2.850s
sys 0m0.150s
-rw-r--r-- 1 lex lex 14420936 2010-08-14 18:50 foo_2.avi

jpegenc has the best speed but the worst file size, and not really good quality by default. If we raise the quality, the size grows too :/

-rw-r--r-- 1 lex lex 14M 2010-08-14 18:50 foo_2.avi <--- jpeg
-rw-r--r-- 1 lex lex 173M 2010-08-14 16:47 foo.avi <--- raw
-rw-r--r-- 1 lex lex 4,4M 2010-08-14 18:08 foo.ogg
-rw-r--r-- 1 lex lex 606K 2010-08-14 18:22 foo.webm
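The size listing above can be turned into rough compression ratios against the raw capture. The sizes are the human-readable ls -lh values reported above, so the ratios are approximate.

```python
# Rough compression ratios computed from the ls -lh sizes reported
# above (human-readable values, so ratios are approximate).

sizes_mb = {
    "raw avi":    173.0,
    "jpeg avi":    14.0,
    "theora ogg":   4.4,
    "vp8 webm":   606 / 1024,  # 606K
}

raw = sizes_mb["raw avi"]
for name, mb in sizes_mb.items():
    print(f"{name:>10}: {mb:8.2f} MB  ({raw / mb:6.1f}x smaller than raw)")
```

So MJPEG trades roughly a 12x reduction for the lowest CPU cost, while Theora and VP8 compress far harder (about 39x and nearly 300x here) at a much higher CPU price.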
Created attachment 167883 [details] [review] jpegenc patch

I changed Cheese to record with jpegenc muxed into Matroska. On my workstation (Intel Core 2 Duo E8400 @ 3.00 GHz) I am now able to record at 1600x1200 + effects. On my netbook I can use 352x288 + effects.

Here are some more perf tests:

time gst-launch-0.10 filesrc location=foo.avi ! decodebin ! ffmpegcolorspace ! jpegenc idct-method=2 quality=80 ! matroskamux ! filesink location=foo_2.avi

On the Core 2 Duo E8400:

...! jpegenc idct-method=2 quality=80 !...
real 0m2.653s
user 0m2.410s
sys 0m0.140s

...! jpegenc idct-method=1 quality=80 !...
real 0m2.824s
user 0m2.800s
sys 0m0.050s

On the Atom N280:

...! jpegenc idct-method=2 quality=80 !...
real 0m14.652s
user 0m14.365s
sys 0m0.584s

...! jpegenc idct-method=1 quality=80 !...
real 0m10.882s
user 0m10.641s
sys 0m0.612s

idct-method=2 is a bit faster on the Core 2 Duo but much slower on the Atom N280.
Last record for today: with ffenc_mjpeg I was able to record 1280x1024 on the Atom N280 netbook!!! But ffenc_mjpeg has some disadvantages; for example, it uses bitrate instead of quality, so you need to adjust it all the time.
gst-launch-0.10 filesrc location=foo.avi ! decodebin ! ffmpegcolorspace ! ffenc_mjpeg bitrate=1000000 ! matroskamux ! filesink location=foo_2.avi

On the Atom N280:
real 0m2.834s
user 0m2.476s
sys 0m0.756s

On the Core 2 Duo:
real 0m0.631s
user 0m0.540s
sys 0m0.150s
Hello, I have been busy finding a way to reduce the PulseAudio load during recording in Cheese. I have now proposed a new option for pulsesrc to force the input format: https://bugzilla.gnome.org/show_bug.cgi?id=627263
After more digging I now know the reason why Cheese failed on my netbook, and probably for others too :D

This issue has several parts:

- First problem: my built-in webcam is buggy (probably yours too). It reports 30 fps but delivers 4-5 fps. Some webcams deliver a variable framerate. I found this at the beginning, but shortly after I found that Theora encodes as many frames as it gets. I thought Theora was too heavy an encoder for Atom processors, but after comparing it with Windows tools doing video capture from the webcam I didn't find a big difference (they use WMAv2 and WMV formats). So my question was: why does Theora suck with 4 frames per second? The answer is: it actually sucks with 30 frames/s. Cheese uses the "videorate" element in the GStreamer pipeline. videorate makes sure that if the source announced 30 fps, the encoder gets 30 fps, even if the source/webcam only delivers 4 fps.

- Second problem: without videorate we have a variable framerate, so there is no fixed frames-per-second by definition. The only thing we can use are timestamps, and theoraenc can do timestamps.

- Third problem: oggmux can't do timestamps. This is the reason why we used videorate. At this point only Matroska can help us: matroskamux does support timestamps, so we do not need to generate extra frames and extra CPU load.

To make Cheese work, we need to remove "videorate" and replace "oggmux" with "matroskamux". I would also suggest using "theoraenc speed-level=2". It will give 1-2% bigger files but less CPU usage.

@daniel, @filippo: any comments?
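The proposed change can be sketched as a before/after pair of simplified gst-launch-style pipeline strings. These are illustrative only: the real Cheese pipeline also carries the audio branch, effects, and the preview tee, and the helper functions here are hypothetical.

```python
# Simplified before/after video pipelines for the proposed change.
# Illustrative gst-launch-0.10 strings only; the actual Cheese
# pipeline also includes audio, effects and a preview branch.

SOURCE = "v4l2src ! ffmpegcolorspace"

def old_pipeline():
    # videorate duplicates frames up to the advertised 30 fps, so the
    # encoder pays for frames the camera never actually produced
    return (f"{SOURCE} ! videorate ! video/x-raw-yuv,framerate=30/1 ! "
            "theoraenc ! oggmux ! filesink location=video.ogg")

def new_pipeline():
    # no videorate: encode only the frames that actually arrive and
    # let matroskamux store their timestamps (which oggmux cannot)
    return (f"{SOURCE} ! theoraenc speed-level=2 ! "
            "matroskamux ! filesink location=video.mkv")

print(old_pipeline())
print(new_pipeline())
```

With the camera delivering 4-5 fps, the old pipeline forces the encoder to process 30 fps of mostly duplicated frames; the new one encodes only what arrives, cutting the encoding load by the same factor.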
(In reply to comment #58) > After more digging i know the reason why cheesa failed on my netbook, and > probably for other too .D Cool :-) Thank you for all the work you're doing! > This issue has more part: > - first problem, my build in webcam is buggy (probably your too). It report > 30fps and do 4-5fps. Well, buggy isn't exactly the best word. The advertised rate is probably the one the device can reach with optimal light conditions. If there is little light your sensor has to stay open longer to get a well exposed frame. If longer means more than 1/30 of a second you will get a lower rate. Either it does that or gives you dark frames. I wouldn't call it a bug, it's more a physical limit. You could probably get better rates with a better sensor or faster lenses but you cannot expect too much from those cheap little cameras. > Some of them do variable framerate. I found it at the > beginning but short after it i found theoro encode as many frames as it get. I > thought theora is too hard encoder for athom processors. But after it i > caompared it with windows tools doing video capture from webcam and didn't > found big difference. They use WMAv2 and WMV formats. > So my question was: why theora sucks with 4 frames pro secunde? > > The answer is. It sucks actually with 30 frames/s. Cheese use "videorate" > module in gstreamer pipeline. viderate make sure if source announced 30fps - > encoder will get 30fps. Even source/webcam has 4fps all the time. > > - second problem, without videorate we will have variable framerate. So will > not have frames per socond by definition. Only thing we can use are timestamps. > theoraenc can do timestamps. > > - third problem. oggmux can't do timestamps. This is the reason, why we used > videorate. Now at this step only matroska can help us. matroskamux do support > timestamps. So we do not need to generate extra frames and extra cpu load. Exactly that's the reason we have a videorate there ( bug 542014 ). What about vp8+webm? 
does webm store timestamps? I guess so as the muxer is derived from the matroska one as far as I know. As far as I can tell vp8 is still slower than theora, could you do some more test about it? > To make cheese work, we need remove "videorate" and replace "oggmux" to > "matroskamux". I well suggest to use "theoraenc speed-level=2". It will give > 1-2% bigger files but less CPU usage. > > @daniel, @filippo: any comments ? I think this certainly is one of the things we should do to fix this bug. It's not the only one though as you will have all the same issues if you have good light conditions and your webcam streams at the advertised 30/1 framerate. Another one is to fix pulsesrc to avoid an extra audio format conversion (adding it as a bug dep). Also this doesn't exclude the other stuff I pointed in comment 45 which is still mandatory to properly prevent failures. About theoraenc speed-level, what's the impact on video quality? By the way, would you like to work on this? I mean, add the encoder/muxer changes you're suggesting, port Brandon patch to current codebase and implement some UI around it? I and probably Daniel too would be happy to mentor you, feel free to poke us in #cheese (GimpNET).
Why was Bug 621598 marked as a duplicate? Does this bug plan to use WebM to solve it?

If so, I am in full support. Better quality, and already much more widely supported than theora. In addition, YouTube is making all videos uploaded in WebM available in WebM in HTML5, so all cheese recordings will be viewable on YouTube using only free software and formats.
(In reply to comment #59)
> Well, buggy isn't exactly the best word. The advertised rate is probably the
> one the device can reach with optimal light conditions. If there is little
> light your sensor has to stay open longer to get a well exposed frame. If
> longer means more than 1/30 of a second you will get a lower rate. Either it
> does that or gives you dark frames. I wouldn't call it a bug, it's more a
> physical limit. You could probably get better rates with a better sensor or
> faster lenses but you cannot expect too much from those cheap little
> cameras.

You're right, I just pointed a flashlight directly at my webcam and got about 20 fps :D

> Exactly that's the reason we have a videorate there ( bug 542014 ).
> What about vp8+webm? does webm store timestamps? I guess so as the muxer is
> derived from the matroska one as far as I know. As far as I can tell vp8 is
> still slower than theora, could you do some more test about it?

I made some tests; it looks like vp8 does only a static framerate, even when muxed in matroska. I'm not so optimistic about webm anymore. It looks like it is designed for the computers we will have in about 2 years. With 320x240@15fps vp8 uses 70% of a 3GHz Core2Duo, theora only 30%. If we go to a bigger resolution we will need all the cores of an i7. What about netbooks :D

> > To make cheese work, we need remove "videorate" and replace "oggmux" to
> > "matroskamux". I well suggest to use "theoraenc speed-level=2". It will
> > give 1-2% bigger files but less CPU usage.
> >
> > @daniel, @filippo: any comments ?
>
> I think this certainly is one of the things we should do to fix this bug.
>
> It's not the only one though as you will have all the same issues if you
> have good light conditions and your webcam streams at the advertised 30/1
> framerate.

True. But most of the time we get less. 30 fps is a killer for a webcam; even DVD mostly has only 25fps. Only the CPU will see the difference between 30 and 25fps.
> Another one is to fix pulsesrc to avoid an extra audio format conversion > (adding it as a bug dep). > > Also this doesn't exclude the other stuff I pointed in comment 45 which is > still mandatory to properly prevent failures. ok > About theoraenc speed-level, what's the impact on video quality? No noticeable difference. > By the way, would you like to work on this? I mean, add the encoder/muxer > changes you're suggesting, port Brandon patch to current codebase and implement > some UI around it? I will try.
(In reply to comment #60) > Why was Bug 621598 marked as a duplicate? Does this bug plan to use WebM to > solve this bug? > > If so, i am in full support. Better quality, and already much more widely > supported than theora. In addition, YouTube is making all videos uploaded in > WebM to be made available in WebM in HTML5, so all cheese recording will be > viewable using only free software and formats on YouTube. What hardware do you use? Can you test this script and post the log you get? $(sleep 20 && pkill gst-launch)& time gst-launch-0.10 -e pulsesrc ! queue ! audioconvert ! vorbisenc ! queue ! webmmux name="muxer" v4l2src ! ffmpegcolorspace ! vp8enc ! muxer. muxer. ! filesink location=foo.webm
(In reply to comment #61) > (In reply to comment #59) > > By the way, would you like to work on this? I mean, add the encoder/muxer > > changes you're suggesting, port Brandon patch to current codebase and implement > > some UI around it? > > I will try. Actually I have a series of patches I need to send upstream that include this fix. Let me submit those today.
(In reply to comment #62) > (In reply to comment #60) > > Why was Bug 621598 marked as a duplicate? Does this bug plan to use WebM to > > solve this bug? > > > > If so, i am in full support. Better quality, and already much more widely > > supported than theora. In addition, YouTube is making all videos uploaded in > > WebM to be made available in WebM in HTML5, so all cheese recording will be > > viewable using only free software and formats on YouTube. > > What hardware do you use? Can you test this script and post the log you get? > > $(sleep 20 && pkill gst-launch)& time gst-launch-0.10 -e pulsesrc ! queue ! > audioconvert ! vorbisenc ! queue ! webmmux name="muxer" v4l2src ! > ffmpegcolorspace ! vp8enc ! muxer. muxer. ! filesink location=foo.webm Was this reply meant for me? Not sure what it has to do with using WebM
(In reply to comment #64) > Was this reply meant for me? Not sure what it has to do with using WebM Yes to you. To let you test if you hardware able to encode realtime webm.
I asked on the ogg-dev list about variable frame rate and the ogg muxer. Here is the answer:

=============================================================
Alexey Fisher wrote:
> My question is: should ogg actually support timestamps, and also variable
> framerate? If yes: it seems to be broken somewhere.

The Ogg mapping for Theora is fixed-framerate. You can hack something into Ogg by using a higher framerate and inserting "duplicate frame" packets (which cost approximately one byte each, including the container overhead), but AFAIK there is no gstreamer support for doing this automatically.
==============================================================

So, I think it will be better to produce a more or less standard ogg file than some new mix in matroska. But gstreamer should probably be fixed to do this.
One more bug blocking this one: https://bugzilla.gnome.org/show_bug.cgi?id=627459
(In reply to comment #65) > (In reply to comment #64) > > Was this reply meant for me? Not sure what it has to do with using WebM > > Yes to you. To let you test if you hardware able to encode realtime webm. Ah, i see. I seem to get an error: (sleep 20 && pkill gst-launch)& time gst-launch-0.10 -e pulsesrc ! queue audioconvert ! vorbisenc ! queue ! webmmux name="muxer" v4l2src ! ! vorbisenc ! queue ! webmmux name="muxer" v4l2src ffmpegcolorspace ! vp8enc ! muxer. muxer. ! filesink location=foo.webm ! vp8enc ! muxer. muxer. ! filesink location=foo.webm [3] 5560 (gst-launch-0.10:5561): GStreamer-WARNING **: Name muxer is not unique in bin pipeline0, not adding (gst-launch-0.10:5561): GStreamer-WARNING **: Trying to connect elements that don't share a common ancestor: queue2 and muxer WARNING: erroneous pipeline: link without source element [1] Exit 1 $(sleep 20 && pkill gst-launch) real 0m0.049s user 0m0.040s sys 0m0.000s
(In reply to comment #68)
> Ah, i see. I seem to get an error:

Please try to copy the script carefully. You made some copy&paste errors there.
(In reply to comment #69)
> Please try to copy carefully this script. You made there some copy&paste
> errors.

Ah, it's because the email cut it into multiple lines so I lost the spaces. Anyway, here's what I get:

[1] 13284
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstAudioSrcClock
Redistribute latency...
[1]+  Done                    $(sleep 20 && pkill gst-launch)
Terminated

real    0m20.158s
user    0m16.710s
sys     0m0.470s

I have fairly good hardware, but I will try this on some low-end machines as well and report back.
@Danny: I forgot there is a good script from Daniel in an attachment that does essentially the same plus other formats.

And another question: do you get warnings about dropped frames on webm?

The point of this test is: if we do a 20 second capture and you get a result of 16 seconds of CPU time, it will mean you get about 80% CPU load on plain video capture, no effects enabled. If you get more than 20 seconds of CPU time, then there is actually no realtime in it: your CPU needs more time than you have.

The problem with this test is that "time" will not report dropped frames, but I think gstreamer will. And some cams need about 10 seconds to start.
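As a worked example of that rule of thumb, here is a tiny shell sketch using the numbers from comment 70 (16.710s user + 0.470s sys over the 20 seconds of wall-clock capture set by the test script's sleep):

```shell
# Estimate CPU load of a capture run from `time` output:
#   load = (user + sys) / wall-clock duration
# Values below are from the run posted in comment 70.
wall=20
user=16.710
sys=0.470
load=$(awk -v u="$user" -v s="$sys" -v w="$wall" \
    'BEGIN { printf "%.0f", (u + s) / w * 100 }')
echo "approx CPU load: ${load}%"
```

That run comes out at about 86%, so it stays just under realtime; anything at or over 100% means the CPU cannot keep up and frames have to be dropped.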
(In reply to comment #71)
> @Danny: i forgot there is good script from Daniel in attachment to do
> actually the same plus other formats.
>
> And other question: do you get warnings about dropped frames on webm?

The script from Daniel seems to not work as well as yours for WebM. If yours would show dropped frames, then I did not see any. Is "user" or "sys" the CPU time?
man time:

S (sys)   Total number of CPU-seconds used by the system on behalf of the
          process (in kernel mode), in seconds.
U (user)  Total number of CPU-seconds that the process used directly (in
          user mode), in seconds.
Just want to inform you that the program GUVCView does a splendid job of recording video on my low-power AMD CPU, where Cheese crashes when recording video. I don't know how they solved the issue, but perhaps their code can be helpful.
@Johan: in what format do you capture? Resolution, real frame rate? Or just take a screenshot of "video & files settings".
(In reply to comment #74)
> Just want to inform you that the program GUVCView does a splendid job of
> recording video on my low-power AMD-CPU that crash when recording video
> with Cheese. Don't know how they solved the issue but perhaps their code
> can be helpful.

That's probably because your webcam uses mjpeg internally, so video recording with guvcview amounts to just writing the captured video stream to disk with no extra encoding. Unfortunately we don't support mjpeg capabilities at the moment. Maybe something worth considering.
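A passthrough recording of that kind could be sketched roughly like this. This is an untested sketch: it assumes gst-launch-0.10 and a camera that exposes image/jpeg caps, and the resolution/framerate values are placeholders that would have to match what the camera actually offers.

```shell
# Hypothetical mjpeg passthrough: the camera's JPEG frames are muxed
# straight into an AVI container, with no video re-encoding at all,
# which is presumably why guvcview stays cheap on low-power CPUs.
gst-launch-0.10 -e \
    v4l2src ! "image/jpeg,width=640,height=480,framerate=15/1" \
    ! avimux ! filesink location=test.avi
```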
@Filippo: You are correct. When I look at the properties of the video file it is 640x480, 15fps and... tada!!... Motion JPEG. Very interesting, I didn't know my webcam had such hidden powers.
Some collected info in one point, comment or suggestions are welcome: https://docs.google.com/document/pub?id=14GlfUdWGGiC-4ZIpZnvjRGAiMOAsyxIHeWYsp3mzEtE
Here is a git tree with the overrun patches applied against 2.31.90 as master is a bit of a mess right now :)
(In reply to comment #79) > Here is a git tree with the overrun patches applied against 2.31.90 as master > is a bit of a mess right now :) Brandon Philips (5): fileutil: remove redundant code camera: move signals array into CheeseCameraPrivate window: move info bar from no-camera to window window: use fileutil's CheeseMediaMode camera: provide an overrun signal and implement callback http://ifup.org/git/?p=cheese.git;a=shortlog;h=refs/heads/overrun git pull git://ifup.org/philips/cheese.git overrun
(In reply to comment #80)
> > Here is a git tree with the overrun patches applied against 2.31.90 as
> > master is a bit of a mess right now :)

I'm really sorry Brandon, but 2.31.90 is already in UI and string freeze (code freeze approaching fast) and will be dropped afterwards, so master will be the new cheese.
A second branch which encodes in a separate thread can be found here:

http://ifup.org/git/?p=cheese.git;a=shortlog;h=refs/heads/encode-thread

git pull git://ifup.org/philips/cheese.git encode-thread
(In reply to comment #81)
> im really sorry brandon, but 2.31.90 is already in ui and string freeze
> (code freeze approaching fast) and will be dropped afterwards, so master
> will be the new cheese.

master is broken and just locks up on my machine. Should I file bugs against master? I thought master was just a playground right now.

I broke these patches out very nicely in the hope that you could port them to master after you get done playing with it. It took me a lot of mucking around to even get master to build (because of the high version numbers on mx, clutter and vala) and I still couldn't get it to run.
Review of attachment 160945 [details] [review]:

Recently I fixed another bug (bug 620637) coming from this very same issue and, given that I had forgotten about this discussion and your patch, I fixed it by emitting the signal by name. I'm not sure which solution is really the best. Having a static global array for signals is pretty common in gobject based modules; I'm not sure whether it's just everyone perpetuating an old practice or it actually has some reason. Anyway I'd keep it if it doesn't break anything. Please look at my patch in that bug.
Review of attachment 160945 [details] [review]:

::: cheese.orig/libcheese/cheese-camera.c
@@ +478,3 @@
+  g_object_set (save_queue, "max-size-buffers", 0, NULL);
+  g_object_set (save_queue, "max-size-bytes", 0, NULL);
+  g_object_set (save_queue, "max-size-time", G_GINT64_CONSTANT(5000000000), NULL);

Overall the cheese-camera part of the patch looks fine; just one comment: how did you choose this parameter?

::: cheese.orig/src/cheese-window.c
@@ +446,3 @@
+                 const gchar *primary_text,
+                 const gchar *secondary_text)
+{

As you might have noticed we don't have an infobar anymore, so this doesn't really apply to current master. Even if at the moment master is little more than a playground for new technologies, that's what the new cheese will be based on. I'd still like to have something like an infobar or a clutter version of it, so I reject this part, but it may be useful for the future implementation.
(In reply to comment #84) > Review of attachment 160945 [details] [review]: > > Recently I fixed another bug (bug 620637) coming from this very same issue and, > given I forgot about this discussion and your patch, I fixed it emitting the > signal by name. I'm not sure which solution is really the best. Having a static > global array for signals is pretty common in gobject based modules, not sure if > it's just everyone perpetrating an old practice or actually it has some reason, > anyway I'd keep it if it doesn't break anything. Please look at my patch in > that bug. Obviously here I was talking about the signals into a private field part of the patch, I misread the patch description and believed it contained only this part. (The comment still holds, just related to that part of the patch only)
(In reply to comment #71)
> The point of this test is: if we do 20 second capture and you get result 16
> second CPU time, will mean you'll get about 80% CPU load on clean video
> capture. No effects enabled. If you get more than 20 seconds of CPU time -
> than there is actually no real time in it. Your CPU need more time than you
> have.
>
> The problem in this test is, "time" will not report dropped frames, but i
> think gstreamer will do it. And some cams need about 10 second to start.

Would having a lot of other stuff running make a difference? Also, here are some errors I just got after retrying Daniel's script:

WARNING: from element /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0: Can't record audio fast enough
Additional debug info:
gstbaseaudiosrc.c(822): gst_base_audio_src_create (): /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0:
Dropped 6160 samples. This is most likely because downstream can't keep up and is consuming samples too slowly.

I just got a bunch of those the entire time. I have a hard time believing it's that slow. Could there be another bug affecting this? Should I be using a particular version of gstreamer?
(In reply to comment #87)
> Would having a lot of other stuff running make a difference? Also, here are
> some errors i just got after retrying Daniel's script:

Not sure what you mean. Does the other stuff make a big CPU load, or use the same sound card?

> I just got a bunch of those the entire time. I have a hard time believe
> it's that slow. Could there be another bug affecting this? Should i be
> using a particular version of gstreamer?

I used the native Lucid and now Maverick versions, without any big problem. The bugs affecting gstreamer's vp8 and ogg are still not fixed. I still suggest using matroska for live recording.
I do not think Bug 621598 is a duplicate of this
A new gstreamer is released. The vp8enc bug is fixed now; theoraenc is still not fixed. This works for me just great (speed=2 is important):

... ! vp8enc speed=2 ! ...

So I think it is a great time to change the default format for cheese :D And do not forget to remove "videorate".
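A full live-capture pipeline along those lines might look like this. It is a sketch only: it assumes gst-launch-0.10 with a v4l2 webcam and a PulseAudio source, and the output file name is arbitrary.

```shell
# Hypothetical live WebM capture with the vp8enc speed=2 setting suggested
# above (it trades some quality and file size for realtime-friendly
# encoding). Assumes gst-launch-0.10 with a v4l2 webcam and PulseAudio.
gst-launch-0.10 -e \
    webmmux name=mux ! filesink location=test.webm \
    v4l2src ! ffmpegcolorspace ! vp8enc speed=2 ! queue ! mux. \
    pulsesrc ! queue ! audioconvert ! vorbisenc ! queue ! mux.
```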
Created attachment 179324 [details]
Compare execution time of encoding pipelines

Hello, I tried to find information about vp8 and theora for live encoding, but I haven't found anything. So I tried to do a small test based on the one attached here, just using time to get an idea of encoding speed.

For ogg theora: 16.12user 0.19system 0:33.64elapsed 48%CPU, 2.0M file size
For WebM vp8: 16.94user 0.15system 0:35.61elapsed 48%CPU, 1008K file size
For AVI with jpegenc: 17.56user 0.20system 0:32.40elapsed 54%CPU, 30M file size

The recorded videos also seem to have comparable quality, but I'm not good at judging that. Theoretically the vp8 one should be better. It seems there's not much difference between theora and vp8 right now for live encoding. I have libvpx 0.9.1 and libtheora 1.1.1. Feel free to run other tests on your computer to compare results.

Based on that, I don't think switching to webm and vp8 would help with the initial problem (not being able to encode on low-power processors). We could switch anyway, since it's going to mean smaller files and apparently won't increase processing needs.

There is an interesting setting in vp8 that sets encoding to real-time, auto-adjusting the speed to the processing power available. Unfortunately this is not exposed in vp8enc and I don't know how well it works.
1. "vp8enc speed=2 ! ..." is realtime encoding. It disables some optimizations at the price of quality and file size. IMHO in the current state it makes sense to use WebM by default.
2. We can't support all low-end CPUs, or we will go down to an absurd minimum level, say a Pentium I or a 486.
3. CPU load depends on framerate and frame size - please reduce them.
@devs Ogg/Theora is effectively dead. The bugs I filed have seen no progress; please use WebM instead.
[Removing "on low power processors" from subject as this also happens on faster machines according to Daniel.]
[Removing GNOME3.0 target as decided in release-team meeting on March 03, 2011. This report has an "important" categorisation for GNOME3.0 but is not considered a hard blocker. For querying use the corresponding whiteboard entry added.]
(skimming [gnome3-important] bugs and came across this) We switched from theora to (multithreaded) webm for gnome-shell's built-in screen recorder a while back, for basically this reason. See bug 632595.
*** Bug 629033 has been marked as a duplicate of this bug. ***
*** Bug 660873 has been marked as a duplicate of this bug. ***
*** Bug 629338 has been marked as a duplicate of this bug. ***
*** Bug 632735 has been marked as a duplicate of this bug. ***
Fixed by switching Cheese master to use VP8/WebM as described in bug 666718.