GNOME Bugzilla – Bug 744044
GStreamer-CRITICAL **: gst_mini_object_unref: assertion 'mini_object->refcount > 0' failed
Last modified: 2016-02-22 00:27:05 UTC
Hi everybody, I have the following GStreamer-CRITICAL warning and I can't get rid of it:

$ gst-launch-1.0 souphttpsrc location="http://IP/video.cgi?videocodec=h264" user-id=USER user-pw=PASSWORD do-timestamp=true ! multipartdemux ! fakesink
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:01.692054125
Setting pipeline to PAUSED ...
Setting pipeline to READY ...

(gst-launch-1.0:12437): GStreamer-CRITICAL **: gst_mini_object_unref: assertion 'mini_object->refcount > 0' failed

Setting pipeline to NULL ...
Freeing pipeline ...

(To make the warning appear, I pressed Ctrl-C to stop the stream.) The warning seems to come from the multipartdemux element; if I run the command without it, the warning does not appear. I am using GStreamer 1.4.5. It is very important to me to fix this warning. Can somebody help me with a fix?
If you run your launch line with the env var G_DEBUG=fatal_criticals, it will turn the assertion into an abort, and gdb will stop on that exact line so we can dig into where it happens. See http://live.gnome.org/GettingTraces if you need help getting a trace with gdb.
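For reference, the usual recipe for this looks roughly as follows (a sketch: G_DEBUG=fatal_criticals is a standard GLib variable, and since gst-launch handles SIGINT itself, gdb is told to pass Ctrl-C through to the program rather than stopping on it):

```
# Turn GLib CRITICALs into aborts and run the reporter's launch line under gdb
G_DEBUG=fatal_criticals gdb --args gst-launch-1.0 souphttpsrc \
    location="http://IP/video.cgi?videocodec=h264" user-id=USER \
    user-pw=PASSWORD do-timestamp=true ! multipartdemux ! fakesink
(gdb) handle SIGINT nostop pass   # let Ctrl-C reach gst-launch, not gdb
(gdb) run
# ... press Ctrl-C to stop the stream; gdb stops at the abort ...
(gdb) bt
```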
Thank you for answering me, Thiago. Here is the backtrace (I did exactly as in the tutorial): (gdb) bt
+ Trace 234624
And here are the debug messages when running with GST_DEBUG=4 (just the ones near the warning):

^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:02.052155923
Setting pipeline to PAUSED ...
0:00:02.830411803 20515 0x25d1000 INFO GST_STATES gstbin.c:2230:gst_bin_element_set_state:<fakesink0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:02.830433538 20515 0x25d1000 INFO GST_STATES gstbin.c:2679:gst_bin_change_state_func:<pipeline0> child 'fakesink0' is changing state asynchronously to PAUSED
0:00:02.830442689 20515 0x25d1000 INFO GST_STATES gstbin.c:2230:gst_bin_element_set_state:<multipartdemux0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:02.830448919 20515 0x25d1000 INFO GST_STATES gstelement.c:2328:gst_element_continue_state:<multipartdemux0> completed state change to PAUSED
0:00:02.830454374 20515 0x25d1000 INFO GST_STATES gstelement.c:2233:_priv_gst_element_state_changed:<multipartdemux0> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:02.830466426 20515 0x25d1000 INFO GST_STATES gstbin.c:2673:gst_bin_change_state_func:<pipeline0> child 'multipartdemux0' changed state to 3(PAUSED) successfully
0:00:02.830473705 20515 0x25d1000 INFO GST_STATES gstbin.c:2230:gst_bin_element_set_state:<souphttpsrc0> current PLAYING pending VOID_PENDING, desired next PAUSED
0:00:02.830480967 20515 0x25d1000 INFO GST_STATES gstelement.c:2328:gst_element_continue_state:<souphttpsrc0> completed state change to PAUSED
0:00:02.830485298 20515 0x25d1000 INFO GST_STATES gstelement.c:2233:_priv_gst_element_state_changed:<souphttpsrc0> notifying about state-changed PLAYING to PAUSED (VOID_PENDING pending)
0:00:02.830492495 20515 0x25d1000 INFO GST_STATES gstbin.c:2673:gst_bin_change_state_func:<pipeline0> child 'souphttpsrc0' changed state to 3(PAUSED) successfully
Setting pipeline to READY ...
0:00:02.830516100 20515 0x25d1000 INFO GST_STATES gstbin.c:2230:gst_bin_element_set_state:<fakesink0> current PLAYING pending PAUSED, desired next READY
0:00:02.830625178 20515 0x25d1000 INFO GST_STATES gstelement.c:2328:gst_element_continue_state:<fakesink0> completed state change to READY
0:00:02.830633132 20515 0x25d1000 INFO GST_STATES gstelement.c:2233:_priv_gst_element_state_changed:<fakesink0> notifying about state-changed PLAYING to READY (VOID_PENDING pending)
0:00:02.830640612 20515 0x25d1000 INFO GST_STATES gstbin.c:2673:gst_bin_change_state_func:<pipeline0> child 'fakesink0' changed state to 2(READY) successfully
0:00:02.830649527 20515 0x25d1000 INFO GST_STATES gstbin.c:2230:gst_bin_element_set_state:<multipartdemux0> current PAUSED pending VOID_PENDING, desired next READY

(gst-launch-1.0:20515): GStreamer-CRITICAL **: gst_mini_object_unref: assertion 'mini_object->refcount > 0' failed

0:00:02.830868835 20515 0x25d1000 INFO GST_ELEMENT_PADS gstelement.c:763:gst_element_remove_pad:<multipartdemux0> removing pad 'src_0'
0:00:02.830878311 20515 0x25d1000 INFO GST_ELEMENT_PADS gstpad.c:1943:gst_pad_unlink: unlinking multipartdemux0:src_0(0x27029a0) and fakesink0:sink(0x2702540)
0:00:02.830892539 20515 0x25d1000 INFO GST_ELEMENT_PADS gstpad.c:1997:gst_pad_unlink: unlinked multipartdemux0:src_0 and fakesink0:sink
0:00:02.830914595 20515 0x25d1000 INFO GST_STATES gstelement.c:2328:gst_element_continue_state:<multipartdemux0> completed state change to READY
0:00:02.830910456 20515 0x260bd90 INFO basesrc gstbasesrc.c:2841:gst_base_src_loop:<souphttpsrc0> pausing after gst_pad_push() = flushing
0:00:02.830950977 20515 0x260bd90 INFO task gsttask.c:300:gst_task_func:<souphttpsrc0:src> Task going to paused
0:00:02.830922527 20515 0x25d1000 INFO GST_STATES gstelement.c:2233:_priv_gst_element_state_changed:<multipartdemux0> notifying about state-changed PAUSED to READY (VOID_PENDING pending)
As a data point, ^C on the following pipeline seems to be fine:

gst-launch-1.0 videotestsrc ! queue ! jpegenc ! queue ! multipartmux name=m ! multipartdemux ! fakesink videotestsrc ! queue ! jpegenc ! m

So it is possibly an interaction between souphttpsrc and multipartdemux.
Creating a test file with multipartmux and then running:

gst-launch-1.0 souphttpsrc location=http://127.0.0.1:8079/test.mp do-timestamp=true ! multipartdemux ! fakesink

also works fine when ^C-ing partway through. Are you in a position to test with master to see if it still happens?
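For anyone else trying this reproduction attempt, the local setup was roughly as follows (a sketch: num-buffers and the exact server are arbitrary choices; python3 -m http.server is just a convenient way to serve test.mp on port 8079):

```
# Create a multipart test file with multipartmux
gst-launch-1.0 videotestsrc num-buffers=300 ! jpegenc ! multipartmux ! \
    filesink location=test.mp
# Serve it over HTTP on port 8079
python3 -m http.server 8079
# In another terminal, run the pipeline against it and Ctrl-C partway through
gst-launch-1.0 souphttpsrc location=http://127.0.0.1:8079/test.mp \
    do-timestamp=true ! multipartdemux ! fakesink
```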
This indicates a refcounting problem "somewhere", but it is hard to debug or fix without further information. Maybe if you reproduce it under valgrind it will tell us where the event got freed. For now I think we'll have to close this. If it's still an issue, sooner or later someone else will hopefully come along with a way to reproduce it, or with a patch :) Of course it's also possible that it got fixed in the meantime.
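A sketch of such a valgrind run (G_SLICE=always-malloc and G_DEBUG=gc-friendly are the standard GLib settings for making memcheck output usable; the extra flags just deepen the stack traces):

```
G_SLICE=always-malloc G_DEBUG=gc-friendly valgrind --tool=memcheck \
    --track-origins=yes --num-callers=30 \
    gst-launch-1.0 souphttpsrc location="http://IP/video.cgi?videocodec=h264" \
    user-id=USER user-pw=PASSWORD do-timestamp=true ! multipartdemux ! fakesink
```

If the object really is freed early, memcheck should report an invalid read/write in gst_mini_object_unref with the allocation and free stacks, which is usually enough to see which element dropped the last ref.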