GNOME Bugzilla – Bug 721153
Raw opus playback does not work
Last modified: 2014-09-15 21:42:14 UTC
Initial message:

> i seem to have difficulty playing raw opus
>
> recorder: autoaudiosrc ! audioconvert ! audioresample ! opusenc !
> filesink location=foo.opus
> <run for a few seconds, kill with Ctrl+C>
>
> player: filesrc location=foo.opus ! opusparse ! opusdec ! audioconvert
> ! audioresample ! autoaudiosink
>
> The player pipeline matches the one in documentation [1].
>
> While payloading or muxing is a possibility, i would rather make this
> work with raw opus. I am told that a non-GStreamer implementation
> (using pulseaudio + libopus) does work, so there's no reason for
> GStreamer to not to work as well.

gst-devel reply:

> Yes, this is supposed to work but doesn't. I just fixed one of the bugs
> in git [2] but there's still the problem that libopus returns
> OPUS_INVALID_PACKET when trying to decode something. Could you file a
> bug about that? The mailing list is no bugtracker :)

[1] http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/ext/opus/gstopusparse.c
[2] http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/commit/?id=fd803921c95a47111da1323acccbcc9bb8c71947
What is the format of that "raw Opus" file?
Sorry, I missed the encoding pipeline. This is not meant to work. The Opus decoder needs to know the size of the packets in order to decode correctly. Writing the packets concatenated like this loses this information, and thus leads to a stream that cannot be decoded. Closing as invalid.
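For reference, a minimal sketch of keeping the packet framing by muxing into Ogg instead of writing the bare packets (untested here; file names are placeholders and element availability depends on your gst-plugins-base/-bad versions):

  recorder: gst-launch-1.0 autoaudiosrc ! audioconvert ! audioresample ! opusenc ! oggmux ! filesink location=foo.ogg

  player: gst-launch-1.0 filesrc location=foo.ogg ! oggdemux ! opusdec ! audioconvert ! audioresample ! autoaudiosink

The Ogg pages record the size of every Opus packet, which is exactly the information that is lost when the packets are concatenated into a bare file.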
Hi Vincent, is it possible to add a property to opusdec to tell it what it needs to know about the raw opus file in order to decode it correctly?
You would need to provide information about every single Opus frame, as raw Opus frames contain no frame markers and can't be parsed without this information. Why not just put it into one of the many possible containers?
Not unless that property is a list of the sizes of all the packets that are to follow. It will be easier to place that information inline in the stream (i.e. in the form of Ogg muxing or RTP payloading). Though I guess such a property *might* be possible for CBR Opus, since all packets will have the same size (except maybe the last one), but I'm not sure it's a good idea.
(In reply to comment #4)
> You would need to provide information about every single Opus frame, as raw
> Opus frames contain no frame markers and can't be parsed without this
> information. Why not just put it into one of the many possible containers?

We have APIs where we can stream audio over HTTP, so no headers and no payloaders. But we know in advance what the stream looks like, e.g. G.711. For Opus we would only use CBR and fixed frame sizes (and all other parameters are constant); knowing this, we have successfully tested that it works.
I would prefer to have a separate element for that, which would chunk the data into something opusdec can consume.
Regarding a suitable Opus container for live streaming over HTTP/TCP: the RTP format does not provide any frame-separation mechanism, and the Ogg container seems to impose a huge overhead for low-bandwidth, low-latency live Opus audio streams. Any other suggestions?
RTP is suitable for this. Opus has an RTP mapping, and GStreamer has RTP payloader and depayloader elements for Opus.
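A hedged sketch of such a pipeline pair (untested; host, port, payload type and the receiver caps are assumptions; older versions advertise encoding-name=X-GST-OPUS-DRAFT-SPITTKA-00 instead of OPUS):

  sender: gst-launch-1.0 autoaudiosrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! udpsink host=127.0.0.1 port=5004

  receiver: gst-launch-1.0 udpsrc port=5004 caps="application/x-rtp,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS,payload=(int)96" ! rtpopusdepay ! opusdec ! audioconvert ! audioresample ! autoaudiosink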
(In reply to comment #9)
> RTP is suitable for this. Opus has an RTP mapping, and GStreamer has RTP
> payloader and depayloader elements for Opus.

Well, RTP packetization does not provide any mechanism for separating data packets and does not contain any payload size field, so I don't see how that would be helpful in separating the Opus frames in a raw bit-stream, as in the case of an HTTP/TCP stream.
There are two other, very similar options: Matroska and WebM (pretty much the same format). If using matroskamux, don't forget about the streamable property; the same goes for webmmux. We would greatly appreciate it if support requests were made on the mailing list rather than in a bug entry. This bug remains invalid.
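For example, a sketch with matroskamux (file name and sink are placeholders; for live HTTP streaming the filesink would be replaced by whatever transport you use):

  mux: gst-launch-1.0 autoaudiosrc ! audioconvert ! audioresample ! opusenc ! matroskamux streamable=true ! filesink location=foo.mka

  demux: gst-launch-1.0 filesrc location=foo.mka ! matroskademux ! opusdec ! audioconvert ! audioresample ! autoaudiosink

streamable=true keeps the muxer from seeking back to rewrite the headers and index, which is what a one-way HTTP/TCP stream needs.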
Obviously webmmux and matroskamux will also add high latency. RTP is what you want here, and you can pass the output of rtpopusdepay directly to opusdec. The RTP mapping contains all the information needed to let the decoder work.
To stream RTP over a stream protocol, use RFC 4571 framing. See the rtpstreampay and rtpstreamdepay elements.
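A rough sketch of that over TCP (untested; host, port and the caps string are assumptions to adapt, in particular the encoding-name as noted above):

  sender: gst-launch-1.0 autoaudiosrc ! audioconvert ! audioresample ! opusenc ! rtpopuspay ! rtpstreampay ! tcpserversink port=5000

  receiver: gst-launch-1.0 tcpclientsrc host=127.0.0.1 port=5000 ! "application/x-rtp-stream,media=(string)audio,clock-rate=(int)48000,encoding-name=(string)OPUS" ! rtpstreamdepay ! rtpopusdepay ! opusdec ! audioconvert ! audioresample ! autoaudiosink

rtpstreamdepay strips the RFC 4571 length prefixes and passes the caps fields on to rtpopusdepay, so the explicit caps after tcpclientsrc stand in for what SDP or the RTP caps would normally provide.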