Bug 736100 - video-info: doesn't return the right I420 image size for certain resolutions
Status: RESOLVED NOTABUG
Product: GStreamer
Classification: Platform
Component: gst-plugins-base
Version: git master
OS: Other Linux
Importance: Normal normal
Assigned To: GStreamer Maintainers
QA Contact: GStreamer Maintainers
Depends on:
Blocks:
Reported: 2014-09-05 06:42 UTC by Aleix Conchillo Flaqué
Modified: 2014-09-08 20:42 UTC
See Also:
GNOME target: ---
GNOME version: ---



Description Aleix Conchillo Flaqué 2014-09-05 06:42:25 UTC
Not sure if the above is the right description for the bug.

I'm integrating I420 frames coming from the webrtc project into GStreamer, and I'm seeing different byte sizes between the two.

For example, an I420 frame with width=1364 and height=768:

webrtc reported size = 1571328
gstreamer reported size = 1572864

From what I know, I420 is 12 bits per pixel, which matches the webrtc value: 1364 x 768 x 1.5 = 1571328.

The error in GStreamer is:

0:00:32.577537792 16018 0x7fd410005800 ERROR                default video-frame.c:152:gst_video_frame_map_id: invalid buffer size 1571328 < 1572864

I have also verified it with this code:

  GstVideoInfo video_info;

  gst_video_info_init (&video_info);
  gst_video_info_set_format (&video_info, GST_VIDEO_FORMAT_I420,
                             1364, 768);

And printing GST_VIDEO_INFO_SIZE (&video_info) gives 1572864.

Any idea who's wrong?
Comment 1 Sebastian Dröge (slomo) 2014-09-05 07:43:00 UTC
In GStreamer, the default stride for I420 is rounded up to an integer multiple of 4, which gives you
> 1364*768 + 1368/2 * 768/2 + 1368/2 * 768/2 = 1572864

If you want to use a different stride you'll have to use GstVideoMeta.
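The two size computations in play here can be sketched in plain C. The `ROUND_UP_*` macros below are local stand-ins mirroring GStreamer's `GST_ROUND_UP_2`/`GST_ROUND_UP_4`; the exact per-format layout code lives in gst-plugins-base's video-info.c, so treat this as an illustration of the rounding, not a copy of it:

```c
#include <assert.h>
#include <stddef.h>

/* Local stand-ins for GStreamer's GST_ROUND_UP_2 / GST_ROUND_UP_4 macros. */
#define ROUND_UP_2(x) (((x) + 1) & ~1)
#define ROUND_UP_4(x) (((x) + 3) & ~3)

/* I420 size as GStreamer's default layout computes it: the luma stride is
 * the width rounded up to a multiple of 4, and each chroma stride is half
 * the (even-rounded) width, again rounded up to a multiple of 4. */
static size_t
i420_size_gst_default (int width, int height)
{
  size_t y_stride = ROUND_UP_4 (width);
  size_t c_stride = ROUND_UP_4 (ROUND_UP_2 (width) / 2);
  size_t c_height = ROUND_UP_2 (height) / 2;
  return y_stride * ROUND_UP_2 (height) + 2 * c_stride * c_height;
}

/* Tightly packed I420 size (12 bits per pixel), as webrtc reports it:
 * no stride padding at all. */
static size_t
i420_size_tight (int width, int height)
{
  size_t c_w = ((size_t) width + 1) / 2;
  size_t c_h = ((size_t) height + 1) / 2;
  return (size_t) width * height + 2 * c_w * c_h;
}
```

For 1364x768 these give 1572864 and 1571328 respectively, which is exactly the discrepancy reported above: the 1364/2 = 682-byte chroma rows get padded up to 684 bytes in the default GStreamer layout.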
Comment 2 Nicolas Dufresne (ndufresne) 2014-09-05 12:21:34 UTC
Note that not respecting the 4-byte stride alignment may prevent you from using the GStreamer software converters. I don't know what in WebRTC gives you raw frames, but I suspect it is a bug.
Comment 3 Aleix Conchillo Flaqué 2014-09-05 13:59:57 UTC
Got it. Thank you guys. I'll take a look at the webrtc code.
Comment 4 Aleix Conchillo Flaqué 2014-09-06 18:54:34 UTC
In case anyone else needs this, here is what I ended up doing, which fixes the problem.

-----

/* Copies a tightly packed I420 frame (as produced by webrtc) into a
 * destination buffer laid out with GStreamer's default 4-byte-aligned
 * strides. Assumes even width/height and the default contiguous plane
 * offsets computed by gst_video_info_set_format(). */
static void
WebRtcI420ToGStreamer (guint8 *dst, guint8 *src, gint width, gint height)
{
  GstVideoInfo info;
  gint h;

  gst_video_info_init (&info);
  gst_video_info_set_format (&info, GST_VIDEO_FORMAT_I420, width, height);

  /* Source (webrtc) strides: tightly packed, no padding. */
  guint src_y_stride = width;
  guint src_u_stride = (width + 1) / 2;
  guint src_v_stride = (width + 1) / 2;

  /* Destination (GStreamer) strides: rounded up to a multiple of 4. */
  guint dst_y_stride = GST_VIDEO_INFO_COMP_STRIDE (&info, 0);
  guint dst_u_stride = GST_VIDEO_INFO_COMP_STRIDE (&info, 1);
  guint dst_v_stride = GST_VIDEO_INFO_COMP_STRIDE (&info, 2);

  /* Y plane */
  for (h = 0; h < height; h++) {
    memcpy (dst, src, src_y_stride);
    dst += dst_y_stride;
    src += src_y_stride;
  }
  /* U plane */
  for (h = 0; h < height / 2; h++) {
    memcpy (dst, src, src_u_stride);
    dst += dst_u_stride;
    src += src_u_stride;
  }
  /* V plane */
  for (h = 0; h < height / 2; h++) {
    memcpy (dst, src, src_v_stride);
    dst += dst_v_stride;
    src += src_v_stride;
  }
}
Comment 5 Sebastian Dröge (slomo) 2014-09-06 21:17:53 UTC
You can also use the GstVideoFrame API: gst_video_frame_copy()
Comment 6 Nicolas Dufresne (ndufresne) 2014-09-07 16:15:51 UTC
Note that in both cases this isn't a real solution to the problem. We should be able to impose the alignment constraints on WebRTC and allow zero-copy from it. Otherwise we'd never get high resolutions or embedded devices to work properly.
Comment 7 Sebastian Dröge (slomo) 2014-09-07 17:35:34 UTC
Yeah, your WebRTC API should ideally allow you to provide it a buffer and its configuration (plane offsets, strides, etc)... and otherwise you can still add a GstVideoMeta with the WebRTC strides to the buffer, as long as downstream supports GstVideoMeta. No need to copy stuff around in that case.

I assume with WebRTC you mean libwebrtc here?
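The GstVideoMeta route needs the offsets and strides of the tightly packed layout that webrtc produces, which are exactly what gst_buffer_add_video_meta_full() accepts. A minimal plain-C sketch of computing that layout (the function name i420_tight_layout is made up for illustration; only the gst_buffer_add_video_meta_full() call shown in the comment is real GStreamer API):

```c
#include <assert.h>
#include <stddef.h>

#define N_PLANES 3

/* Plane layout of a tightly packed I420 frame (the layout webrtc
 * produces): no stride padding, planes laid out back to back in the
 * order Y, U, V. */
static void
i420_tight_layout (int width, int height,
                   size_t offset[N_PLANES], int stride[N_PLANES])
{
  int c_w = (width + 1) / 2;
  int c_h = (height + 1) / 2;

  stride[0] = width;   /* Y */
  stride[1] = c_w;     /* U */
  stride[2] = c_w;     /* V */

  offset[0] = 0;
  offset[1] = (size_t) width * height;
  offset[2] = offset[1] + (size_t) c_w * c_h;

  /* With a real GstBuffer wrapping the webrtc data, one would then do
   * roughly:
   *   gst_buffer_add_video_meta_full (buffer, GST_VIDEO_FRAME_FLAG_NONE,
   *       GST_VIDEO_FORMAT_I420, width, height, N_PLANES, offset, stride);
   * so that downstream elements supporting GstVideoMeta can read the
   * buffer without any copy. */
}
```

For 1364x768 this yields strides {1364, 682, 682} and offsets {0, 1047552, 1309440}, matching the 1571328-byte tight size from the report.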
Comment 8 Aleix Conchillo Flaqué 2014-09-08 04:43:43 UTC
(In reply to comment #7)
> Yeah, your WebRTC API should ideally allow you to provide it a buffer and the
> configuration of it (plane offsets, strides, etc)... and otherwise you can
> still add a GstVideoMeta with the WebRTC strides to the buffer as long as
> downstream supports GstVideoMeta. No need to copy stuff around in that case.
> 
> I assume with WebRTC you mean libwebrtc here?

I have an app written using the WebRTC project, https://code.google.com/p/webrtc/, as if it were a WebRTC app. I then use shmsink to pass the frames to a GStreamer-only app. Everything works great so far.

This is the base class for a VideoFrame.

https://code.google.com/p/webrtc/source/browse/trunk/talk/media/base/videoframe.h

And I don't see that it supports copying between different strides, etc.

I don't think GstVideoMeta will travel with the buffer through shmsink. And the other side, where I would have to add the GstVideoMeta, is kind of generic, so right now it's hard for me to add a special case for buffers coming from WebRTC.

But it's really handy to know anyways. So, thanks!

Nicolas, is the 4-byte alignment a standard thing in a raw I420 image, or is it something that GStreamer uses by default? Sorry, I am not very well versed in image/color formats. If it's something that really should be fixed, I would dig into it and try to fix it.
Comment 9 Sebastian Dröge (slomo) 2014-09-08 05:26:00 UTC
It's a "standard" thing, yes. Or even larger alignments are used, especially by hardware.

The alignments GStreamer uses by default are usually 4 bytes, as is used by the XVideo extension and many other things.
Comment 10 Nicolas Dufresne (ndufresne) 2014-09-08 13:23:08 UTC
For reference, "many other things" includes ffmpeg/libav in non-direct rendering mode (in direct rendering mode the alignment must be 16 bytes).
Comment 11 Aleix Conchillo Flaqué 2014-09-08 20:42:44 UTC
Thank you guys. I have investigated a little bit more. Internally, webrtc uses libyuv.

The VideoFrame class provides CopyToPlanes, so the following code works for me as well:

  size_t size = GST_VIDEO_INFO_SIZE (&video_info);
  guint8 *image = (guint8 *) g_malloc0 (size);

  guint8 *dst_y = image + GST_VIDEO_INFO_COMP_OFFSET (&video_info, 0);
  guint8 *dst_u = image + GST_VIDEO_INFO_COMP_OFFSET (&video_info, 1);
  guint8 *dst_v = image + GST_VIDEO_INFO_COMP_OFFSET (&video_info, 2);

  guint dst_y_stride = GST_VIDEO_INFO_COMP_STRIDE (&video_info, 0);
  guint dst_u_stride = GST_VIDEO_INFO_COMP_STRIDE (&video_info, 1);
  guint dst_v_stride = GST_VIDEO_INFO_COMP_STRIDE (&video_info, 2);

  frame->CopyToPlanes (dst_y, dst_u, dst_v,
                       dst_y_stride, dst_u_stride, dst_v_stride);

  GstBuffer *buffer = gst_buffer_new_wrapped (image, size);

As Nicolas suggested, though, it would be better to allow zero-copy of it.