Bug 731204 - androidmedia: Implement zerocopy rendering
Status: RESOLVED FIXED
Product: GStreamer
Classification: Platform
Component: gst-plugins-bad
Version: git master
OS: Other Linux
Importance: Normal enhancement
Target Milestone: 1.7.1
Assigned To: GStreamer Maintainers
QA Contact: GStreamer Maintainers
Duplicates: 727511 735778 741010 (view as bug list)
Depends on: 748688
Blocks:
 
 
Reported: 2014-06-04 13:12 UTC by cee1
Modified: 2015-10-20 17:55 UTC
See Also:
GNOME target: ---
GNOME version: ---


Attachments
logcat.txt (28.19 KB, text/plain)
2014-06-05 09:29 UTC, Matthieu Bouron
  Details
androidmedia: split jni code to gstjniutils.c (62.79 KB, patch)
2014-06-11 14:10 UTC, Matthieu Bouron
none Details | Review
[PATCH 2/3] androidmedia: WIP add direct rendering support to amcvideodec (66.62 KB, patch)
2014-06-11 14:11 UTC, Matthieu Bouron
none Details | Review
[PATCH 3/3] gstglupload: WIP handle OES textures in meta upload code path (13.06 KB, patch)
2014-06-11 14:11 UTC, Matthieu Bouron
none Details | Review
[PATCH 2/3] androidmedia: WIP add direct rendering support to amcvideodec (66.62 KB, patch)
2014-06-13 15:34 UTC, Matthieu Bouron
reviewed Details | Review
[PATCH 3/3] gstglupload: WIP handle OES textures in meta upload code path (17.30 KB, patch)
2014-06-13 15:35 UTC, Matthieu Bouron
none Details | Review
androidmedia: split jni code to gstjniutils.c (61.51 KB, patch)
2014-06-25 10:27 UTC, Matthieu Bouron
none Details | Review
androidmedia: add direct rendering support to amcvideodec (88.54 KB, patch)
2014-06-25 10:35 UTC, Matthieu Bouron
none Details | Review
[PATCH 1/9] androidmedia: Do not warn we do not support COLOR_FormatSurface (1.00 KB, patch)
2015-05-01 10:18 UTC, Matthieu Bouron
none Details | Review
[PATCH 2/9] androidmedia: Improve debug messages (1.75 KB, patch)
2015-05-01 10:19 UTC, Matthieu Bouron
none Details | Review
[PATCH 3/9] androidmedia: Improve color format debug messages (1.17 KB, patch)
2015-05-01 10:19 UTC, Matthieu Bouron
none Details | Review
[PATCH 4/9] androidmedia: Fix debug statement (%d for a gsize argument) (5.89 KB, patch)
2015-05-01 10:20 UTC, Matthieu Bouron
none Details | Review
[PATCH 5/9] androidmedia: Add support for libart and add more debugging (2.02 KB, patch)
2015-05-01 10:20 UTC, Matthieu Bouron
none Details | Review
[PATCH 6/9] androidmedia: Allow object to be NULL in gst_amc_jni_*unref functions (990 bytes, patch)
2015-05-01 10:21 UTC, Matthieu Bouron
none Details | Review
[PATCH 7/9] androidmedia: Do not flush codec if it is not started (1.10 KB, patch)
2015-05-01 10:21 UTC, Matthieu Bouron
none Details | Review
[PATCH 8/9] androidmedia: Add support for direct rendering in amcvideodec (82.48 KB, patch)
2015-05-01 10:22 UTC, Matthieu Bouron
none Details | Review
[PATCH 9/9] androidmedia: Only support direct rendering for codec that uses unsupported color formats (4.78 KB, patch)
2015-05-01 10:22 UTC, Matthieu Bouron
none Details | Review
[PATCH 6/9] androidmedia: Allow object to be NULL in gst_amc_jni_*unref functions (910 bytes, patch)
2015-05-03 11:53 UTC, Matthieu Bouron
none Details | Review
[PATCH 8/9] androidmedia: Add support for direct rendering in amcvideodec (82.27 KB, patch)
2015-05-03 11:55 UTC, Matthieu Bouron
none Details | Review
[PATCH 9/9] androidmedia: Only support direct rendering for codec that uses unsupported color formats (3.91 KB, patch)
2015-05-03 11:57 UTC, Matthieu Bouron
none Details | Review
[PATCH 8/9] androidmedia: Add support for GL output in amcvideodec (82.22 KB, patch)
2015-05-04 11:42 UTC, Matthieu Bouron
none Details | Review
[PATCH 9/9] Only allow GL output if the decoder has unknown color formats (3.84 KB, patch)
2015-05-04 11:43 UTC, Matthieu Bouron
none Details | Review
[PATCH 8/9] androidmedia: Add support for GL output in amcvideodec (82.29 KB, patch)
2015-05-19 11:36 UTC, Matthieu Bouron
none Details | Review
[PATCH 9/9] Only allow GL output if the decoder has unknown color formats (3.84 KB, patch)
2015-05-19 11:37 UTC, Matthieu Bouron
none Details | Review
Svg of the decodebin pipeline on the nexus, using System Memory (70.87 KB, text/svg)
2015-07-22 14:35 UTC, Mathieu Duponchelle
  Details
Svg of the failing manually constructed pipeline, on the nexus (27.75 KB, text/svg)
2015-07-22 14:36 UTC, Mathieu Duponchelle
  Details

Description cee1 2014-06-04 13:12:12 UTC
gstamcvideodec/gstamcaudiodec can leverage hardware decoding through MediaCodec. However, for gstamcvideodec, color format conversion will always be a headache.

MediaCodec can decode and render directly to a surface, so an androidsink that consumes encoded frames is feasible, saving a lot of color format conversion/copying.

For gstamcvideodec, the reference java code: https://github.com/google/grafika/blob/master/src/com/android/grafika/MoviePlayer.java
i.e.:
1. configure the decoder with a surface
2. releaseOutputBuffer with second param = true (which we need to change the function gst_amc_codec_release_output_buffer, see http://cgit.freedesktop.org/gstreamer/gst-plugins-bad/tree/sys/androidmedia/gstamc.c#n1081)
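For reference, the JNI side of those two steps is roughly the following (codec, format, surface and buffer_index are just placeholders, not the actual gstamc names):

/* Rough sketch of the MediaCodec calls involved, via JNI. */
jclass codec_class = (*env)->GetObjectClass (env, codec);
jmethodID configure = (*env)->GetMethodID (env, codec_class, "configure",
    "(Landroid/media/MediaFormat;Landroid/view/Surface;Landroid/media/MediaCrypto;I)V");
jmethodID release_buffer = (*env)->GetMethodID (env, codec_class,
    "releaseOutputBuffer", "(IZ)V");

/* 1. configure the decoder with a surface */
(*env)->CallVoidMethod (env, codec, configure, format, surface, NULL, 0);
/* 2. release the output buffer with render=true so MediaCodec draws it
 *    straight onto that surface */
(*env)->CallVoidMethod (env, codec, release_buffer, buffer_index, JNI_TRUE);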

This may be somewhat of a 'playsink' on Android?

Note: this androidsink may need some extra signals, emitted if the decoder stalls (i.e. dequeuing an output buffer times out -- observed on some devices), so the client can restart the androidsink to work around this.
Comment 1 Matthieu Bouron 2014-06-04 13:36:37 UTC
I'm currently working on supporting hardware decoding in gstamcvideodec by finishing the work Andoni has started and which can be found here:

https://github.com/ylatuya/gst-plugins-bad/commits/amczerocopy

The current implementation initialises the mediacodec decoder with a surface texture.
A new kind of memory will be implemented: AMCVideoMemory which will store the android output buffer and the surface texture which is used by the decoder.

The GL_TEXTURE_UPLOAD_META will also be placed on the output buffers of amcvideodec so they can be uploaded and used by glimagesink for example. The custom GL upload function will attach the surface to a GL texture (which is of OES type) and render the buffer to it. A fragment shader should then be applied to remove black borders on the resulting image.
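The surface-to-OES-texture part essentially maps to two SurfaceTexture calls; a rough JNI sketch (surface_texture and oes_tex_id are placeholders):

/* Attach the decoder's SurfaceTexture to an OES texture in the current GL
 * context, then latch the latest decoded frame into that texture. */
jclass st_class = (*env)->GetObjectClass (env, surface_texture);
jmethodID attach = (*env)->GetMethodID (env, st_class, "attachToGLContext", "(I)V");
jmethodID update = (*env)->GetMethodID (env, st_class, "updateTexImage", "()V");

(*env)->CallVoidMethod (env, surface_texture, attach, (jint) oes_tex_id);
(*env)->CallVoidMethod (env, surface_texture, update);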

I will post a link to my WIP by the end of the day.
Comment 2 cee1 2014-06-05 04:49:29 UTC
(In reply to comment #1)
> I'm currently working on supporting hardware decoding in gstamcvideodec by
> finishing the work Andoni has started and which can be found here:
> 
> https://github.com/ylatuya/gst-plugins-bad/commits/amczerocopy
> 
> The current implementation initialises the mediacodec decoder with a surface
> texture.
Can I specify an external surface for the decoder?

BTW, I haven't tried the code yet, can it be demonstrated currently?
Comment 3 Matthieu Bouron 2014-06-05 09:26:59 UTC
Here is my development branch (WIP):
http://cgit.collabora.com/git/user/mateo/gst-plugins-bad.git/log/?h=amczerocopy-wip

Right now, you can't specify an external surface texture other than the one generated in amcvideodec, but that will be possible in the future.

Caps negotiation is fixed to video/x-raw(meta:GLTextureUploadMeta),format=RGBA in amcvideodec for development purposes. I use the following pipeline to test the code:
souphttpsrc location=http://samples.0x5c.me/sintel_trailer-480p.mp4 ! qtdemux ! h264parse ! amcviddec-omxqcomvideodecoderavc ! glimagesink

There is currently an issue on the GL side I'm still trying to fix: the generated OES texture fails to be bound in the surface's attach_to_gl_context function. See the attached logcat.
Comment 4 Matthieu Bouron 2014-06-05 09:29:06 UTC
Created attachment 277938 [details]
logcat.txt
Comment 5 Matthieu Bouron 2014-06-11 14:10:24 UTC
Created attachment 278272 [details] [review]
androidmedia: split jni code to gstjniutils.c
Comment 6 Matthieu Bouron 2014-06-11 14:11:11 UTC
Created attachment 278273 [details] [review]
[PATCH 2/3] androidmedia: WIP add direct rendering support to amcvideodec
Comment 7 Matthieu Bouron 2014-06-11 14:11:50 UTC
Created attachment 278274 [details] [review]
[PATCH 3/3] gstglupload: WIP handle OES textures in meta upload code path
Comment 8 Matthieu Bouron 2014-06-11 14:34:33 UTC
Dev branch updated:
  * caps are no longer hardcoded and some negotiation has been added. Downstream caps are queried in the set_format function before the codec is configured. If downstream caps support the GLTextureUploadMeta, video/x-raw(meta:GLTextureUploadMeta),format=RGBA is used for the decoder src caps, then gst_video_decoder_negotiate is called and the presence of the meta in the allocation query is checked in the decide_allocation method (see the sketch after this list). If the meta is found, a surface is generated and used. Biggest issue here is, if the allocation query fails, we can no longer configure the codec with a surface afterward. Note: caps negotiation fails with playbin flags=0x3 but works with flags=0x1

  * on the GL part, the EXTERNAL_OES texture is converted back to a 2D / RGBA texture using gstglconvert, but it is done through a "hack" wrapping the GL_TEXTURE_EXTERNAL_OES id in a GstGLMemory, which currently does not support that kind of memory. Is adding support for this kind of texture in GstGLMemory relevant? Or is adding a custom function dedicated to such a conversion better in gstglconvert (not using a buffer as input but an OES texture id)?
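The decide_allocation check mentioned in the first point would look roughly like this, using the stock gst_query_find_allocation_meta() helper (a sketch; the actual code in the branch may differ):

static gboolean
gst_amc_video_dec_decide_allocation (GstVideoDecoder * bdec, GstQuery * query)
{
  GstAmcVideoDec *self = GST_AMC_VIDEO_DEC (bdec);

  /* Only configure the codec with a Surface if downstream advertised
   * GstVideoGLTextureUploadMeta in the allocation query */
  self->use_surface = gst_query_find_allocation_meta (query,
      GST_VIDEO_GL_TEXTURE_UPLOAD_META_API_TYPE, NULL);

  return GST_VIDEO_DECODER_CLASS (parent_class)->decide_allocation (bdec, query);
}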

Comments welcome :)
Comment 9 Matthieu Bouron 2014-06-12 12:10:23 UTC
(In reply to comment #8)
> Dev branch updated:
>   * caps are no longer hardcoded and some negotiation have been added.
> Downstream caps are queried in the set_format function before the codec is
> configured. If downstream caps supports the GLTextureUploadMeta,
> video/x-raw(meta:GLTextureUploadMeta),format=RGBA is used for the decoder src
> caps, then gst_video_decoder_negotiate is called and the presence of the meta
> in the allocation query is checked in the decide_allocation method. If the meta
> is found, a surface is generated and used. Biggest issue here is, if the query
> allocation fails, we can no longer configure the codec with a surface
> afterward. Note : caps negotiation fails with playbin flags=0x3 but work with
> flags=0x1
> 
>   * on the GL part, the EXTERNAL_OES texture is converted back to a 2D / RGBA
> texture using gstglconvert but it is done through a "hack" wrapping the
> GL_TEXTURE_EXTERNAL_OES id to GstGlMemory that currently does not support that
> kind of memory. Is adding support in the GstGLMemory relevant for this kind of
> texture ? Or adding a custom function dedicated for such conversion better in
> gstglconvert (not using a buffer as input but an oes texture id ?

After some discussion on IRC, it turns out that adding a helper in gstglupload to convert the external OES texture to a 2D / RGBA one is the best solution for now.

> 
> Comments welcome :)
Comment 10 Matthieu Bouron 2014-06-13 15:34:37 UTC
Created attachment 278417 [details] [review]
[PATCH 2/3] androidmedia: WIP add direct rendering support to amcvideodec
Comment 11 Matthieu Bouron 2014-06-13 15:35:54 UTC
Created attachment 278418 [details] [review]
[PATCH 3/3] gstglupload: WIP handle OES textures in meta upload code path
Comment 12 Matthieu Bouron 2014-06-13 15:38:43 UTC
(In reply to comment #9)
> (In reply to comment #8)
> > Dev branch updated:
> >   * caps are no longer hardcoded and some negotiation have been added.
> > Downstream caps are queried in the set_format function before the codec is
> > configured. If downstream caps supports the GLTextureUploadMeta,
> > video/x-raw(meta:GLTextureUploadMeta),format=RGBA is used for the decoder src
> > caps, then gst_video_decoder_negotiate is called and the presence of the meta
> > in the allocation query is checked in the decide_allocation method. If the meta
> > is found, a surface is generated and used. Biggest issue here is, if the query
> > allocation fails, we can no longer configure the codec with a surface
> > afterward. Note : caps negotiation fails with playbin flags=0x3 but work with
> > flags=0x1
> > 
> >   * on the GL part, the EXTERNAL_OES texture is converted back to a 2D / RGBA
> > texture using gstglconvert but it is done through a "hack" wrapping the
> > GL_TEXTURE_EXTERNAL_OES id to GstGlMemory that currently does not support that
> > kind of memory. Is adding support in the GstGLMemory relevant for this kind of
> > texture ? Or adding a custom function dedicated for such conversion better in
> > gstglconvert (not using a buffer as input but an oes texture id ?
> 
> After some discussion on IRC, It turns out that adding an helper to convert the
> external oes texture to 2D / RGBA one in gstglupload is the best solution for
> now.

Attached in new patch:
*  The texture conversion is now done in gstglupload.

> 
> > 
> > Comments welcome :)
Comment 13 Matthieu Bouron 2014-06-16 07:57:35 UTC
As described in the SurfaceTexture documentation, a transformation matrix should be queried from the surface texture and used to sample the resulting texture.

Ref: http://developer.android.com/reference/android/graphics/SurfaceTexture.htm

So we either need to make amcvideomemory/amcdirectbuffer + jniutils public, so we can check the memory type in gstglupload and call gst_amc_surface_texture_get_transformation_matrix from there and use it in the shaders.

OR

Do the whole 2D / RGBA conversion in the upload meta function. But I'm not sure if that is doable since we can't access the GL context from there.

What do you think ?
Comment 14 Matthew Waters (ystreet00) 2014-06-16 10:15:45 UTC
Both of those require one lib to depend on the other which is not really ideal.

At the hackfest, Andoni brought this up and suggested that we have a GstAffineTransformationMeta (or some other name) created so that elements can pass 4x4 matrices containing the transformation that needs to be done in order to display the image.  IIRC, all of these transformations are either flips and/or rotations with angles that are multiples of Π/2, which are reasonably simple to represent and decode with matrices.
Comment 15 Andoni Morales 2014-06-16 10:42:58 UTC
+1 for the new meta to handle the transformation matrix.

In this particular case, the transformation matrix applies a simple translation to remove grey strides introduced by the decoder with even widths.
But we don't really care whether it's a simple rotation, a flip or a translation since it's just a multiplication of the texture coordinates and the 4x4 matrix. The vertex shader should look like:

static const gchar *simple_vertex_shader_str_gles2 =
      "attribute vec4 a_position;   \n"
      "attribute vec2 a_texCoord;   \n"
      "uniform mat4 trans;          \n" 
      "varying vec2 v_texCoord;     \n"
      "void main()                  \n"
      "{                            \n"
      "   gl_Position = a_position; \n"
      "   v_texCoord = (trans * vec4(a_texCoor, 0, 1)).xy; \n"
      "}                            \n";


This can be useful also for https://bugzilla.gnome.org/show_bug.cgi?id=679522
Comment 16 Nicolas Dufresne (ndufresne) 2014-06-16 14:50:02 UTC
How would this new meta be negotiated? Are there effects or subtitle elements that should not pass this meta through? It would be nice to look ahead on this one; solving that kind of thing a few versions later has been a bit painful so far. Thanks for looking at this.
Comment 17 Matthieu Bouron 2014-06-18 10:01:51 UTC
We will have to deal with:

1) Meta and process ordering:

Let's have a buffer with the GLTextureUploadMeta, OverlayCompositionMeta and the AffineTransformationMeta.

We should have a way to know which operation should be done first. Because we might want to do:

Upload -> Composition -> Affine Transformation

OR 

Upload -> Affine Transformation -> Composition

2) In this particular case (with amcvideodec):

The transformation matrix is used for the texture sampling, which comes before any further composition. So we can have:

Upload -> Affine Transformation (used during the OES -> 2D/RGBA) based on the transformation matrix returned by the SurfaceTexture -> Composition -> Affine Transformation 2 -> [...].

Unfortunately I don't have a clear solution for this yet.
Comment 18 Nicolas Dufresne (ndufresne) 2014-06-18 14:17:12 UTC
Review of attachment 278417 [details] [review]:

Partial review. If we can split out all the style fixes and merge those already, we could shrink this patch a bit.

::: sys/androidmedia/gstamc.c
@@ +3485,3 @@
               profile =
+                  gst_amc_mpeg4_profile_to_string (type->profile_levels[j].
+                  profile);

These look like style fixes, can you split them up? They can be merged already, which would remove some noise from your branch.

::: sys/androidmedia/gstamcaudiodec.c
@@ +490,3 @@
       buffer_info.flags);
 
+  is_eos = !!(buffer_info.flags & BUFFER_FLAG_END_OF_STREAM);

Style again.

::: sys/androidmedia/gstamcvideodec.c
@@ +528,3 @@
+  if (self->use_surface) {
+    gst_format = GST_VIDEO_FORMAT_RGBA;
+  }

No need for {}

@@ +714,3 @@
       gst_util_uint64_scale (buffer_info.presentation_time_us, GST_USECOND, 1));
 
+  is_eos = !!(buffer_info.flags & BUFFER_FLAG_END_OF_STREAM);

Style.

@@ +1088,3 @@
   }
 
+  /* FIXME, TODO: negotiate better ! */

Shouldn't this be in the negotiate() virtual method to better align with the base class negotiation code? Also, define "better", so we can give you some input if needed.

@@ +1128,3 @@
+      gst_caps_unref (gl_texture_upload_caps);
+
+      GST_LOG ("Hardware decoding: %s", self->use_surface ? "enabled" : "disabled");

That trace seems ambiguous; hardware decoding is possible even when not doing direct rendering.

::: sys/androidmedia/gstamcvideomemory.c
@@ +293,3 @@
+  GstVideoGLTextureUploadMeta *meta = NULL;
+  GstVideoGLTextureType texture_types[] =
+      { GST_VIDEO_GL_TEXTURE_TYPE_OES, 0, 0, 0 };

Is it always a single texture/surface? Could we consider the other way around, which is to request a _2D/RGBA texture, and do the conversion here, in our upload() method? I think it would require a way to query the downstream GL context, which minimally means having access to the decoder src pad when upload is called (see user data in the upload meta). Would that be possible?
Comment 19 Tim-Philipp Müller 2014-06-18 14:28:29 UTC
Comment on attachment 278417 [details] [review]
[PATCH 2/3] androidmedia: WIP add direct rendering support to amcvideodec

>               profile =
>-                  gst_amc_mpeg4_profile_to_string (type->
>-                  profile_levels[j].profile);
>+                  gst_amc_mpeg4_profile_to_string (type->profile_levels[j].
>+                  profile);

GNU indent will often "ping pong" these kinds of lines between different styles, even with the same version. Ideally rewrite the code to avoid such constructs if they end up going over the end of the line (at column 80). Maybe save the lookup into a local variable or so.

>-  is_eos = ! !(buffer_info.flags & BUFFER_FLAG_END_OF_STREAM);
>+  is_eos = !!(buffer_info.flags & BUFFER_FLAG_END_OF_STREAM);

This is '!!' in old GNU indent versions and '!  !' in newer ones. Please don't change it back to the old formatting. Ideally upgrade.
Comment 20 Matthieu Bouron 2014-06-25 10:27:56 UTC
Created attachment 279203 [details] [review]
androidmedia: split jni code to gstjniutils.c

Rebased on master.
Comment 21 Matthieu Bouron 2014-06-25 10:35:08 UTC
Created attachment 279210 [details] [review]
androidmedia: add direct rendering support to amcvideodec

New version of the patch which uses the bufferpool proposed by glimagesink.

Amcvideodec takes care of rendering the surface to an external OES texture which is then converted to a 2D texture / RGBA.

There are still some issues to be addressed:

1) playback is not smooth, it looks like some previous textures are redisplayed. It's even more visible when the GL bufferpool is forced to use only 2 buffers.
Note that the whole rendering (update OES texture -> render to 2D / RGBA) is done within 2ms to 6ms.

2) Using this code under android tutorial-5 (and maybe 4) leads to an EGL_BAD_DISPLAY error. It looks like some GL resources are not released properly.

3) The caps negotiation code still needs to be cleaned up, improved and split properly from the set_caps function.
Comment 22 Andoni Morales 2014-06-25 10:46:19 UTC
(In reply to comment #21)
> Created an attachment (id=279210) [details] [review]
> androidmedia: add direct rendering support to amcvideodec
> 
> New version of the patch which uses the bufferpool proposed by glimagesink.
> 
> Amcvideodec takes care of rendering the surface to an external OES texture
> which is then converted to a 2D texture / RGBA.
> 
> There is still some issue to be addressed:
> 
> 1) playback is not smooth, it looks like some previous texture are redisplayed.
> It's even more visible when the gl bufferpool is forced to use only 2 buffers.
> Note that the whole rendering (update OES texture -> render to 2D / RGBA) is
> done within 2ms to 6ms.

The surface texture notifies when the frame is ready with OnFrameAvailableListener, so you need to do the texture copy in this callback. 
http://developer.android.com/reference/android/graphics/SurfaceTexture.OnFrameAvailableListener.html

Adding an event listener with JNI is tricky, at least I don't know an easy way of doing it. You will need a subclass of SurfaceTexture in GStreamer.java that listens to OnFrameAvailableListener and calls the real C code callback that copies the texture from there.
Comment 23 Andoni Morales 2014-06-25 11:01:05 UTC
Something like:

import android.graphics.SurfaceTexture;
import android.graphics.SurfaceTexture.OnFrameAvailableListener;

public class GstAmcVideodecOnFrameAvailableListener implements OnFrameAvailableListener
{
  native void gst_amc_video_dec_frame_available (SurfaceTexture surfaceTexture);

  private synchronized void emitFrameReady (SurfaceTexture surfaceTexture)
  {
    gst_amc_video_dec_frame_available (surfaceTexture);
  }

  @Override
  public void onFrameAvailable(SurfaceTexture surfaceTexture)
  {
    emitFrameReady (surfaceTexture);
  }
}

This makes it a bit more complicated to use the element. When the element is created, you will need to try to create an instance of GstAmcVideodecOnFrameAvailableListener and, if that fails, disable direct rendering.

Another option is to delay rendering until the next frame is decoded and do it right before decoding the next frame, introducing an extra latency of 1/fps.
Comment 24 Sebastian Dröge (slomo) 2014-06-25 11:23:02 UTC
And some info for integrating Java code into the plugin. You can use dalvik.system.DexClassLoader via JNI to load a .dex or .jar file at runtime, which then contains your Java code.

And the .dex/.jar file can be created by
- Compiling with javac at build time, then running dex
- Converting the result into a C source file that is linked into the plugin
- During plugin_init() you write the .dex file to a temporary directory and tell the DexClassLoader about it
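A rough JNI sketch of that last step (error handling omitted; parent_loader, the paths and the listener class name are placeholders):

/* Load a class from a .dex file that was written out at runtime, using
 * dalvik.system.DexClassLoader. */
jclass loader_class = (*env)->FindClass (env, "dalvik/system/DexClassLoader");
jmethodID ctor = (*env)->GetMethodID (env, loader_class, "<init>",
    "(Ljava/lang/String;Ljava/lang/String;Ljava/lang/String;Ljava/lang/ClassLoader;)V");
jmethodID load_class = (*env)->GetMethodID (env, loader_class, "loadClass",
    "(Ljava/lang/String;)Ljava/lang/Class;");

jstring dex_path = (*env)->NewStringUTF (env, "/path/to/app/cache/gstamc.dex");
jstring opt_dir = (*env)->NewStringUTF (env, "/path/to/app/cache");
jobject dex_loader = (*env)->NewObject (env, loader_class, ctor, dex_path,
    opt_dir, NULL, parent_loader);

jstring name = (*env)->NewStringUTF (env,
    "org.freedesktop.gstreamer.androidmedia.GstAmcOnFrameAvailableListener");
jclass listener_class = (*env)->CallObjectMethod (env, dex_loader, load_class, name);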

Done \o/
Comment 25 Sebastian Dröge (slomo) 2014-10-02 07:24:57 UTC
*** Bug 735778 has been marked as a duplicate of this bug. ***
Comment 26 Sebastian Dröge (slomo) 2014-10-08 07:36:48 UTC
*** Bug 727511 has been marked as a duplicate of this bug. ***
Comment 27 Sebastian Dröge (slomo) 2014-12-02 08:53:41 UTC
*** Bug 741010 has been marked as a duplicate of this bug. ***
Comment 28 Edward Hervey 2015-02-12 15:43:48 UTC
Gave a stab at rebasing/fixing the initial patches.

The result is here : http://cgit.freedesktop.org/~bilboed/gst-plugins-bad/log/?h=amczerocopy-gl-wip

I'm not observing the issues that Mathieu described in comment #21 though.

Finally, this won't work out of the box because of #742924
Comment 29 Matthieu Bouron 2015-03-30 11:35:43 UTC
Here is an updated development branch:
http://cgit.collabora.com/git/user/mateo/gst-plugins-bad.git/log/?h=amczerocopy-gl-wip2

The SurfaceTexture.OnFrameAvailableListener is implemented and the rendering logic has been updated accordingly; it follows this workflow:
  1) get the output buffer from the decoder,
  2) render it onto the surface using ReleaseOutputBuffer(TRUE)
  3) wait for the frame to be available out of the surface using the newly implemented listener
  4) convert the related OES texture to a 2D texture and push it downstream
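In rough C terms this looks like the following sketch (the surface_lock/surface_cond/frame_available fields and the two helper calls are illustrative, not necessarily the names used in the branch):

/* Native method bound to the Java OnFrameAvailableListener (step 3);
 * user_data is the decoder instance passed around as a jlong handle. */
static void
gst_amc_video_dec_on_frame_available (JNIEnv * env, jobject thiz, jlong user_data)
{
  GstAmcVideoDec *self = (GstAmcVideoDec *) (guintptr) user_data;

  g_mutex_lock (&self->surface_lock);
  self->frame_available = TRUE;
  g_cond_signal (&self->surface_cond);
  g_mutex_unlock (&self->surface_lock);
}

/* In the output loop (steps 2-4) */
release_output_buffer_to_surface (self, idx);   /* i.e. releaseOutputBuffer (idx, TRUE) */

g_mutex_lock (&self->surface_lock);
while (!self->frame_available)
  g_cond_wait (&self->surface_cond, &self->surface_lock);
self->frame_available = FALSE;
g_mutex_unlock (&self->surface_lock);

convert_oes_texture_and_push (self);            /* OES -> 2D texture, then push downstream */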

There are new dependencies for the androidmedia plugins to be built:
  * javac, xxd, dx (from the Android SDK, which must be present in $PATH) and the Android SDK itself

The android SDK root path can be specified using --with-android-sdk-root=path at configure time. It currently defaults to the env variable $ANDROID_SDK_ROOT.

The android SDK version to use can be specified using --with-android-target=target at configure time. It currently defaults to android-16.
The corresponding android SDK jar will be resolved at $with_android_sdk_target/platforms/$with_android_target/android.jar

The code still needs to be polished though.
Comment 30 Edward Hervey 2015-04-02 12:07:42 UTC
As mentioned on IRC, it would be best to *not* compile that small Java interface implementation in gst-plugins-bad.

Instead we should let the application take care of that java part.

1) Emit a message on the bus when the element needs a framelistener with a given surfacetexture (and provide the surfacetexture)
2) Have a property on the element where the application tells it has created a framelistener
2.1) If the application doesn't handle it synchronously, the plugin could switch to the non-surface technique
3) Have an action signal on the element which the application JNI can call whenever the framelistener callback gets called

void *user_function (GstElement *element, jobject surfacetexture);
Comment 31 Sebastian Dröge (slomo) 2015-04-02 13:31:32 UTC
I disagree, we don't really want all applications to have to take care of that. It has to work out of the box :)

And for the two camera sources that currently exist there's exactly the same problem, they also provide that Java interface themselves. And it hasn't been a problem there at all yet, it just works.


Why do you want the application to take care of that? It should just have to put a glimagesink into the pipeline and set a proper window handle on it, and then it should work IMHO
Comment 32 Sebastian Dröge (slomo) 2015-04-03 03:45:09 UTC
After discussing with Edward earlier, I think the consensus is that it would be nicer not to do this awful bundling with the plugin, and then writing it to a file, etc. :) But also to not require applications to manually include some Java code.

So the remaining idea is that this element (and the camera element I'm currently working on merging) would just install the .java file somewhere (share/gst-android/java/$plugin or something), and cerbero would then collect that during packaging. Now when an application is built, ndk-build would copy those Java files into the application sources if the specific plugin is included in the build... and they would then just be built into the application automatically. Like we already do with GStreamer.java.
Comment 33 Matthieu Bouron 2015-04-03 10:50:53 UTC
Using this approach will still require the element to ask for the android.app.Activity which will provide the ClassLoader able to load our custom listener.

Do you think it is an acceptable solution ?
Comment 34 Nicolas Dufresne (ndufresne) 2015-04-03 13:44:44 UTC
(In reply to Sebastian Dröge (slomo) from comment #32)
> So the remaining idea is that this element (and the camera element I'm
> currently working on merging), would just install the .java file somewhere
> (share/gst-android/java/$plugin or something), and cerbero would then
> collect that during packaging. Now when an application is built, ndk-build
> would copy those Java files into the application sources if the specific
> plugin is included in the build... and then just built into the application
> automatically. Like we already do with the GStreamer.java.

I've been working with Qt for Android, and this approach will simply force these applications to bundle and inject Java classes themselves. In recent versions of Qt for Android, you don't write any Java code, and it's hard to add any.
Comment 35 Nicolas Dufresne (ndufresne) 2015-04-03 14:58:31 UTC
Note that in native code you can retrieve the Activity in Qt:

http://doc.qt.io/qt-5/qtandroid.html
Comment 36 Andoni Morales 2015-04-04 10:46:02 UTC
We already force applications to bundle some Java code, the one used for the initialization. Couldn't we use this same piece to define the class with the different listeners needed by our plugins? Or make ndk-build copy the Java files as proposed before; both would work and require no interaction from the user. In the case of the hardware decoders they can still work without that, so we can post a warning to inform the user if the listener isn't found, and error out for the camera plugin.
Comment 37 Nicolas Dufresne (ndufresne) 2015-04-04 14:55:28 UTC
(In reply to Andoni Morales from comment #36)
> We do already force applications to bundle some java code, the one used for
> the initiallization. Couldn't we use this same piece to define the class
> with the different listeners needed by our plugins? Or make ndk-build to
> copy the Java files as proposed before, both would work and it requires no
> interaction from the user. In case of the hardware decoders they can still
> work without that, we can post a warning to inform the user if the listener
> isn't found and error out for the camera plugin.

Yes, and you are forced to re-implement that when using Qt for Android. The init part can trivially be done through GResources, and we don't really need code generation. If your app is not Java, this implementation forces users into understanding every hack in it.
Comment 38 Sebastian Dröge (slomo) 2015-04-04 18:51:01 UTC
(In reply to Andoni Morales from comment #36)
> We do already force applications to bundle some java code, the one used for
> the initiallization. Couldn't we use this same piece to define the class
> with the different listeners needed by our plugins? Or make ndk-build to
> copy the Java files as proposed before, both would work and it requires no
> interaction from the user. In case of the hardware decoders they can still
> work without that, we can post a warning to inform the user if the listener
> isn't found and error out for the camera plugin.

That was exactly what I proposed a few comments ago, yes ;)


The remark about Qt on Android is valid though, we make the life of people who write Android apps in special ways much harder. The same problem is probably there when using the Xamarin SDK.
Comment 39 Matthieu Bouron 2015-04-05 13:21:21 UTC
(In reply to Sebastian Dröge (slomo) from comment #38)
> (In reply to Andoni Morales from comment #36)
> > We do already force applications to bundle some java code, the one used for
> > the initiallization. Couldn't we use this same piece to define the class
> > with the different listeners needed by our plugins? Or make ndk-build to
> > copy the Java files as proposed before, both would work and it requires no
> > interaction from the user. In case of the hardware decoders they can still
> > work without that, we can post a warning to inform the user if the listener
> > isn't found and error out for the camera plugin.
> 
> That was exactly what I proposed a few comments ago, yes ;)

As I've asked earlier, this solution requires the activity or the activity class loader to be known by the decoder in order to load the class.

The decoder would need to emit a message of type "android-app-activity", and the application or GStreamer.java would need to handle it and provide the object through a property or maybe using a context mechanism?

I've started implementing this approach but I'd like to know if you think this is acceptable before continuing, because it requires a specific action at the application level.
The first approach (bundling bytecode, which is currently implemented), even with its downsides, seems less painful to deal with from the application point of view.
Comment 40 Sebastian Dröge (slomo) 2015-04-06 01:35:22 UTC
(In reply to Matthieu Bouron from comment #39)
> (In reply to Sebastian Dröge (slomo) from comment #38)
> > (In reply to Andoni Morales from comment #36)
> > > We do already force applications to bundle some java code, the one used for
> > > the initiallization. Couldn't we use this same piece to define the class
> > > with the different listeners needed by our plugins? Or make ndk-build to
> > > copy the Java files as proposed before, both would work and it requires no
> > > interaction from the user. In case of the hardware decoders they can still
> > > work without that, we can post a warning to inform the user if the listener
> > > isn't found and error out for the camera plugin.
> > 
> > That was exactly what I proposed a few comments ago, yes ;)
> 
> As i've asked earlier, this solution requires that the activity or the
> activity class loader to be known by the decoder in order to load the class.
> 
> The decoder would need to emit a message of type "android-app-activity", and
> the application or GStreamer.java would need to handle it and provide the
> object through a property or maybe using a context mechanism ?
> 
> I've started implementing this approach but i'd like to know if you think
> this is acceptable before continuing because it requires a specific action
> on the application level.
> The first approach (bundling bytecode and which is currently implemented),
> even with its downside seems less painful to deal with from the application
> point of view.

Why do you need the application to do anything? If the class is included in the application, shouldn't it just require the JNI FindClass? We know the name of the class and also its package (org.freedesktop.gstreamer.androidmedia.MediaCodecWhateverThingy).
Comment 41 Matthieu Bouron 2015-04-06 08:37:21 UTC
(In reply to Sebastian Dröge (slomo) from comment #40)
> (In reply to Matthieu Bouron from comment #39)
> > (In reply to Sebastian Dröge (slomo) from comment #38)
> > > (In reply to Andoni Morales from comment #36)
> > > > We do already force applications to bundle some java code, the one used for
> > > > the initiallization. Couldn't we use this same piece to define the class
> > > > with the different listeners needed by our plugins? Or make ndk-build to
> > > > copy the Java files as proposed before, both would work and it requires no
> > > > interaction from the user. In case of the hardware decoders they can still
> > > > work without that, we can post a warning to inform the user if the listener
> > > > isn't found and error out for the camera plugin.
> > > 
> > > That was exactly what I proposed a few comments ago, yes ;)
> > 
> > As i've asked earlier, this solution requires that the activity or the
> > activity class loader to be known by the decoder in order to load the class.
> > 
> > The decoder would need to emit a message of type "android-app-activity", and
> > the application or GStreamer.java would need to handle it and provide the
> > object through a property or maybe using a context mechanism ?
> > 
> > I've started implementing this approach but i'd like to know if you think
> > this is acceptable before continuing because it requires a specific action
> > on the application level.
> > The first approach (bundling bytecode and which is currently implemented),
> > even with its downside seems less painful to deal with from the application
> > point of view.
> 
> Why do you need the application to do anything? If the class is included in
> the application, shouldn't it just require the JNI FindClass? We know the
> name of the class and also its package
> (org.freedesktop.gstreamer.androidmedia.MediaCodecWhateverThingy).

It looks like the default FindClass is not able to find our class (org.freedesktop.gstreamer.GstAMCOnFrameAvailableListener in this case) as the default ClassLoader used by the JNI env is the system ClassLoader, which has no visibility over the classes bundled with the application.
Comment 42 Matthieu Bouron 2015-04-06 13:54:55 UTC
I confirm that I got it working using the activity class loader instead of the default class loader.
Comment 43 Sebastian Dröge (slomo) 2015-04-07 00:34:18 UTC
Ok, so what about storing the Activity in GStreamer.init() somewhere in a static variable, and allowing to retrieve it from there? That way the application would only need to provide it there (it does already), and not to every random plugin out there.

Still leaves the annoyance for non-Java apps. Edward, your opinions?
Comment 44 Edward Hervey 2015-04-07 07:15:03 UTC
The part about non-Java apps leaves me a bit ... dubious. How can you have non-java apps on Android *unless* you are doing "system" applications ? Any app *will* need a java loader (albeit as small as possible).

As for storing/using the activity in GStreamer.init() it makes sense imho.
Comment 45 Matthieu Bouron 2015-04-07 08:04:39 UTC
(In reply to Edward Hervey from comment #44)
> The part about non-Java apps leaves me a bit ... dubious. How can you have
> non-java apps on Android *unless* you are doing "system" applications ? Any
> app *will* need a java loader (albeit as small as possible).
> 
> As for storing/using the activity in GStreamer.init() it makes sense imho.

I'm not sure passing the activity to GStreamer.init and storing it as a static variable will really help, unless some kind of C API is provided by gstreamer-android to retrieve the activity from there, and also because the application (activity) is currently stored in CustomData.app (it is already done for all android tutorials, though the code is not shared).
Maybe I'm missing something here.

What is currently done is that the decoder emits a message "android-app-activity" and the application handles it synchronously and sets the activity using a property.
Is there a better way to achieve this ?
Comment 46 Sebastian Dröge (slomo) 2015-04-07 16:01:37 UTC
(In reply to Edward Hervey from comment #44)
> The part about non-Java apps leaves me a bit ... dubious. How can you have
> non-java apps on Android *unless* you are doing "system" applications ? Any
> app *will* need a java loader (albeit as small as possible).

Xamarin and Qt are hiding all that from the application.

(In reply to Matthieu Bouron from comment #45)
> (In reply to Edward Hervey from comment #44)
> > The part about non-Java apps leaves me a bit ... dubious. How can you have
> > non-java apps on Android *unless* you are doing "system" applications ? Any
> > app *will* need a java loader (albeit as small as possible).
> > 
> > As for storing/using the activity in GStreamer.init() it makes sense imho.
> 
> I'm not sure passing the activity to Gstreamer.init and stores it as a
> static variable will really help unless some kind of C api is provided by
> gstreamer-android to retreive the activity from there and also because the
> application (activity) is currently stored in CustomData.app (it is already
> done for all android tutorials, the code is not shared though).
> Maybe i'm missing something here.

Why can't we get org.freedesktop.gstreamer.GStreamer via JNI's FindClass() and then directly use the field from there? It's clearly loaded at this point as the application called it :)
Alternatively we could also add C API to gstreamer_android.c that gives the Activity that was stored there during init.
Comment 47 Matthieu Bouron 2015-04-10 08:52:10 UTC
(In reply to Sebastian Dröge (slomo) from comment #46)
> (In reply to Edward Hervey from comment #44)
> > The part about non-Java apps leaves me a bit ... dubious. How can you have
> > non-java apps on Android *unless* you are doing "system" applications ? Any
> > app *will* need a java loader (albeit as small as possible).
> 
> Xamarin and Qt are hiding all that from the application.
> 
> (In reply to Matthieu Bouron from comment #45)
> > (In reply to Edward Hervey from comment #44)
> > > The part about non-Java apps leaves me a bit ... dubious. How can you have
> > > non-java apps on Android *unless* you are doing "system" applications ? Any
> > > app *will* need a java loader (albeit as small as possible).
> > > 
> > > As for storing/using the activity in GStreamer.init() it makes sense imho.
> > 
> > I'm not sure passing the activity to Gstreamer.init and stores it as a
> > static variable will really help unless some kind of C api is provided by
> > gstreamer-android to retreive the activity from there and also because the
> > application (activity) is currently stored in CustomData.app (it is already
> > done for all android tutorials, the code is not shared though).
> > Maybe i'm missing something here.
> 
> Why can't we get org.freedesktop.gstreamer.GStreamer via JNI's FindClass()
> and then directly use the field from there? It's clearly loaded at this
> point as the application called it :)

It is not a matter of the class being loaded or not. It just depends on the class loader that is going to be used. The class loader used by default by the JNI env is the system class loader and it has no visibility over the classes bundled by the application. The class loader provided by the application/activity must be used instead if we want to load our classes.

So I think we need to add some kind of API at the gstreamer level on which the decoder could depend, and use it to retrieve the activity or activity class loader.

The application would be in charge of providing it through this API.
The application will also have to take care of providing a new object reference each time the JNI thread has changed as the old object reference in the C world won't be valid anymore.

What do you think ?
Which solution should we focus on ?
Comment 48 Andoni Morales 2015-04-10 15:18:43 UTC
If we don't want any user interaction, all the jniutils could be moved into a new library with this new API:

gst_jni_set_application_class_loader (jobject class_loader);

gstreamer_android.c in JNI_OnLoad retrieves the ClassLoader for the GStreamer class using Class.getClassLoader() and caches it with this new API.

Classes are now searched using the cached class loader instance instead of the env's FindClass:
CallObjectMethod(class_loader, find_class_method, "org/bla/GStreamer")
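A minimal sketch of that approach (gst_jni_set_application_class_loader is the proposed name, not existing API; an explicit env parameter is used here and error handling is omitted):

static jobject app_class_loader; /* global ref cached at init time */

void
gst_jni_set_application_class_loader (JNIEnv * env, jobject class_loader)
{
  app_class_loader = (*env)->NewGlobalRef (env, class_loader);
}

static jclass
gst_jni_find_class (JNIEnv * env, const gchar * name)
{
  jclass loader_class = (*env)->GetObjectClass (env, app_class_loader);
  jmethodID load_class = (*env)->GetMethodID (env, loader_class, "loadClass",
      "(Ljava/lang/String;)Ljava/lang/Class;");
  /* Note: loadClass() wants dots, not slashes, in the class name */
  jstring str = (*env)->NewStringUTF (env, name);
  jclass klass = (*env)->CallObjectMethod (env, app_class_loader, load_class, str);

  (*env)->DeleteLocalRef (env, str);
  (*env)->DeleteLocalRef (env, loader_class);
  return klass;
}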
Comment 49 Matthieu Bouron 2015-04-10 16:16:05 UTC
(In reply to Andoni Morales from comment #48)
> If we don't want any user interaction, all the jniutils could be moved into
> a new library with this new API:
> 
> gst_jni_set_application_class_loader (jobject class_loader);
> 
> gstreamer_android.c in JNI_OnLoad retrieves the ClassLoader for the
> GStreamer class using Class.getClassLoader() and caches it with this new API.
> 
> Classes are now searched using the cached class loader instance instead of
> the env's FindClass:
> CallObjectMethod(class_loader, find_class_method, "org/bla/GStreamer")

This is what I was thinking of, so +1 for this approach :)
Comment 50 Sebastian Dröge (slomo) 2015-04-11 16:54:41 UTC
I like that too, but why does this need app interaction at all? The app already calls GStreamer.init(context), and from there we can get the class loader.
Comment 51 Sebastian Dröge (slomo) 2015-04-11 16:56:28 UTC
Only leaves the non-Java apps problem, for which we could make the gstreamer_android.c API public and make it a gst_android_init(jobject context) that internally does all the same things. The app would still have to include some Java stuff in the build, but that shouldn't really be a problem?
Comment 52 Andoni Morales 2015-04-11 20:14:23 UTC
(In reply to Sebastian Dröge (slomo) from comment #51)
> Only leaves the non-Java apps problem, for which we could make the
> gstreamer_android.c API public and make it a gst_android_init(jobject
> context) that internally does all the same things. The app would still have
> to include some Java stuff in the build, but that shouldn't really be a
> problem?

+1, this would leave gstreamer_android.c just with the code that needs to be generated in the ndk-build process (plugins and gio modules registration) and the rest moved to the new library into the new initialization function.
Comment 53 Sebastian Dröge (slomo) 2015-04-11 20:22:44 UTC
Why do we need a library btw? Just keeping things as is for now should be fine, and adding some native API to gstreamer_android.c and a corresponding header.
Comment 54 Andoni Morales 2015-04-11 20:35:25 UTC
Sorry, I meant splitting gstreamer_android.c into plugins and gio modules registration on one side and the initialization on the other side.

The class loader needs to be shared somehow with the plugin, and if we don't want any app interaction, there needs to be a library in the middle where the app caches the class loader and the plugin can use it. Although it would be nice to be able to do it in a different way, maybe a global GstContext :)
Comment 55 Sebastian Dröge (slomo) 2015-04-11 20:41:55 UTC
The plugin and gstreamer_android.c are both linked together into libgstreamer_android.so. So we could just use the stuff directly (or dlsym()) :) Not beautiful, but adding another micro-library just for this seems a bit extreme
Comment 56 Matthieu Bouron 2015-04-13 22:16:26 UTC
Here are the two development branches:
  * cerbero/gst-android: http://cgit.collabora.com/git/user/mateo/cerbero.git/commit/?h=archlinux%2bandroid
  * gst-plugins-bad: http://cgit.collabora.com/git/user/mateo/gst-plugins-bad.git/commit/?h=amczerocopy-gl-wip3

gst-android now stores a global reference of the context and class loader and provides two methods gst_android_get_{context|class_loader}.

gst-android includes GstAmcVideoOnFrameAvailableListener.java installed by gst-plugins-bad in /usr/share/gst-android/ndk-build.

During initialization, gstjniutils dlsym()s gst_android_get_class_loader, which is then used in gst_amc_jni_get_application_class.
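Roughly, that lookup is (a sketch using GModule; only the gst_android_get_class_loader symbol name comes from the branch, the rest is illustrative):

#include <jni.h>
#include <gmodule.h>

typedef jobject (*GstAndroidGetClassLoaderFunc) (void);

static jobject
gst_amc_jni_get_application_class_loader (void)
{
  GstAndroidGetClassLoaderFunc get_class_loader = NULL;
  /* Look the symbol up in the already-loaded process image: the plugin and
   * gstreamer_android.c are linked into the same libgstreamer_android.so */
  GModule *module = g_module_open (NULL, G_MODULE_BIND_LOCAL);

  if (module == NULL)
    return NULL;

  if (!g_module_symbol (module, "gst_android_get_class_loader",
          (gpointer *) &get_class_loader)) {
    g_module_close (module);
    return NULL;
  }

  return get_class_loader ();
}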

The latest commit of the amczerocopy-gl-wip3 branch which implements this logic is kept separate for now for maintenance and will be merged later with the big direct rendering patch.
Comment 57 Sebastian Dröge (slomo) 2015-04-14 07:58:14 UTC
Looks good, now this only misses the ndk-build magic to include these .java files in the build if the plugin is included (thus install the .java file in gst-android/ndk-build/$plugin) :)
Comment 58 Matthieu Bouron 2015-04-14 10:15:36 UTC
(In reply to Sebastian Dröge (slomo) from comment #57)
> Looks good, now this only misses the ndk-build magic to include these .java
> files in the build if the plugin is included (thus install the .java file in
> gst-android/ndk-build/$plugin) :)

Branch updated, GstAmcOnFrameAvailableListener.java is only copied if androidmedia is going to be built.
Comment 59 Sebastian Dröge (slomo) 2015-04-14 10:21:34 UTC
I was more thinking of something like

foreach plugin in plugins:
  if exists("gst-android/ndk-build/$plugin"):
    cp "gst-android/ndk-build/$plugin/*.java" $dest


That way it would be kept extensible for other plugins and we don't add any plugin specific magic.
Comment 60 Matthieu Bouron 2015-04-14 11:12:48 UTC
(In reply to Sebastian Dröge (slomo) from comment #59)
> I was more thinking of something like
> 
> foreach plugin in plugins:
>   if exists("gst-android/ndk-build/$plugin"):
>     cp "gst-android/ndk-build/$plugin/*.java" $dest
> 
> 
> That way it would be kept extensible for other plugins and we don't add any
> plugin specific magic.

Branch updated. The only issue I see with the current code is if two plugins both declare a class with the same name.

But I'm not sure if it's really an issue for now.
Comment 61 Sebastian Dröge (slomo) 2015-04-14 11:32:47 UTC
Each plugin should use its own namespace, e.g. org.freedesktop.gstreamer.androidmedia
Comment 62 Matthieu Bouron 2015-04-14 13:13:55 UTC
Both branches updated accordingly: java classes are installed in /usr/share/gst-android/ndk-build/$(plugin_name) and gstreamer-1.0.mk installs them under src/org/freedesktop/gstreamer/$(plugin_name).
Comment 63 Matthieu Bouron 2015-05-01 10:18:54 UTC
Created attachment 302703 [details] [review]
[PATCH 1/9] androidmedia: Do not warn we do not support COLOR_FormatSurface
Comment 64 Matthieu Bouron 2015-05-01 10:19:21 UTC
Created attachment 302704 [details] [review]
[PATCH 2/9] androidmedia: Improve debug messages
Comment 65 Matthieu Bouron 2015-05-01 10:19:47 UTC
Created attachment 302705 [details] [review]
[PATCH 3/9] androidmedia: Improve color format debug messages
Comment 66 Matthieu Bouron 2015-05-01 10:20:15 UTC
Created attachment 302706 [details] [review]
[PATCH 4/9] androidmedia: Fix debug statement (%d for a gsize argument)
Comment 67 Matthieu Bouron 2015-05-01 10:20:42 UTC
Created attachment 302707 [details] [review]
[PATCH 5/9] androidmedia: Add support for libart and add more debugging
Comment 68 Matthieu Bouron 2015-05-01 10:21:11 UTC
Created attachment 302708 [details] [review]
[PATCH 6/9] androidmedia: Allow object to be NULL in gst_amc_jni_*unref functions
Comment 69 Matthieu Bouron 2015-05-01 10:21:37 UTC
Created attachment 302709 [details] [review]
[PATCH 7/9] androidmedia: Do not flush codec if it is not started
Comment 70 Matthieu Bouron 2015-05-01 10:22:09 UTC
Created attachment 302710 [details] [review]
[PATCH 8/9] androidmedia: Add support for direct rendering in amcvideodec
Comment 71 Matthieu Bouron 2015-05-01 10:22:34 UTC
Created attachment 302711 [details] [review]
[PATCH 9/9] androidmedia: Only support direct rendering for codec that uses unsupported color formats
Comment 72 Matthieu Bouron 2015-05-01 11:26:32 UTC
Development branch can be found at:
http://cgit.collabora.com/git/user/mateo/gst-plugins-bad.git/log/?h=amczerocopy-gl-wip3

Note that direct rendering will be disabled if amcvideodec is used under decodebin because of #742924, as the negotiation fails.
Comment 73 Olivier Crête 2015-05-01 16:52:32 UTC
Review of attachment 302708 [details] [review]:

Are unreffing NULL values something java does?!?
Comment 74 Matthieu Bouron 2015-05-01 19:09:56 UTC
(In reply to Olivier Crête from comment #73)
> Review of attachment 302708 [details] [review] [review]:
> 
> Are unreffing NULL values something java does?!?

Nope, I added the check for convenience (so the caller does not have to do the check), and I realize that it does not make much sense in the end.
Comment 75 Matthieu Bouron 2015-05-01 23:27:50 UTC
(In reply to Matthieu Bouron from comment #74)
> (In reply to Olivier Crête from comment #73)
> > Review of attachment 302708 [details] [review] [review] [review]:
> > 
> > Are unreffing NULL values something java does?!?
> 
> Nope I added the check for convenience (so the caller does not have to do
> the check) and I realize that It does not make much sense in the end.

I just took a look at gst_object_unref and it allows the object to be NULL whereas g_object_unref does not.
What is your opinion about it ?
Comment 76 Tim-Philipp Müller 2015-05-01 23:52:38 UTC
> I just took a look at gst_object_unref and it allows the object to be NULL
> whereas g_object_unref does not.

[*] Citation needed, me thinks:

void
gst_object_unref (gpointer object)
{
  g_return_if_fail (object != NULL);
  ...
}
Comment 77 Matthieu Bouron 2015-05-03 11:53:47 UTC
Created attachment 302803 [details] [review]
[PATCH 6/9] androidmedia: Allow object to be NULL in gst_amc_jni_*unref functions

Patch updated, changed the if (object) check to g_return_if_fail (object != NULL).
Comment 78 Matthieu Bouron 2015-05-03 11:55:23 UTC
Created attachment 302804 [details] [review]
[PATCH 8/9] androidmedia: Add support for direct rendering in amcvideodec

Patch updated, remove warnings when converting jlong to void* and the other way around.
Comment 79 Matthieu Bouron 2015-05-03 11:57:14 UTC
Created attachment 302805 [details] [review]
[PATCH 9/9] androidmedia: Only support direct rendering for codec that uses unsupported color formats

Patch updated, keep GST_AMC_IGNORE_UNKNOWN_COLOR_FORMATS option so the decoder can still be used if downstream does not support direct rendering and the option is enabled.
Comment 80 cee1 2015-05-04 08:13:19 UTC
If it goes down the direct rendering path, amcvideodec is not only a decoder but also a video sink, do I understand it right?
Comment 81 Matthieu Bouron 2015-05-04 10:39:50 UTC
(In reply to cee1 from comment #80)
> If goes to the direct rendering path, the amcvideodec is not only a decoder,
> but also a videosink, do I understand it right?

Well, it's not true direct rendering; the patchset allows the decoder to create its own surface and then render to GL 2D textures which are pushed downstream (if downstream supports it).
Comment 82 Matthieu Bouron 2015-05-04 11:42:06 UTC
Created attachment 302854 [details] [review]
[PATCH 8/9] androidmedia: Add support for GL output in amcvideodec

Patch updated:
  * rename commit message and replace occurrences of "direct rendering" with "GL output".
  * self->downstream_supports_dr -> self->downstream_supports_gl
Comment 83 Matthieu Bouron 2015-05-04 11:43:47 UTC
Created attachment 302855 [details] [review]
[PATCH 9/9] Only allow GL output if the decoder has unknown color formats

Patch updated:
  * Makes the commit message a bit clearer
  * Rename the direct_rendering_only attribute to gl_output_only
Comment 84 Matthieu Bouron 2015-05-19 11:36:27 UTC
Created attachment 303586 [details] [review]
[PATCH 8/9] androidmedia: Add support for GL output in amcvideodec

Patch updated:
  * fix indentation and shader variables names in gstamc2dtexturerenderer.c
Comment 85 Matthieu Bouron 2015-05-19 11:37:34 UTC
Created attachment 303587 [details] [review]
[PATCH 9/9] Only allow GL output if the decoder has unknown color formats

Patch updated:
  * reword "Codec only supports GL output but negociation failed" with "Codec only supports GL output but downstream does not"
Comment 86 Matthieu Bouron 2015-05-19 11:39:17 UTC
Development branch is now located at https://github.com/mbouron/gst-plugins-bad/tree/amczerocopy-gl
Comment 87 Matthieu Bouron 2015-05-19 12:43:56 UTC
Here is the current status of the patchset:
  * the feature is working on broadcom devices (nexus 4 and 5 on android >= 4.4)

  * GL output is not properly negotiated under decodebin (#742924) and thus the decoder falls back to main memory output if amcvideodec supports all the relevant color formats (which is unlikely, as Android 5.0 introduces new color formats such as COLOR_QCOM_FormatYVU420SemiPlanar32mMultiView). To work around this issue GST_AMC_IGNORE_UNKNOWN_COLOR_FORMATS should be set to "yes".

  * A user reported that the patchset was not working on a Tegra 2 device (whereas it was working on his Broadcom device). I suspect the issue is on the GL side but I can't tell much as I don't have this kind of device. I'm trying to debug the issue remotely but I don't know how much time it will take.
Comment 88 Mathieu Duponchelle 2015-07-22 14:33:36 UTC
I haven't been able to make that patchset work on the two different devices I tested it with, the first a 2012 nexus 7, the second a galaxy core prime (http://www.gsmarena.com/samsung_galaxy_core_prime-6716.php)

When using decodebin, decoding and playback work as glimagesink and decodebin end up "wrongly" negotiating to system memory (first attachment, on the nexus), but when trying to recreate the pipeline manually (second attachment, still on the nexus), the pipeline errors out with this message:

E/GStreamer+amc( 8420): 0:00:01.129697000 0x687e4430 gstamc.c:2036:gst_amc_color_format_info_set Unsupported color format 256
E/GStreamer+amcvideodec( 8420): 0:00:01.129798000 0x687e4430 gstamcvideodec.c:640:gst_amc_video_dec_set_src_caps:<amcvideodec-omxnvidiah264decode0> Failed to set up GstAmcColorFormatInfo
D/GStreamer+tutorial-3( 8420): 0:00:01.147285001 0x684c1200 /home/meh/devel/gst-sdk-tutorials/gst-sdk/tutorials/android-tutorial-3/jni/tutorial-3.c:88:set_ui_message Setting message to: Error received from element amcvideodec-omxnvidiah264decode0: GStreamer encountered a general supporting library error.

I initially thought this could be due to the version of Android I had on the device (4.2), but the error remained after upgrading to 4.3 then 4.4.

On the galaxy core prime, the error is different, and I see these messages :

I/OMXNodeInstance(20338): StatusFromOMX default is (0x80001001)
E/OMXNodeInstance(20338): OMX_GetExtensionIndex OMX.google.android.index.storeMetaDataInBuffers failed
I/OMXNodeInstance(20338): StatusFromOMX default is (0x80001001)
E/ACodec  (20338): [OMX.google.h264.decoder] storeMetaDataInBuffers failed w/ err -2147483648
I/ACodec  (20338): DRC Mode: Port Reconfig Mode
I/OMX     (20338): width (1280), height (720), fps (0)
I/ACodec  (20338): [OMX.google.h264.decoder] Now Loaded->Idle
I/ACodec  (20338): [OMX.google.h264.decoder] Now Idle->Executing
I/ACodec  (20338): [OMX.google.h264.decoder] Now Executing
E/SoftAVC (20338): Decoder failed: -2
E/ACodec  (20338): [OMX.google.h264.decoder] ERROR(0x80001001)
E/MediaCodec(20338): Codec reported an error. (omx error 0x80001001, internalError -2147483648)
E/GStreamer+amcvideodec(20338): 0:00:01.597871718 0x61c17a30 gstamcvideodec.c:1653:gst_amc_video_dec_handle_frame:<amcvideodec-omxgoogleh264decoder0> Failed to dequeue input buffer
E/GStreamer+amcvideodec(20338): 0:00:01.602675208 0x633a7b20 gstamcvideodec.c:799:gst_amc_video_dec_loop:<amcvideodec-omxgoogleh264decoder0> Failure dequeueing output buffer
D/GStreamer+tutorial-3(20338): 0:00:01.605207864 0x61c2d920 /home/meh/devel/gst-sdk-tutorials/gst-sdk/tutorials/android-tutorial-3/jni/tutorial-3.c:88:set_ui_message Setting message to: Error received from element amcvideodec-omxgoogleh264decoder0: GStreamer encountered a general supporting library error.
E/GStreamer+amcvideodec(20338): 0:00:01.606267551 0x61c17a30 gstamcvideodec.c:1858:gst_amc_video_dec_drain:<amcvideodec-omxgoogleh264decoder0> Invalid input buffer index -2147483648 of 8

I have investigated a bit more, and using "grafika" (https://github.com/google/grafika/tree/master/src/com/android/grafika) I was able to play video with the releaseOutputBuffer (do_render=True) technique, in a surface, with the same mp4/h264 video I was using for my testing. On the nexus, the color format is also 256 after their call to getOutputFormat, but that does not seem to be a problem, and it gets correctly rendered in a surface.

As this patchset would break playback on both these devices under decodebin once #742924 is fixed, I think it isn't ready for merging. Do these errors mean anything to you, Mathieu?
Comment 89 Mathieu Duponchelle 2015-07-22 14:35:27 UTC
Created attachment 307910 [details]
Svg of the decodebin pipeline on the nexus, using System Memory
Comment 90 Mathieu Duponchelle 2015-07-22 14:36:34 UTC
Created attachment 307911 [details]
Svg of the failing manually constructed pipeline, on the nexus
Comment 91 Matthew Waters (ystreet00) 2015-09-25 05:59:51 UTC
Rebased patches are available from https://github.com/ystreet/gst-plugins-bad/tree/amc-zerocopy

Most things seem to work in static pipelines (i.e. no playbin) already on an LG G3, although there seems to be frame corruption when using higher resolutions (e.g. 1080p), which points to some synchronisation issue/s somewhere.
Comment 92 Matthew Waters (ystreet00) 2015-10-16 13:19:33 UTC
Ok, so the updated branch https://github.com/ystreet/gst-plugins-bad/tree/amc-zerocopy along with the patch in bug 742924 makes the zerocopy rendering work for me.  The synchronization issues were most likely due to amcvideodec creating its own GL context and the lack of synchronisation with the downstream GL context.  That has since been removed.  We can add it back if anyone is partial to it.
Comment 93 Sebastian Dröge (slomo) 2015-10-19 11:51:26 UTC
The commits in the github branch look good to me, let's get this merged to give it wider testing? :) I think this also needs bug #755173 to be merged.
Comment 94 Matthew Waters (ystreet00) 2015-10-20 17:55:45 UTC
commit 70be81ad922195edba469dface08637136edd4a9
Author: Matthew Waters <matthew@centricular.com>
Date:   Fri Oct 16 07:03:06 2015 +1100

    glimagesink: create a context in NULL_READY
    
    So that it's possible for decoders et al. to request the OpenGL context
    in their READY_PAUSED transition with decodebin/playbin.

commit c322806227064db6d64cae98d5918792d2dd5400
Author: Matthew Waters <matthew@centricular.com>
Date:   Fri Oct 16 00:34:22 2015 +1100

    amcviddec: use gstcontext to retreive the OpenGL context

commit 7dbb6681a32dedc890bba1ba17ddd714e35031b0
Author: Matthieu Bouron <matthieu.bouron@collabora.com>
Date:   Mon Apr 20 13:46:58 2015 +0200

    androidmedia: Only allow GL output if the decoder has unknown color formats
    
    If GST_AMC_IGNORE_UNKNOWN_COLOR_FORMATS is set to yes, non-GL output
    is still allowed.
    
    https://bugzilla.gnome.org/show_bug.cgi?id=731204

commit 43b63f304d3ee6ab9b89940c34574b40878a4915
Author: Matthieu Bouron <matthieu.bouron@collabora.com>
Date:   Thu Jun 5 10:33:56 2014 +0200

    androidmedia: Add support for GL output in amcvideodec
    
    https://bugzilla.gnome.org/show_bug.cgi?id=731204

commit 45e287840d3f325e96fe01f4ceea799936532529
Author: Matthieu Bouron <matthieu.bouron@collabora.com>
Date:   Thu Apr 30 12:33:58 2015 +0200

    androidmedia: Do not flush codec if it is not started

commit 6ca0be038a46d43b0b1a39ac56ea9c6bf1f420a3
Author: Matthieu Bouron <matthieu.bouron@gmail.com>
Date:   Mon Apr 13 13:10:10 2015 +0200

    androidmedia: Allow object to be NULL in gst_amc_jni_*unref functions

commit 88cd44fbd899cc2f900a0a0e840106921ffc9db7
Author: Edward Hervey <bilboed@bilboed.com>
Date:   Fri Mar 13 16:13:08 2015 +0100

    androidmedia: Fix debug statement (%d for a gsize argument)

commit 4ab1e66b2e778fd12f8a455c3c4378ac7173d336
Author: Matthieu Bouron <matthieu.bouron@gmail.com>
Date:   Thu Apr 2 16:28:14 2015 +0200

    androidmedia: Improve color format debug messages

commit 8c46a7704a6cae00dd32ecbb880a312545558317
Author: Matthieu Bouron <matthieu.bouron@gmail.com>
Date:   Tue Mar 31 16:24:40 2015 +0200

    androidmedia: Improve debug messages

commit a038478f1b1dfdf9f8dc4a02d6b47d954a9fcf9f
Author: Matthieu Bouron <matthieu.bouron@gmail.com>
Date:   Tue Mar 31 17:48:59 2015 +0200

    androidmedia: Do not warn we do not support COLOR_FormatSurface