Video mapping on Jetson TX1

Hello everyone,

I am trying to write a simple video-mapping application that reads a video from a file, maps it onto a predefined shape, and displays the result with a video projector.
For now, I can read the video file with GStreamer 1.0 (v1.8) and retrieve the video frames in OpenGL thanks to the glimagesink element. But with a 4K video I can’t reach 60 FPS: the video is capped at 30-35 FPS. It seems to be due to the YUV to RGB conversion (done by the glcolorconvert element) in the GStreamer pipeline.
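To give a sense of the per-pixel cost involved, here is a sketch in Python of the YUV-to-RGB formula (approximate BT.601 video-range coefficients; glcolorconvert actually does this in a GLSL shader, and the exact coefficients depend on the stream’s colorimetry, so treat this as illustrative only):

```python
def yuv_to_rgb(y, u, v):
    """Convert one video-range (16-235) YUV sample to 8-bit RGB.

    Approximate BT.601 coefficients. A 4K stream at 60 FPS needs this
    done for roughly 3840 * 2160 * 60 ~= 500 million pixels per second.
    """
    c, d, e = y - 16, u - 128, v - 128
    clamp = lambda x: max(0, min(255, round(x)))
    return (clamp(1.164 * c + 1.596 * e),
            clamp(1.164 * c - 0.392 * d - 0.813 * e),
            clamp(1.164 * c + 2.017 * d))

# Nominal black and white convert as expected:
# yuv_to_rgb(16, 128, 128)  -> (0, 0, 0)
# yuv_to_rgb(235, 128, 128) -> (255, 255, 255)
```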

I am completely new to this, so my question is: is there a better way to do what I want (maybe avoiding OpenGL, which needs RGB frames, and using some of NVIDIA’s tools instead)? Is it even possible on the Jetson TX1?

Any suggestion will be appreciated. Thanks in advance,

Clement

I do not know if that is your bottleneck, but if it is, have you tried using nvvidconv instead of glcolorconvert? It should be more efficient.

Thanks for your reply,

I have tried a pipeline like: filesrc location=<my_movie.mp4> ! qtdemux ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw, format=(string)RGBA' ! glimagesink but it ends with:

ERROR: from element /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0: Internal data stream error.
Additional debug info:
/dvs/git/dirty/git-master_linux/external/gstreamer/gst-omx/omx/gstomxvideodec.c(2854): gst_omx_video_dec_loop (): /GstPipeline:pipeline0/GstOMXH264Dec-omxh264dec:omxh264dec-omxh264dec0:
stream stopped, reason error
ERROR: pipeline doesn't want to preroll.
Setting pipeline to NULL ...

I don’t understand: 'video/x-raw' and 'format=(string)RGBA' are valid capabilities for both glimagesink’s sink pad and nvvidconv’s source pad.

Hi Clement,
Please try the following command. You should be able to get a 'video/x-raw(memory:EGLImage),format=(string)RGBA' buffer after nvegltransform:

export DISPLAY=:0

gst-launch-1.0 filesrc location=~/b.mp4 ! qtdemux ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw(memory:NVMM),format=(string)NV12' ! nvegltransform ! 'video/x-raw(memory:EGLImage),format=(string)RGBA' ! nveglglessink window-x=100 window-y=100

Hi DaneLLL,

Thanks, the color conversion works like a charm. But with nvegltransform, can I still use the glimagesink element to map the video frames as I did before?
For now, when I replace nveglglessink with glimagesink, I end up with these errors:

0:00:01.331795592 16364       0x560680 ERROR         eglimagememory steglimagememory.c:721:gst_egl_image_memory_setup_buffer: Failed to create EGLImage
0:00:01.332084441 16364   0x7f680049e0 INFO            videodecoder gstvideodecoder.c:1334:gst_video_decoder_sink_event_default:<omxh264dec-omxh264dec0> upstream tags: taglist, video-codec=(string)"H.264\ \(High\ Profile\)", maximum-bitrate=(uint)35117824, bitrate=(uint)8002648, minimum-bitrate=(uint)267840;
0:00:01.332407925 16364       0x560680 WARN            glbufferpool gstglbufferpool.c:319:gst_gl_buffer_pool_alloc:<glbufferpool1> Could not create EGLImage Memory
0:00:01.332599381 16364       0x560680 WARN              bufferpool gstbufferpool.c:300:do_alloc_buffer:<glbufferpool1> alloc function failed
0:00:01.332639484 16364       0x560680 WARN              bufferpool gstbufferpool.c:333:default_start:<glbufferpool1> failed to allocate buffer
0:00:01.332680734 16364       0x560680 ERROR             bufferpool gstbufferpool.c:531:gst_buffer_pool_set_active:<glbufferpool1> start failed
0:00:01.332714014 16364       0x560680 WARN           basetransform gstbasetransform.c:1725:default_prepare_output_buffer:<egltransform> error: failed to activate bufferpool
0:00:01.332753389 16364       0x560680 WARN           basetransform gstbasetransform.c:1725:default_prepare_output_buffer:<egltransform> error: failed to activate bufferpool

Hi Clement,
nvegltransform outputs GstEGLImageMemory, which does not seem to be supported by glimagesink.
Is it possible to use nveglglessink in your use case? Could you share the gst command line of your use case?

Hi,

Once the video frames have been decoded, I want to map them onto a model. With OpenGL it is easy to use the video frames as a texture, and glimagesink has a “client-draw” signal that I can use to trigger the mapping callback. So my use case was:
gst-launch-1.0 filesrc location=<my_file.mp4> ! qtdemux ! h264parse ! omxh264dec ! glimagesink
but performance was not so good (the FPS drops after the glcolorconvert element inside glimagesink).
nvvidconv and nvegltransform solve the color-conversion problem, but I still need something to modify the video frames, and I don’t know if I can use nveglglessink for that purpose.
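For reference, the mapping itself boils down to texture-coordinate interpolation over the target shape. A minimal sketch in plain Python of the idea (the real work happens in the GL vertex/fragment stages inside the client-draw callback, so this is only the geometry, not the actual GL code):

```python
def bilinear_map(u, v, p00, p10, p01, p11):
    """Map texture coordinates (u, v) in [0, 1]^2 onto a quad whose
    corners are p00 (u=0,v=0), p10 (u=1,v=0), p01 (u=0,v=1), p11 (u=1,v=1).

    Texturing a warped quad effectively evaluates this for every pixel,
    which is why the GPU is the right place to do the mapping.
    """
    w00, w10, w01, w11 = (1 - u) * (1 - v), u * (1 - v), (1 - u) * v, u * v
    x = w00 * p00[0] + w10 * p10[0] + w01 * p01[0] + w11 * p11[0]
    y = w00 * p00[1] + w10 * p10[1] + w01 * p01[1] + w11 * p11[1]
    return x, y
```

For example, with an axis-aligned quad the corners of the texture land exactly on the corners of the shape, and interior points are blended between them.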

In the “Pad Templates” section there is “video/x-raw(memory:EGLImage)”. Does that mean glimagesink should support GstEGLImageMemory? Here is the output of gst-inspect-1.0 glimagesink:

root@tegra-ubuntu:~# /opt/gstreamer/bin/gst-inspect-1.0 glimagesink
Factory Details:
  Rank                     secondary (128)
  Long-name                GL Sink Bin
  Klass                    Sink/Video
  Description              Infrastructure to process GL textures
  Author                   Matthew Waters <matthew@centricular.com>

Plugin Details:
  Name                     opengl
  Description              OpenGL plugin
  Filename                 /opt/gstreamer/lib/aarch64-linux-gnu/gstreamer-1.0/libgstopengl.so
  Version                  1.8.3
  License                  LGPL
  Source module            gst-plugins-bad
  Source release date      2016-08-19
  Binary package           GStreamer Bad Plug-ins source release
  Origin URL               Unknown package origin

GObject
 +----GInitiallyUnowned
       +----GstObject
             +----GstElement
                   +----GstBin
                         +----GstGLSinkBin
                               +----GstGLImageSinkBin

Implemented Interfaces:
  GstChildProxy
  GstVideoOverlay
  GstNavigation
  GstColorBalance

Pad Templates:
  SINK template: 'sink'
    Availability: Always
    Capabilities:
      video/x-raw(memory:GLMemory, meta:GstVideoOverlayComposition)
                 format: { RGBA, BGRA, RGBx, BGRx, ARGB, ABGR, xRGB, xBGR, RGB, BGR, RGB16, BGR16, AYUV, I420, YV12, NV12, NV21, YUY2, UYVY, Y41B, Y42B, Y444, GRAY8, GRAY16_LE, GRAY16_BE }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(memory:EGLImage, meta:GstVideoOverlayComposition)
                 format: RGBA
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(memory:SystemMemory, meta:GstVideoOverlayComposition)
                 format: { RGBA, BGRA, RGBx, BGRx, ARGB, ABGR, xRGB, xBGR, RGB, BGR, RGB16, BGR16, AYUV, I420, YV12, NV12, NV21, YUY2, UYVY, Y41B, Y42B, Y444, GRAY8, GRAY16_LE, GRAY16_BE }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(meta:GstVideoGLTextureUploadMeta, meta:GstVideoOverlayComposition)
                 format: RGBA
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(memory:GLMemory)
                 format: { RGBA, BGRA, RGBx, BGRx, ARGB, ABGR, xRGB, xBGR, RGB, BGR, RGB16, BGR16, AYUV, I420, YV12, NV12, NV21, YUY2, UYVY, Y41B, Y42B, Y444, GRAY8, GRAY16_LE, GRAY16_BE }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(memory:EGLImage)
                 format: RGBA
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw
                 format: { RGBA, BGRA, RGBx, BGRx, ARGB, ABGR, xRGB, xBGR, RGB, BGR, RGB16, BGR16, AYUV, I420, YV12, NV12, NV21, YUY2, UYVY, Y41B, Y42B, Y444, GRAY8, GRAY16_LE, GRAY16_BE }
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]
      video/x-raw(meta:GstVideoGLTextureUploadMeta)
                 format: RGBA
                  width: [ 1, 2147483647 ]
                 height: [ 1, 2147483647 ]
              framerate: [ 0/1, 2147483647/1 ]

Hi Clement,
Currently glimagesink is not supported with the gst-omx plugins; nveglglessink is the default sink plugin for the EGLImage-based accelerated render path.

For your case, we suggest running
gst-launch-1.0 filesrc location=<filename.mp4> ! qtdemux name=demux ! h264parse ! omxh264dec ! nvivafilter customer-lib-name=libnvsample_cudaprocess.so cuda-process=true post-process=true ! 'video/x-raw(memory:NVMM),format=(string)RGBA' ! nvegltransform ! nveglglessink -e
or
gst-launch-1.0 filesrc location=<filename.mp4> ! qtdemux name=demux ! h264parse ! omxh264dec ! nvivafilter customer-lib-name=libnvsample_cudaprocess.so cuda-process=true post-process=true ! 'video/x-raw(memory:NVMM),format=(string)RGBA' ! nvoverlaysink display-id=1 -e

The above pipelines support both colorspace conversion and CPU/GPU access for any pre-/post-/CUDA operation on the decoded frames.

Please refer to the following link for details:

https://developer.nvidia.com/embedded/dlc/l4t-24-2-sources

Hi Clement,

In addition to last comment from Dane,

You can refer to “nvsample_cudaprocess_src.tbz2” (in the l4t-24-2-sources package) for a sample implementation of libnvsample_cudaprocess.so.
The sample pre/post/CUDA operations can be replaced with your custom implementation on the decoded frame. (Refer to nvsample_cudaprocess_README.txt for the interface API details.)

-Regards

Adding a tee to the pipeline should work:
filesrc location=<my_movie.mp4> ! qtdemux ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw, format=(string)RGBA' ! tee ! glimagesink