nvvidconv plugin and v4l2 camera problem :(

After all the problems I had and solved with the CSI cameras, I finally managed to make a working driver for the ov5640 camera using the V4L2 framework. The cool thing about this camera is that it can output Full HD (1920x1080) at 30 fps in UYVY format. All I want from the TX1 now is to compress the video stream and possibly stream it over the network.

All this should be fine, except that the nvvidconv plugin makes things impossible. I cannot make it work with the output from the camera.

So I verify the camera is working with this pipeline:
BOARD:

gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)60/1' ! videoconvert ! 'video/x-raw, format=(string)NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.11.10 port=5001

PC:

CAPS=...

To get CAPS, run the BOARD command with the -v option and copy the udpsink0 caps, removing the surrounding quotes.

gst-launch-1.0 --gst-debug=0 udpsrc port=5001 ! $CAPS ! rtph264depay ! h264parse ! avdec_h264 ! autovideosink sync=true
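The caps-copying step can be scripted instead of done by hand; a rough sketch (the file name sender.log is my assumption — it holds the BOARD command's -v output — and the grep pattern matches the udpsink0 sink-pad caps line, stripping the quoting and escaping):

```shell
# Pull the udpsink0 sink-pad caps out of a saved 'gst-launch-1.0 -v' log
# and strip the surrounding quotes and backslash escaping.
CAPS=$(grep -o 'udpsink0.GstPad:sink: caps = .*' sender.log \
       | head -n 1 | sed -e 's/^.*caps = //' -e 's/"//g' -e 's/\\//g')
echo "$CAPS"
```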

But this uses SW conversion, and the frame rate stays low (even though one of the cores sits at 100% usage).
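A back-of-envelope check on why one core pegs at 100% (my numbers, assuming 2 bytes/pixel for UYVY and 1.5 for NV12): videoconvert has to read and write over 200 MB/s combined, per-pixel, in software, at 1080p30:

```shell
# Memory traffic videoconvert must sustain for 1080p30 UYVY -> NV12:
echo "in:  $(( 1920 * 1080 * 2 * 30 )) B/s"      # UYVY, 2 B/px  (~124 MB/s)
echo "out: $(( 1920 * 1080 * 3 * 30 / 2 )) B/s"  # NV12, 1.5 B/px (~93 MB/s)
```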


I can also check that nvvidconv is doing something:
BOARD:

gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.11.10 port=5001

This is also working.


But when I replace videoconvert with nvvidconv I get a mysterious error:

gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! h264parse ! rtph264pay ! udpsink host=192.168.11.10 port=5001
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = "video/x-raw\(memory:NVMM\)\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ framerate\=\(fraction\)30/1\,\ width\=\(int\)1920\,\ height\=\(int\)1080\,\ interlace-mode\=\(string\)progressive\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ colorimetry\=\(string\)2:4:7:1"
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:00.092171955
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Any idea where the problem might be?

Anyone with a hint at least?

I am pasting here the debug log of the problem. My analysis so far shows that the problem happens after buffer 0 is queued for a second time (it is queued initially together with 3 more buffers and is the first one to be dequeued and processed).

0:00:00.554325662  3229    0xe1030 DEBUG          basetransform gstbasetransform.c:1795:default_copy_metadata:<nvvconv0> copying metadata
0:00:00.554415452  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:443:gst_buffer_copy_into: copy 0x1186a0 to 0xf4a3a158, offset 0-4147200/4147200
0:00:00.554551024  3229    0xe1030 DEBUG          basetransform gstbasetransform.c:2157:default_generate_output:<nvvconv0> using allocated buffer in 0x1186a0, out 0xf4a3a158
0:00:00.554659095  3229    0xe1030 DEBUG          basetransform gstbasetransform.c:2177:default_generate_output:<nvvconv0> doing non-inplace transform
0:00:00.554784250  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:1649:gst_buffer_map_range: buffer 0x1186a0, idx 0, length -1, flags 0001
0:00:00.554909457  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:212:_get_merged_memory: buffer 0x1186a0, idx 0, length 1
0:00:00.555024820  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:1649:gst_buffer_map_range: buffer 0xf4a3a158, idx 0, length -1, flags 0002
0:00:00.555141694  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:212:_get_merged_memory: buffer 0xf4a3a158, idx 0, length 1
0:00:00.555256119  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:443:gst_buffer_copy_into: copy 0x1186a0 to 0xf4a3a158, offset 0-4147200/4147200
0:00:00.555393201  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:638:_gst_buffer_dispose: release 0x1186a0 to pool 0xf4a060a0
0:00:00.555517887  3229    0xe1030 DEBUG         v4l2bufferpool gstv4l2bufferpool.c:1378:gst_v4l2_buffer_pool_release_buffer:<v4l2src0:pool:src> release buffer 0x1186a0
0:00:00.555632834  3229    0xe1030 LOG           v4l2bufferpool gstv4l2bufferpool.c:1078:gst_v4l2_buffer_pool_qbuf:<v4l2src0:pool:src> queuing buffer 0
0:00:00.555785644  3229    0xe1030 LOG            v4l2allocator gstv4l2allocator.c:1256:gst_v4l2_allocator_qbuf:<v4l2src0:pool:src:allocator> queued buffer 0 (flags 0x2003)
0:00:00.556006163  3229    0xe1030 DEBUG          basetransform gstbasetransform.c:2371:gst_base_transform_chain:<nvvconv0> we got return error
0:00:00.556109755  3229    0xe1030 LOG               GST_BUFFER gstbuffer.c:638:_gst_buffer_dispose: release 0xf4a3a158 to pool 0xf4a0b0c8
0:00:00.556230066  3229    0xe1030 DEBUG              nvvidconv gstnvvconv.c:686:gst_nv_filter_buffer_pool_release_buffer:<nvfilterbufferpool0> release_buffer
0:00:00.556410220  3229    0xe1030 LOG               bufferpool gstbufferpool.c:1218:default_release_buffer:<nvfilterbufferpool0> released buffer 0xf4a3a158 0
0:00:00.556515115  3229    0xe1030 DEBUG             GST_BUFFER gstbuffer.c:1303:gst_buffer_is_memory_range_writable: idx 0, length -1
0:00:00.556635426  3229    0xe1030 DEBUG         GST_SCHEDULING gstpad.c:4159:gst_pad_chain_data_unchecked:<nvvconv0:sink> called chainfunction &gst_base_transform_chain with buffer 0x1186a0, returned error
0:00:00.556768601  3229    0xe1030 DEBUG         GST_SCHEDULING gstpad.c:4159:gst_pad_chain_data_unchecked:<capsfilter0:sink> called chainfunction &gst_base_transform_chain with buffer 0x1186a0, returned error
0:00:00.556896100  3229    0xe1030 INFO                 basesrc gstbasesrc.c:2856:gst_base_src_loop:<v4l2src0> pausing after gst_pad_push() = error
0:00:00.557002140  3229    0xe1030 DEBUG                basesrc gstbasesrc.c:2899:gst_base_src_loop:<v4l2src0> pausing task, reason error
0:00:00.557104379  3229    0xe1030 DEBUG               GST_PADS gstpad.c:5994:gst_pad_pause_task:<v4l2src0:src> pause task
0:00:00.557210627  3229    0xe1030 DEBUG                   task gsttask.c:682:gst_task_set_state:<v4l2src0:src> Changing task 0x11e028 to state 2
0:00:00.557322553  3229    0xe1030 DEBUG              GST_EVENT gstevent.c:302:gst_event_new_custom: creating new event 0xf4a04108 eos 28174
0:00:00.557486770  3229    0xe1030 WARN                 basesrc gstbasesrc.c:2948:gst_base_src_loop:<v4l2src0> error: Internal data flow error.
0:00:00.557591508  3229    0xe1030 WARN                 basesrc gstbasesrc.c:2948:gst_base_src_loop:<v4l2src0> error: streaming task paused, reason error (-5)

Line 14 of the pasted log (basetransform's "we got return error" from nvvconv0) is where nvvidconv reports its problem.
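As a sanity check on the log: the 4147200-byte copies are exactly one 1080p UYVY frame at 2 bytes per pixel, so full frames do reach nvvidconv's sink pad before the error:

```shell
# One 1920x1080 frame in each format:
echo "UYVY: $(( 1920 * 1080 * 2 )) bytes"      # matches the 4147200 in the log
echo "NV12: $(( 1920 * 1080 * 3 / 2 )) bytes"
```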

Hello, amitev:
For nvvidconv, would you please try the following pipeline with h264 encoder:

gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test_1080p.mp4 -e

This works well on my side.

br
ChenJian

Hi ChenJian,

Thanks for the suggestion. Indeed this pipeline works; I also mentioned that in my first post. The problem comes when videotestsrc is replaced with v4l2src, with the rest of the pipeline exactly the same.

You can verify this with the vivi driver and the following pipeline:

gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test_1080p.mp4 -e

Hello, amitev
Can you tell me the SDK version you are using?

br
ChenJian

Hi,

I am using BSP version 23.2:

  • with its precompiled version of the rootfs;
  • kernel is derived from the official branch “l4t/l4t-r23.2” in order to implement my custom driver for ov5640 camera;
  • kernel compiled with “gcc-linaro-5.1-2015.08-x86_64_aarch64-linux-gnu”;
  • tried both supplied gstreamer (version 1.2.4) framework and also compiled version 1.7.91, behaviour is the same in both cases with the above mentioned error.

Hope this info is helpful :)

Hello, Amitev:
Got it. I need some time to sync the status, and will update it once it’s done.

BTW: I only have the official TX1 on-board OV5693, and I will test that sensor and the corresponding driver as well.

br
ChenJian

Thanks very much for testing it out :)

I guess the problem with the OV5693 will be the same, so making it work there will enable usage with all v4l2 sources, even the virtual vivi one.

It works on my camera board (mt9v117, 640x480 UYVY @ 30fps) with the command below:

ubuntu@tegra-ubuntu:~$ gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test_640p.mp4  -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, framerate=(fraction)30/1, width=(int)640, height=(int)480, interlace-mode=(string)progressive, pixel-aspect-ratio=(fraction)1/1
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, codec_data=(buffer)01424015030100096742403295a0280f6401000468ce3c80
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:video_0: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, codec_data=(buffer)01424015030100096742403295a0280f6401000468ce3c80
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:02.378654984
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

Hey Guys!

I am dealing with the same issue amitev describes. I am taking in 4:2:2 YUV (UYVY) 1080p30 video. I wrote a v4l2 driver for my custom source and it is receiving frames from the device without issue. As amitev mentioned, the video encoder only supports 4:2:0 (NV12), so a 4:2:2 → 4:2:0 chroma downsampling conversion must occur before the input feed and the encoder can negotiate.

I am able to use the videoconvert plugin and my full pipeline works. For example:

gst-launch-1.0  v4l2src device=/dev/video1 ! "video/x-raw, width=1920, height=1080, format=(string)UYVY, framerate=(fraction)30/1" ! videoconvert ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! filesink location=/opt/out.264

negotiates and generates encoded data. Unfortunately, I believe the SW implementation of videoconvert is the bottleneck and cannot convert the frames quickly enough, so the resulting video has a very low frame rate. Hence I want to use nvvidconv, which I'm sure will be faster.

The following pipeline using the videotestsrc source works fine:

gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink sync=true

But when i switch to my video input source via the following:

gst-launch-1.0 -v v4l2src device="/dev/video1" ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink sync=true

I get an error of:

0:00:02.515963977  3116   0x14bec0 INFO          v4l2bufferpool gstv4l2bufferpool.c:565:gst_v4l2_buffer_pool_set_config:<v4l2src0:pool:src> reducing maximum buffers to 32
0:00:02.548254456  3116   0x14bec0 WARN          v4l2bufferpool gstv4l2bufferpool.c:748:gst_v4l2_buffer_pool_start:<v4l2src0:pool:src> Uncertain or not enough buffers, enabling copy threshold
0:00:02.708747678  3116   0x14bec0 INFO                 v4l2src gstv4l2src.c:810:gst_v4l2src_create:<v4l2src0> sync to 0:00:00.033333333 out ts 0:00:01.787443478
0:00:02.708831893  3116   0x14bec0 INFO               GST_EVENT gstevent.c:760:gst_event_new_segment: creating segment event time segment start=0:00:00.000000000, offset=0:00:00.000000000, stop=99:99:99.999999999, rate=1.000000, applied_rate=1.000000, flags=0x00, time=0:00:00.000000000, base=0:00:00.000000000, position 0:00:00.000000000, duration 99:99:99.999999999
0:00:02.708950898  3116   0x14bec0 INFO                 basesrc gstbasesrc.c:2838:gst_base_src_loop:<v4l2src0> marking pending DISCONT
0:00:02.709596597  3116   0x14bec0 INFO                 basesrc gstbasesrc.c:2851:gst_base_src_loop:<v4l2src0> pausing after gst_pad_push() = error
0:00:02.709700134  3116   0x14bec0 WARN                 basesrc gstbasesrc.c:2943:gst_base_src_loop:<v4l2src0> error: Internal data flow error.
0:00:02.709755027  3116   0x14bec0 WARN                 basesrc gstbasesrc.c:2943:gst_base_src_loop:<v4l2src0> error: streaming task paused, reason error (-5)
0:00:02.709827888  3116   0x14bec0 INFO        GST_ERROR_SYSTEM gstelement.c:1837:gst_element_message_full:<v4l2src0> posting message: Internal data flow error.
0:00:02.709940123  3116   0x14bec0 INFO        GST_ERROR_SYSTEM gstelement.c:1860:gst_element_message_full:<v4l2src0> posted error message: Internal data flow error.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2943): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
Execution ended after 0:00:01.788122117
Setting pipeline to PAUSED ...

after about 1 second.

My guess is that my video data is going into the nvvidconv plugin, but nvvidconv is not generating data out. After a short period of time, nvvidconv no longer grabs frames from v4l2src and an overflow error is thrown.

I do not know why the videotestsrc data works but my data does not. Is there any way to get more information from nvvidconv as to why it isn't converting?
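The only knob I know of is GST_DEBUG; something along these lines is what I would try (the category names nvvidconv, v4l2src, v4l2bufferpool and basetransform are taken from the debug logs posted earlier in this thread):

```shell
# Raise log verbosity (level 6 = LOG) for the elements around the failure:
GST_DEBUG=nvvidconv:6,v4l2src:6,v4l2bufferpool:6,basetransform:6 \
  gst-launch-1.0 -v v4l2src device=/dev/video1 \
  ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' \
  ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' \
  ! fakesink sync=true 2> nvvidconv.log
```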

Let me know if you need any other information. Thanks!

I just looked at the data coming out of the nvvidconv pipeline I previously said worked:

gst-launch-1.0 -v videotestsrc ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1080, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! fakesink sync=true

and unfortunately it looks like the majority of it is all 0x00s (I dumped it to a filesink). Considering the other issues I am having, I am wondering if my nvvidconv isn't working at all, regardless of input.

I am using a custom kernel, so I do not know if that is a factor.

I tested this pipeline with a network stream even, and it is working - i.e. I get the colorful picture with static in the lower right corner. I have my custom compiled kernel and also a compiled gstreamer 1.7.91.

Hi again,

I am very surprised to read that nvvidconv is actually working with a v4l2 source, as nVConan claims.

From what I see, there are just two differences between your setup and what I tested:

  • using mt9v117 camera
  • using resolution of 640x480

As I do not have this camera, I tried the vivi driver with 640x480 resolution, but I get the same error.

sudo modprobe vivi
gst-launch-1.0 -v v4l2src device="/dev/video0" ! 'video/x-raw, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! fakesink  -v

@nVConan, can you please test with the vivi driver (it has to be enabled in the kernel config)? It is the driver we can all use to have the same setup.
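To be sure we are comparing the same setup, it is also worth checking what the loaded vivi device actually advertises (assuming v4l2-ctl from the v4l-utils package is installed):

```shell
# Load the virtual test driver and list the formats/sizes it exposes:
sudo modprobe vivi
v4l2-ctl -d /dev/video0 --list-formats-ext
v4l2-ctl -d /dev/video0 --get-fmt-video   # currently set capture format
```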

How about the command below?

gst-launch-1.0 v4l2src device=/dev/video0 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test.mp4 -v

Hi there,

Sorry for not bearing good news to the thread - still the same error. What I see now is that without any format specification, the v4l2 source just uses its defaults, but the format is still UYVY. Below is the verbose output when trying to open the vivi source (the situation is the same with the camera, as I said before):

Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)320\,\ height\=\(int\)200\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)interleaved\,\ colorimetry\=\(string\)bt601\,\ framerate\=\(fraction\)100/1"
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = "video/x-raw\(memory:NVMM\)\,\ width\=\(int\)320\,\ height\=\(int\)200\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)interleaved\,\ framerate\=\(fraction\)100/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = "video/x-raw\(memory:NVMM\)\,\ width\=\(int\)320\,\ height\=\(int\)200\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)interleaved\,\ framerate\=\(fraction\)100/1\,\ format\=\(string\)NV12"
Framerate set to : 100 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = "video/x-raw\(memory:NVMM\)\,\ width\=\(int\)320\,\ height\=\(int\)200\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)interleaved\,\ framerate\=\(fraction\)100/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = "video/x-raw\(memory:NVMM\)\,\ width\=\(int\)320\,\ height\=\(int\)200\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)interleaved\,\ framerate\=\(fraction\)100/1\,\ format\=\(string\)NV12"
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = "video/x-raw\,\ format\=\(string\)UYVY\,\ width\=\(int\)320\,\ height\=\(int\)200\,\ pixel-aspect-ratio\=\(fraction\)1/1\,\ interlace-mode\=\(string\)interleaved\,\ colorimetry\=\(string\)bt601\,\ framerate\=\(fraction\)100/1"
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason error (-5)
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = "video/quicktime\,\ variant\=\(string\)apple"
Execution ended after 0:00:00.007894323
Setting pipeline to PAUSED ...
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = "video/quicktime\,\ variant\=\(string\)apple"
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...

@nVConan: Once again, please, try to confirm that you can start pipeline with vivi driver opened.

Well, I seem to have been mistaken in my last post, sorry about that :(

Actually, the pipeline suggested by nVConan DOES work (with stock 1.2.4 gstreamer, as I tested it on a clean reflash), and produces the following output:

ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device=/dev/video0 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=file2.mp4 -e -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1
Framerate set to : 1000 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
NvH264MSEncInit: Frame rate overridden to 30 (frame_rate 1000.000000)
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)1000/1
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)1000/1, codec_data=(buffer)014240150301000b6742403295a01e0107e74001000468ce3c80
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:video_0: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)1000/1, codec_data=(buffer)014240150301000b6742403295a01e0107e74001000468ce3c80
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
^Chandling interrupt.
Interrupt: Stopping pipeline ...
EOS on shutdown enabled -- Forcing EOS on the pipeline
Waiting for EOS...

This I tested with the vivi driver.

So I reach the following conclusions:

  • nvvidconv doesn't work with a custom compilation of gstreamer. This was my mistake in the last post; I was hinted by another topic here that says exactly this.
  • Somehow nvvidconv forces a very strange resolution on the V4L2 device. In this case it is 1920x500, while vivi's max resolution is set to 1920x1200.
    It is interesting to note that when I modify the vivi driver to support a max resolution of 3840x2160 and start the pipeline, nvvidconv forces 3840x980. Do these numbers ring a bell for someone?
  • Changing the resolution on the input side of nvvidconv results in a pipeline that doesn't start.
  • Changing the resolution on the output side, however, is possible. I tried this one:
gst-launch-1.0 v4l2src device=/dev/video0 ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12, width=(int)640, height=(int)480' ! omxh264enc ! qtmux ! filesink location=file2.mp4 -e
  • Happiness doesn’t exist (this is only for the record).
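For the record, the raw UYVY buffer sizes (2 bytes/pixel) for the resolutions in play; no obvious common factor jumps out at me:

```shell
# UYVY buffer sizes for the normal and the forced resolutions:
for res in 1920x1200 1920x500 3840x2160 3840x980; do
  w=${res%x*}; h=${res#*x}
  echo "$res -> $(( w * h * 2 )) bytes"
done
```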

Comments, especially on the strange numbers, are more than welcome.

Hey!

I’ve been off my project for a few weeks now. Did you ever get any further in getting the nvvidconv to negotiate properly?

If I understood correctly, there are still two open side topics, right?

  1. why does it stream in 1920x500 instead of 1920x1200?
  2. why can't we set the input format when involving nvvidconv?

The strange cropping of the height probably comes from v4l2src rather than nvvidconv: when I tried the three commands below, the first and second failed while the last one succeeded.

export DISPLAY=:0
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1200' ! xvimagesink  -v
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1200' ! fakesink  -v
gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500' ! xvimagesink  -v

No matter what gst commands I used, I was not able to get 1920x1200 output streaming.
Based on the above, I then tried the command below and it worked:

ubuntu@tegra-ubuntu:~$ ./yavta -l /dev/video0 
Device /dev/video0 opened.
Device `vivi' on `platform:vivi-000' is a video capture device.
--- User Controls (class 0x00980001) ---
control 0x00980900 `Brightness' min 0 max 255 step 1 default 127 current 127.
control 0x00980901 `Contrast' min 0 max 255 step 1 default 16 current 16.
control 0x00980902 `Saturation' min 0 max 255 step 1 default 127 current 127.
control 0x00980903 `Hue' min -128 max 127 step 1 default 0 current 0.
control 0x00980905 `Volume' min 0 max 255 step 1 default 200 current 200.
control 0x00980912 `Gain, Automatic' min 0 max 1 step 1 default 1 current 1.
control 0x00980913 `Gain' min 0 max 255 step 1 default 100 current 30.
control 0x00980929 `Alpha Component' min 0 max 255 step 1 default 0 current 0.
unable to get control 0x0098f900: Permission denied (13).
control 0x0098f900 `Button' min 0 max 0 step 0 default 0 current n/a.
control 0x0098f901 `Boolean' min 0 max 1 step 1 default 1 current 1.
control 0x0098f902 `Integer 32 Bits' min -2147483648 max 2147483647 step 1 default 0 current 0.
control 0x0098f903 `Integer 64 Bits' min 0 max 0 step 0 default 0 current 0.
control 0x0098f904 `Menu' min 1 max 4 step 1 default 3 current 3.
  1: Menu Item 1
  3: Menu Item 3
  4: Menu Item 4
unable to get control 0x0098f905: No space left on device (28).
control 0x0098f905 `String' min 2 max 4 step 1 default 0 current n/a.
control 0x0098f906 `Bitmask' min 0 max -2143281136 step 0 default -2147475456 current -2147475456.
control 0x0098f907 `Integer menu' min 1 max 8 step 1 default 4 current 4.
  2: 2
  3: 3
  4: 5
  5: 8
  6: 13
  7: 21
  8: 42
16 controls found.
Video format: UYVY (59565955) 1920x500 (stride 3840) buffer size 1920000
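Those last numbers are easy to verify: UYVY is a packed 4:2:2 format at 2 bytes per pixel, and a V4L2 fourcc is just the four ASCII character codes packed little-endian. A quick cross-check in the shell:

```shell
# Cross-check the yavta line: UYVY packs 2 bytes per pixel,
# and the V4L2 fourcc is the four ASCII codes packed little-endian.
width=1920; height=500
stride=$((width * 2))          # bytes per line
bufsize=$((stride * height))   # bytes per frame
U=$(printf '%d' "'U"); Y=$(printf '%d' "'Y"); V=$(printf '%d' "'V")
fourcc=$(printf '%x' $((U | Y << 8 | V << 16 | Y << 24)))
echo "UYVY ($fourcc) ${width}x${height} (stride $stride) buffer size $bufsize"
```

which reproduces the `UYVY (59565955) 1920x500 (stride 3840) buffer size 1920000` line that yavta printed.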
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500' ! nvvidconv ! 'video/x-raw(memory:NVMM), width=(int)640, height=(int)480, framerate=(fraction)30/1, format=(string)NV12' ! nvoverlaysink -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingPipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)30/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)72/25, interlace-mode=(string)mixed, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)72/25, interlace-mode=(string)mixed, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/GstNvOverlaySink-nvoverlaysink:nvoverlaysink-nvoverlaysink0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)72/25, interlace-mode=(string)mixed, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)72/25, interlace-mode=(string)mixed, framerate=(fraction)30/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)30/1
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:05.622708494
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$
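One detail in the caps above worth noting: the `pixel-aspect-ratio=(fraction)72/25` is not an error. It appears to be nvvidconv preserving the display aspect of the 1920x500 square-pixel source while scaling into a 640x480 frame, i.e. PAR = (1920/500) / (640/480). Reducing that fraction in the shell:

```shell
# PAR needed so 640x480 displays with the same aspect as 1920x500:
# PAR = (1920/500) / (640/480) = (1920*480) / (500*640), then reduce.
num=$((1920 * 480)); den=$((500 * 640))
# Euclid's algorithm for the greatest common divisor
a=$num; b=$den
while [ "$b" -ne 0 ]; do t=$((a % b)); a=$b; b=$t; done
echo "pixel-aspect-ratio = $((num / a))/$((den / a))"
```

which comes out to 72/25, matching the negotiated caps.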

I was able to get v4l2src negotiating and streaming at 1920x1080 by setting its io-mode property to ‘rw’.

After doing that, nvvidconv negotiates fine with v4l2src on whatever caps I give it, and is able to pass the data on to the omxh264enc encoder element without any issues.

Are there any performance hits to using ‘rw’ instead of ‘auto’? (I'm assuming ‘auto’ would then select dmabuf.)
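If I understand the V4L2 I/O methods right, ‘rw’ means each frame is read() into a userspace buffer (roughly one extra copy per frame), while mmap/dmabuf hand driver buffers over without copying. A rough back-of-the-envelope for that extra copy at 1920x1080 UYVY, 30 fps (my numbers, not measured):

```shell
# Rough extra-copy cost of io-mode=rw: one read() copy per frame.
width=1920; height=1080; fps=30
frame_bytes=$((width * height * 2))        # UYVY: 2 bytes per pixel
copy_mb_s=$((frame_bytes * fps / 1000000)) # MB copied per second
echo "frame=${frame_bytes}B, extra copy ~${copy_mb_s}MB/s"
```

So on the order of 124 MB/s of extra memcpy traffic, which the TX1 should be able to absorb, but it isn't free.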

Any ideas as to why ‘rw’ would work but not the others?

Hi x1tester62,

Glad to hear that you have found a way to work it out.
But it is not clear to me which issue you were addressing; could you please post your test commands here?
Some commands I used (based on the vivi driver):

ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1200' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test.mp4 -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and Mjstreaming
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.158930014
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device=/dev/video0 io-mode=rw ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)1200' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test.mp4 -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and Mjstreaming
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Internal data flow error.
Additional debug info:
gstbasesrc.c(2865): gst_base_src_loop (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0:
streaming task paused, reason not-negotiated (-4)
Execution ended after 0:00:00.159343914
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
ubuntu@tegra-ubuntu:~$ gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! qtmux ! filesink location=test.mp4 -v
Setting pipeline to PAUSED ...
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingNfeng***nv_drm_connectors_count = 2
Nfeng***nv_drm_planes_count = 4
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1, format=(string)NV12
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)UYVY, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)mixed, framerate=(fraction)1000/1
Framerate set to : 1000 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4 
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
NvH264MSEncInit: Frame rate overridden to 30 (frame_rate 1000.000000)
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)1000/1
/GstPipeline:pipeline0/GstOMXH264Enc-omxh264enc:omxh264enc-omxh264enc0.GstPad:src: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)1000/1, codec_data=(buffer)014240150301000b6742403295a01e0107e74001000468ce3c80
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:video_0: caps = video/x-h264, alignment=(string)au, stream-format=(string)avc, width=(int)1920, height=(int)500, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)1000/1, codec_data=(buffer)014240150301000b6742403295a01e0107e74001000468ce3c80
/GstPipeline:pipeline0/GstQTMux:qtmux0.GstPad:src: caps = video/quicktime, variant=(string)apple
/GstPipeline:pipeline0/GstFileSink:filesink0.GstPad:sink: caps = video/quicktime, variant=(string)apple
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:01.711362568
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...