GStreamer freeze when using qtmux and NVIDIA-accelerated H.264/H.265 encoding

I am writing a GStreamer pipeline using the C API, and I have encountered a bug when saving a video file. I had previously tested the following pipeline successfully using gst-launch-1.0:

gst-launch-1.0 nvarguscamerasrc sensor-id=0 ! 'video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, framerate=(fraction)30/1' ! nvvidconv ! nvv4l2h265enc ! h265parse ! qtmux ! filesink location=test_h265.mp4 -e

However, when replicating this using the C API, the output MP4 file is always only 36 bytes. When I looked closer and added a tee to check whether the stream was running after launch, it turned out the stream was freezing after the first frame. Running with GStreamer debug warnings enabled, I get the following:

0:00:00.667231780 24333   0x7f780028f0 WARN                   qtmux gstqtmux.c:4553:gst_qt_mux_add_buffer:<qtmux0> error: Buffer has no PTS.
0:00:00.731910937 24333   0x55c2b08230 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<nvarguscamerasrc0> error: Internal data stream error.
0:00:00.731998329 24333   0x55c2b08230 WARN                 basesrc gstbasesrc.c:3055:gst_base_src_loop:<nvarguscamerasrc0> error: streaming stopped, reason error (-5)
0:00:00.732112728 24333   0x55c2b08230 WARN                   queue gstqueue.c:988:gst_queue_handle_sink_event:<logging_queue> error: Internal data stream error.
0:00:00.732146104 24333   0x55c2b08230 WARN                   queue gstqueue.c:988:gst_queue_handle_sink_event:<logging_queue> error: streaming stopped, reason error (-5)

Note that I tried both nvv4l2h265enc and nvv4l2h264enc separately, and both froze. My guess is that there is some problem in the negotiation between the h265parse element and qtmux.
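
To at least confirm whether the PTS really is gone by the time buffers leave the parser, a buffer probe on the parser's src pad should show it. The following is only a debugging sketch, not a fix: it assumes the parser is given name=parse in the pipeline description, and check_pts_probe/attach_pts_probe are just placeholder names.

#include <gst/gst.h>

/* Debugging sketch only: reports buffers that leave the parser's src pad
 * without a valid PTS, i.e. exactly what qtmux would later complain about. */
static GstPadProbeReturn
check_pts_probe (GstPad *pad, GstPadProbeInfo *info, gpointer user_data)
{
    (void) user_data;
    GstBuffer *buf = GST_PAD_PROBE_INFO_BUFFER (info);
    if (buf != NULL && !GST_BUFFER_PTS_IS_VALID (buf))
        g_print ("Buffer without PTS on pad %s\n", GST_PAD_NAME (pad));
    return GST_PAD_PROBE_OK;
}

/* Call once after building the pipeline; assumes the parser was given
 * "h265parse name=parse" (or "h264parse name=parse") in the description. */
static void
attach_pts_probe (GstElement *pipeline)
{
    GstElement *parse = gst_bin_get_by_name (GST_BIN (pipeline), "parse");
    GstPad *srcpad = gst_element_get_static_pad (parse, "src");
    gst_pad_add_probe (srcpad, GST_PAD_PROBE_TYPE_BUFFER, check_pts_probe, NULL, NULL);
    gst_object_unref (srcpad);
    gst_object_unref (parse);
}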

One other thing to note: when I swap the qtmux element for avimux and set the filesink to save an .avi file, the pipeline works from the C API. So the problem appears to be specific to qtmux, but I am not sure how to get any further with it.
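
For reference, the working avimux variant is essentially the same pipeline description as my gst-launch test above, with only the muxer and the file extension changed, roughly:

nvarguscamerasrc sensor-id=0 ! video/x-raw(memory:NVMM),width=(int)1280,height=(int)720,framerate=(fraction)30/1 ! nvvidconv ! nvv4l2h265enc ! h265parse ! avimux ! filesink location=test_h265.avi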

While working around the issue above with .avi files, I have run into another problem. When shutting the pipeline down from the C API, GStreamer gets stuck changing the state of nvv4l2h265enc. The terminal hangs completely and I cannot exit the program. The NVIDIA-accelerated components seem to shut down differently than from the command line, where it does not freeze. I am not sure if it is related to this issue: https://devtalk.nvidia.com/default/topic/1057349/jetson-nano/accelerated-gstreamer-components-gstreamermm-and-closing-the-pipeline-/.
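
For context, my shutdown path is currently just a direct state change on the pipeline, with no EOS handling beforehand; roughly the following, where pipeline is my GstPipeline handle:

    // Rough sketch of my current teardown; the state change to NULL is where it hangs
    gst_element_set_state((GstElement*)pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(pipeline));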

However, I need to use GStreamer to interface with VisionWorks, so I was wondering if there is any fix to this.

Hi,
After replacing the pipeline in the following sample:
https://devtalk.nvidia.com/default/topic/1015571/jetson-tx2/what-is-maximum-video-encoding-resolution-in-pixels-/post/5253760/#5253760
with

launch_stream
<< "appsrc name=mysource ! "
<< "video/x-raw,width="<< w <<",height="<< h <<",framerate=1/1,format=I420 ! "
<< "nvvidconv ! video/x-raw(memory:NVMM),format=NV12 ! "
<< "nvv4l2h264enc ! h264parse ! qtmux ! "
<< "filesink location=a.mp4 ";

The sample runs well without hitting the freeze. Please refer to it.

Since I am using nvarguscamerasrc rather than appsrc, I need the NV12 format in the caps, not I420. However, with only that difference from the above, the a.mp4 file is still only 36 bytes. Is there a fix for this problem?

Hi,

Please try

launch_stream
<< "nvarguscamerasrc ! "
<< "video/x-raw(memory:NVMM),width=1920,height=1080,format=NV12 ! "
<< "nvv4l2h264enc ! h264parse ! qtmux ! "
<< "filesink location=a.mp4 ";

Thanks, I have a minimal example working with H.264 encoding and avimux. However, qtmux still fails to produce any readable video. I now see a larger file (a few MB), but the stream is not playable. I am using JetPack 4.2 on a TX2, if that makes any difference.

Hi,
Please refer to the attached code. Waiting for EOS is required.

#include <cstdlib>
#include <cstring>
#include <sstream>
#include <gst/gst.h>

using namespace std;

#define USE(x) ((void)(x))

static GstPipeline *gst_pipeline = nullptr;
static string launch_string;

GstClockTime usec = 1000000; // one second, expressed in microseconds for g_usleep()
static int w = 1920;
static int h = 1080;

int main(int argc, char** argv) {
    USE(argc);
    USE(argv);

    gst_init (&argc, &argv);

    // Main loop kept from the original sample; it is never actually run here
    GMainLoop *main_loop;
    main_loop = g_main_loop_new (NULL, FALSE);
    ostringstream launch_stream;

    launch_stream
    << "nvarguscamerasrc name=mysource ! "
    << "video/x-raw(memory:NVMM),width="<< w <<",height="<< h <<",framerate=30/1,format=NV12 ! "
    << "nvv4l2h264enc ! h264parse ! qtmux ! "
    << "filesink location=a.mp4 ";

    launch_string = launch_stream.str();

    g_print("Using launch string: %s\n", launch_string.c_str());

    GError *error = nullptr;
    gst_pipeline  = (GstPipeline*) gst_parse_launch(launch_string.c_str(), &error);

    if (gst_pipeline == nullptr) {
        g_print( "Failed to parse launch: %s\n", error->message);
        g_error_free(error);
        return -1;
    }
    if(error) g_error_free(error);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING);

    g_usleep(30*usec); // capture for roughly 30 seconds

    // Ask the source to emit EOS so qtmux can finalize the file (write the moov atom)
    GstElement* src = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysource");
    gst_element_send_event (src, gst_event_new_eos ());
    gst_object_unref(src);

    // Wait for the EOS message to reach the bus before shutting the pipeline down
    GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(gst_pipeline));
    GstMessage *msg = gst_bus_poll(bus, GST_MESSAGE_EOS, GST_CLOCK_TIME_NONE);
    gst_message_unref(msg);
    gst_object_unref(bus);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(gst_pipeline));
    g_main_loop_unref(main_loop);

    g_print("going to exit \n");
    return 0;
}
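
The sample can be built directly against the GStreamer development packages, assuming they are installed, with something like the following (the source file name is just an example):

g++ -Wall -o qtmux_test qtmux_test.cpp $(pkg-config --cflags --libs gstreamer-1.0)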

Thank you for your help. That was the issue. I had been modifying the nvgstcamera_capture sample to simultaneously log data and insert it into VisionWorks, but since the sample is not intended for saving data, it never sends an EOS, so I needed to add the EOS handling myself. Thanks again.