Gstreamer decode live video stream with the delay difference between gst-launch-1.0 command and appsink callback

Hello everyone!
I have encountered a confusing problem on a Jetson TX1. Decoding works well (real-time, 25 fps) when I run this gst-launch-1.0 command in a terminal:

nvidia@tegra-ubuntu:~$ gst-launch-1.0 rtspsrc location=rtsp://admin:admin12345@192.168.0.64:554/Streaming/Channels/101?transportmode=unicastprofile=Profile_1 protocols=tcp latency=0 ! decodebin ! videoconvert ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://admin:admin12345@192.168.0.64:554/Streaming/Channels/101?transportmode=unicastprofile=Profile_1
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (request) SETUP stream 1
Progress: (open) Opened Stream
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingNvMMLiteOpen : Block : BlockType = 261 
TVMR: NvMMLiteTVMRDecBlockOpen: 7818: NvMMLiteBlockOpen 
NvMMLiteBlockCreate : Block : BlockType = 261 
TVMR: cbBeginSequence: 1190: BeginSequence  1920x1088, bVPR = 0
TVMR: LowCorner Frequency = 180000 
TVMR: cbBeginSequence: 1583: DecodeBuffers = 5, pnvsi->eCodec = 4, codec = 0 
TVMR: cbBeginSequence: 1654: Display Resolution : (1920x1080) 
TVMR: cbBeginSequence: 1655: Display Aspect Ratio : (1920x1080) 
TVMR: cbBeginSequence: 1697: ColorFormat : 5 
TVMR: cbBeginSequence:1702 ColorSpace = NvColorSpace_YCbCr709_ER
TVMR: cbBeginSequence: 1839: SurfaceLayout = 3
TVMR: cbBeginSequence: 1936: NumOfSurfaces = 9, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1, BitDepthForSurface = 8 LumaBitDepth = 8, ChromaBitDepth = 8, ChromaFormat = 5
TVMR: cbBeginSequence: 1938: BeginSequence  ColorPrimaries = 1, TransferCharacteristics = 1, MatrixCoefficients = 1
Allocating new output: 1920x1088 (x 9), ThumbnailMode = 0
TVMR: FrameRate = 25 
TVMR: NVDEC LowCorner Freq = (150000 * 1024) 
---> TVMR: Video-conferencing detected !!!!!!!!!
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
TVMR: FrameRate = 25.000000 
^Chandling interrupt.
Interrupt: Stopping pipeline ...
Execution ended after 0:00:39.169351505
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
TVMR: cbDisplayPicture: 3889: Retunred NULL Frame Buffer 
TVMR: TVMRFrameStatusReporting: 6266: Closing TVMR Frame Status Thread -------------
TVMR: TVMRVPRFloorSizeSettingThread: 6084: Closing TVMRVPRFloorSizeSettingThread -------------
TVMR: TVMRFrameDelivery: 6116: Closing TVMR Frame Delivery Thread -------------
TVMR: NvMMLiteTVMRDecBlockClose: 8018: Done 
Setting pipeline to NULL ...
Freeing pipeline ...

However, when I tested the following code, which uses an appsink callback, the video is not real-time and the delay accumulates over time. That is unacceptable for my real-time project. Here is the relevant code (gstIPCamera.cpp and gst-camera.cpp):

/*
 * Author:Chen
 * Date:2017/05/17
 */

#include "gstIPCamera.h"

#include <gst/gst.h>
#include <gst/app/gstappsink.h>

#include <stdio.h>
#include <sstream> 
#include <unistd.h>
#include <string.h>
#include <stdlib.h>

#include <QMutex>
#include <QWaitCondition>

#include <opencv2/opencv.hpp>

using namespace cv;

// constructor
gstIPCamera::gstIPCamera()
{	
	mAppSink    = NULL;
	mBus        = NULL;
	mPipeline   = NULL;	

	
	mWaitEvent  = new QWaitCondition();
	mWaitMutex  = new QMutex();
	mRingMutex  = new QMutex();
	
	mLatestRingbuffer = 0;
	mLatestRetrieved  = false;
	
	for( uint32_t n=0; n < NUM_RINGBUFFERS; n++ )
	{
	    mRingbufferCPU[n] = NULL;
	}
}


// destructor	
gstIPCamera::~gstIPCamera()
{
	if(NULL != mRingbufferCPU[0]){
	    for( uint32_t n=0; n < NUM_RINGBUFFERS; n++ )
	    {
		free(mRingbufferCPU[n]);
		mRingbufferCPU[n] = NULL;	
	    }	
	}
}

// onEOS
void gstIPCamera::onEOS(_GstAppSink* sink, void* user_data)
{
	printf( "gstreamer decoder onEOS\n");
}


// onPreroll
GstFlowReturn gstIPCamera::onPreroll(_GstAppSink* sink, void* user_data)
{
	printf( "gstreamer decoder onPreroll\n");
	return GST_FLOW_OK;
}

// onBuffer
GstFlowReturn gstIPCamera::onBuffer(_GstAppSink* sink, void* user_data)
{
	printf( "gstreamer decoder onBuffer\n");
	
	if( !user_data )
		return GST_FLOW_OK;
		
	gstIPCamera* dec = (gstIPCamera*)user_data;
	
	dec->checkBuffer();
	//dec->checkMsgBus();
	return GST_FLOW_OK;
}
	

// Capture
bool gstIPCamera::Capture( void ** cpu, unsigned long timeout )
{
	mWaitMutex->lock();
	const bool wait_result = mWaitEvent->wait(mWaitMutex, timeout);
	mWaitMutex->unlock();
	
	
	if( !wait_result )
	{
		printf("Capture() timed out waiting for a new frame!\n");	
		return false;
	}
	
	mRingMutex->lock();	
	const uint32_t latest = mLatestRingbuffer;
	const bool retrieved = mLatestRetrieved;
	mLatestRetrieved = true;
	mRingMutex->unlock();
	
	
		
	// skip if it was already retrieved
	if( retrieved )
		return false;

	
	if( cpu != NULL )
		*cpu = mRingbufferCPU[latest];

	return true;	
}


// unmap/unref helper for early exits taken after the buffer has been mapped
#define release_return { gst_buffer_unmap(gstBuffer, &map); gst_sample_unref(gstSample); return; }


// checkBuffer
void gstIPCamera::checkBuffer()
{
	printf( "Get IP camera frame data!\n");
	if( !mAppSink )
		return;

	// block waiting for the buffer
	GstSample* gstSample = gst_app_sink_pull_sample(mAppSink);
	
	if( !gstSample )
	{
		printf( "gstreamer camera -- gst_app_sink_pull_sample() returned NULL...\n");
		return;
	}
	
	GstBuffer* gstBuffer = gst_sample_get_buffer(gstSample);
	
	if( !gstBuffer )
	{
		printf( "gstreamer camera -- gst_sample_get_buffer() returned NULL...\n");
		gst_sample_unref(gstSample);
		return;
	}
	
	// retrieve
	GstMapInfo map; 

	if( !gst_buffer_map(gstBuffer, &map, GST_MAP_READ) ) 
	{
		printf( "gstreamer camera -- gst_buffer_map() failed...\n");
		gst_sample_unref(gstSample);
		return;
	}
	
	//gst_util_dump_mem(map.data, map.size); 

	void* gstData = map.data; //GST_BUFFER_DATA(gstBuffer);
	const uint32_t gstSize = map.size; //GST_BUFFER_SIZE(gstBuffer);
	
	if( !gstData )
	{
		printf( "gstreamer camera -- gst_buffer had NULL data pointer...\n");
		release_return;
	}
	
	
	
	// retrieve caps
	GstCaps* gstCaps = gst_sample_get_caps(gstSample);
	
	if( !gstCaps )
	{
		printf( "gstreamer camera -- gst_buffer had NULL caps...\n");
		release_return;
	}
	
	GstStructure* gstCapsStruct = gst_caps_get_structure(gstCaps, 0);
	
	if( !gstCapsStruct )
	{
		printf( "gstreamer camera -- gst_caps had NULL structure...\n");
		release_return;
	}
	
	// get width & height of the buffer
	int width  = 0;
	int height = 0;
	
	if( !gst_structure_get_int(gstCapsStruct, "width", &width) ||
		!gst_structure_get_int(gstCapsStruct, "height", &height) )
	{
		printf( "gstreamer camera -- gst_caps missing width/height...\n");
		release_return;
	}
	
	if( width < 1 || height < 1 )
		release_return;
	
	mWidth  = width;
	mHeight = height;
	mDepth  = (gstSize * 8) / (width * height);
	mSize   = gstSize;
	mPitch  = gstSize / height;
	
	// wrap the mapped data (assumes the BGR caps requested on the appsink)
	cv::Mat frame(Size(width,height),CV_8UC3,(char*)gstData,Mat::AUTO_STEP);
	if(!frame.empty())
	{
		cv::imshow("frame111",frame);
		cv::waitKey(1);
	}


	printf( "gstreamer camera received %ix%i frame (%u bytes, %u bpp)\n", width, height, gstSize, mDepth);

	//make sure mFrameBuffer is allocated
	if( NULL == mRingbufferCPU[0])
	{
		for( uint32_t n=0; n < NUM_RINGBUFFERS; ++n ){
		    if(!(mRingbufferCPU[n] = (void*)malloc(gstSize)))
			printf( "gstreamer camera -- failed to allocate mRingbufferCPU %u  (size=%u)\n", n, gstSize);
		}					
		printf( "gstreamer camera -- allocated %u mFrameBuffer, %u bytes each\n",NUM_RINGBUFFERS, gstSize);    
	}

	// copy to next ringbuffer
	const uint32_t nextRingbuffer = (mLatestRingbuffer + 1) % NUM_RINGBUFFERS;		
	
	printf( "gstreamer camera -- using ringbuffer #%u for next frame\n", nextRingbuffer);
	memcpy(mRingbufferCPU[nextRingbuffer], gstData, gstSize);
	gst_buffer_unmap(gstBuffer, &map); 
	//gst_buffer_unref(gstBuffer);
	gst_sample_unref(gstSample);
	
	// NOTE: sleeping 100 ms inside the sample callback caps throughput at ~10 fps
	usleep(100000);
	// update and signal sleeping threads
	mRingMutex->lock();
	mLatestRingbuffer = nextRingbuffer;
	mLatestRetrieved  = false;
	mRingMutex->unlock();
	mWaitEvent->wakeAll();
}



// buildLaunchStr
bool gstIPCamera::buildLaunchStr()
{
	std::ostringstream ss;	
	//open ip camera
	char ipRtsp[260];
	if(mCamType == 0)
	{
		sprintf(ipRtsp,"rtspsrc location=rtsp://%s:%s@%s:%s/Streaming/Channels/101?transportmode=unicast&profile=Profile_1 !",mUserName,mPassword,mIP,mPorts);
	}
	else if(mCamType == 1)
	{
		sprintf(ipRtsp,"rtspsrc location=rtsp://%s:%s@%s:%s/cam/realmonitor?channel=1&subtype=0&unicast=true !",mUserName,mPassword,mIP,mPorts);
	}
	else
	{
		sprintf(ipRtsp,"rtspsrc location=rtsp://%s:%s/user=admin_password=tlJwpbo6_channel=1_stream=0.sdp?real_stream protocols=tcp latency=0 !",mIP,mPorts);
	}
	ss <<  ipRtsp;
	ss << " rtpjitterbuffer ! rtph264depay ! h264parse ! omxh264dec !";
	ss << " nvvidconv !";
	ss << " videoconvert ! ";	
	ss << " appsink name=sink caps=video/x-raw,format=BGR,framerate=25/1 sync=false";
	
	//ss <<  " decodebin ! nvvidconv !";
	//ss <<  " videoconvert !";
	//ss << " appsink name=sink caps=video/x-raw,width=1920,height=1080,format=BGR,framerate=25/1 sync=false";



	mLaunchStr = ss.str();

	printf( "gstreamer decoder pipeline string:\n");
	printf("%s\n", mLaunchStr.c_str());
	return true;
}
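For a live source where only the newest frame matters, a common variant of the launch string built above is to keep the appsink from queuing frames at all, using its standard drop and max-buffers properties (the placeholders below are mine, and whether this resolves the accumulating delay in this particular setup is untested):

```
rtspsrc location=rtsp://<user>:<pass>@<ip>:<port>/... protocols=tcp latency=0 !
 rtpjitterbuffer ! rtph264depay ! h264parse ! omxh264dec !
 nvvidconv ! videoconvert !
 appsink name=sink caps=video/x-raw,format=BGR sync=false drop=true max-buffers=1
```

The same two knobs can also be set from code with gst_app_sink_set_drop() and gst_app_sink_set_max_buffers().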

// init
bool gstIPCamera::init(char *ip, char *userName,char *password, char *port,int camType)
{
	int argc = 0;
	//char* argv[] = { "none" };

	if( !gst_init_check(&argc, NULL, NULL) )
	{
		printf( "failed to initialize gstreamer library with gst_init()\n");
		return false;
	}

	mIP = ip;
	mUserName = userName;
	mPassword = password;
	mPorts = port;
	mCamType = camType;
	

	GError* err = NULL;
	
	// build pipeline string
	if( !buildLaunchStr() )
	{
		printf( "gstreamer decoder failed to build pipeline string\n");
		return false;
	}

	// launch pipeline
	mPipeline = gst_parse_launch(mLaunchStr.c_str(), &err);

	if( err != NULL )
	{
		printf( "gstreamer decoder failed to create pipeline\n");
		printf( "   (%s)\n", err->message);
		g_error_free(err);
		return false;
	}

	GstPipeline* pipeline = GST_PIPELINE(mPipeline);

	if( !pipeline )
	{
		printf( "gstreamer failed to cast GstElement into GstPipeline\n");
		return false;
	}	

	// retrieve pipeline bus
	/*GstBus**/ mBus = gst_pipeline_get_bus(pipeline);

	if( !mBus )
	{
		printf( "gstreamer failed to retrieve GstBus from pipeline\n");
		return false;
	}

	// add watch for messages (disabled when we poll the bus ourselves, instead of gmainloop)
	//gst_bus_add_watch(mBus, (GstBusFunc)gst_message_print, NULL);

	// get the appsink
	GstElement* appsinkElement = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
	GstAppSink* appsink = GST_APP_SINK(appsinkElement);

	if( !appsinkElement || !appsink)
	{
		printf( "gstreamer failed to retrieve AppSink element from pipeline\n");
		return false;
	}
	
	mAppSink = appsink;
	
	// setup callbacks
	GstAppSinkCallbacks cb;
	memset(&cb, 0, sizeof(GstAppSinkCallbacks));
	
	cb.eos         = onEOS;
	cb.new_preroll = onPreroll;
	cb.new_sample  = onBuffer;
	
	gst_app_sink_set_callbacks(mAppSink, &cb, (void*)this, NULL);
	
	return true;
}


// Open
bool gstIPCamera::Open(char *ip, char *userName,char *password, char *port,int camType)
{
	//init
	if( !init(ip, userName,password, port,camType) )
		return false;
	// transition pipeline to GST_STATE_PLAYING
	printf( "gstreamer transitioning pipeline to GST_STATE_PLAYING\n");
	
	const GstStateChangeReturn result = gst_element_set_state(mPipeline, GST_STATE_PLAYING);

	if( result == GST_STATE_CHANGE_ASYNC )
	{
#if 0
		GstMessage* asyncMsg = gst_bus_timed_pop_filtered(mBus, 5 * GST_SECOND, 
    	 					      (GstMessageType)(GST_MESSAGE_ASYNC_DONE|GST_MESSAGE_ERROR)); 

		if( asyncMsg != NULL )
		{
			gst_message_print(mBus, asyncMsg, this);
			gst_message_unref(asyncMsg);
		}
		else
			printf( "gstreamer NULL message after transitioning pipeline to PLAYING...\n");
#endif
	}
	else if( result != GST_STATE_CHANGE_SUCCESS )
	{
		printf( "gstreamer failed to set pipeline state to PLAYING (error %u)\n", result);
		return false;
	}

	//checkMsgBus();
	usleep(100*1000);
	//checkMsgBus();

	return true;
}
	

// Close
bool gstIPCamera::Close()
{
	// stop pipeline
	printf( "gstreamer transitioning pipeline to GST_STATE_NULL\n");

	const GstStateChangeReturn result = gst_element_set_state(mPipeline, GST_STATE_NULL);

	if( result != GST_STATE_CHANGE_SUCCESS )
		printf( "gstreamer failed to set pipeline state to NULL (error %u)\n", result);
	usleep(250*1000);
	return true;
}
/*
 * inference-decode
 */

#include "gstIPCamera.h"
#include <stdio.h>
#include <signal.h>
#include <unistd.h>
#include <opencv2/opencv.hpp>

using namespace cv;

bool signal_recieved = false;

void sig_handler(int signo)
{
	if( signo == SIGINT )
	{
		printf("received SIGINT\n");
		signal_recieved = true;
	}
}


int main( int argc, char** argv )
{
	printf("gst-ipcamera\n  args (%i):  ", argc);

	for( int i=0; i < argc; i++ )
		printf("%i [%s]  ", i, argv[i]);
		
	printf("\n");
	
		
	if( signal(SIGINT, sig_handler) == SIG_ERR )
		printf("\ncan't catch SIGINT\n");

	/*
	 * create the ipcamera device
	 */
	gstIPCamera* ipcamera = new gstIPCamera();

	/*
	 * start streaming
	 */
	if( !ipcamera->Open("192.168.0.64","admin","admin12345","554",0) )
	{
		printf("\ngst-ipcamera:  failed to open camera for streaming\n");
		return 0;
	}
	
	printf("\ngst-ipcamera:  successfully initialized video device\n");
	printf("    width:  %u\n", ipcamera->GetWidth());
	printf("   height:  %u\n", ipcamera->GetHeight());
	printf("    depth:  %u (bpp)\n", ipcamera->GetPixelDepth());


	
	while(!signal_recieved)
	{
		//printf("\ngst-ipcamera: start display frame!\n");
		void* img  = NULL;
		// get the latest frame
		if( !ipcamera->Capture(&img, 100) )
			printf("\ngst-ipcamera:  failed to capture frame\n");
		else
			printf("gst-ipcamera:  received new frame =0x%p\n", img);

		//convert from frame to Mat
		if(NULL != img){
		    cv::Mat frame(Size(ipcamera->GetWidth(),ipcamera->GetHeight()),CV_8UC3,(char*)img,Mat::AUTO_STEP);
		    if(!frame.empty()){
		     	cv::imshow("frame",frame);
			cv::waitKey(1);
		    }
		}
	}
	
	printf("\ngst-ipcamera:  un-initializing video device\n");
	
	
	/*
	 * shutdown the ip camera device
	 */
	if( ipcamera != NULL )
	{
		delete ipcamera;
		ipcamera = NULL;
	}
	
	printf("gst-ipcamera:  video device has been un-initialized.\n");
	printf("gst-ipcamera:  this concludes the test of the video device.\n");
	return 0;
}

The decode speed is only 10 fps, which is unacceptable.
Could you give me some suggestions? Thanks in advance.

This should be the same issue as [url]https://devtalk.nvidia.com/default/topic/1010111/jetson-tx1/nvmm-memory/post/5158839/#5158839[/url]

Please try [url]https://devtalk.nvidia.com/default/topic/1010111/jetson-tx1/nvmm-memory/post/5160218/#5160218[/url]

Hi DaneLLL,
Thanks for your reply! I changed my code following your suggestions.

#include <gst/gst.h>
#include <gst/app/gstappsink.h>
#include <stdlib.h>
#include <stdio.h>
#include <time.h>

#include <iostream>
#include <sstream>

using namespace std;

// TODO: use synchronized deque
static GstPipeline *gst_pipeline = NULL;
static string launch_string; 
double startTime = 0;
int num = 0;

double msTime()
{
    struct timespec now_timespec;
    clock_gettime(CLOCK_MONOTONIC,&now_timespec);
    return ((double)now_timespec.tv_sec)*1000.0 + ((double)now_timespec.tv_nsec)*1.0e-6;
}

static void appsink_eos(GstAppSink *appsink, gpointer data)
{
    g_print ("Got EOS!\n");
}

static GstFlowReturn new_buffer(GstAppSink *appsink, gpointer user_data)
{
    if(startTime == 0)
      startTime = msTime(); 
    num++;
    //printf("start time is %f\n",startTime);

    double spendTime= 0.0;
    GstSample *sample = NULL;
    g_signal_emit_by_name (appsink, "pull-sample", &sample);

    if (sample)
    {
        GstBuffer *buffer = NULL;
        GstCaps   *caps   = NULL;
        GstMapInfo map    = {0};

        caps = gst_sample_get_caps (sample);
        if (!caps)
        {
            printf("could not get snapshot format\n");
            gst_sample_unref (sample);
            return GST_FLOW_OK;
        }
        buffer = gst_sample_get_buffer (sample);
        gst_buffer_map (buffer, &map, GST_MAP_READ);

        printf("map.size = %lu\n", map.size);
        //frameCount++;
        //printf("%d\n",frameCount);
        gst_buffer_unmap(buffer, &map);

        gst_sample_unref (sample);
    }
    else
    {
        g_print ("could not make snapshot\n");
    }
    double endTime = msTime();
   // printf("end time is %f\n",endTime);
  //  printf("%f\n",endTime-startTime);
    spendTime = (endTime - startTime);
    if(spendTime > 1000)
    {
        printf("spendTime = %f\n",spendTime);
        printf("framerate is %d\n",num);
        startTime = 0;
        num = 0;
    }
    return GST_FLOW_OK;
}

static gboolean my_bus_callback (GstBus *bus, GstMessage *message, gpointer data)
{
    g_print ("Got %s message from %s\n", GST_MESSAGE_TYPE_NAME (message), GST_OBJECT_NAME (message->src));
    switch (GST_MESSAGE_TYPE (message))
    {
        case GST_MESSAGE_ERROR:
        {
            GError *err;
            gchar *debug;

            gst_message_parse_error (message, &err, &debug);
            g_print ("Error from %s: %s\n", GST_OBJECT_NAME (message->src), err->message);
            g_error_free (err);
            g_free (debug);
            break;
        }
        case GST_MESSAGE_EOS:
            /* end-of-stream */
            //quit_flag = 1;
            break;
        case GST_MESSAGE_STATE_CHANGED:
        {
            GstState oldstate, newstate;
            gst_message_parse_state_changed(message, &oldstate, &newstate, NULL);
            g_print ("Element %s changed state from %s to %s.\n",
                     GST_OBJECT_NAME (message->src),
                     gst_element_state_get_name (oldstate),
                     gst_element_state_get_name (newstate));
            break;
        }
        default:
            /* unhandled message */
            break;
    }
    /* we want to be notified again the next time there is a message
     * on the bus, so returning TRUE (FALSE means we want to stop watching
     * for messages on the bus and our callback should not be called again)
     */
    return TRUE;
}

int main (int argc, char *argv[])
{
    gst_init (&argc, &argv);

    GMainLoop *main_loop;
    main_loop = g_main_loop_new (NULL, FALSE);
    ostringstream launch_stream;
    int w = 1920;
    int h = 1080;
    GstAppSinkCallbacks callbacks = {appsink_eos, NULL, new_buffer};

    launch_stream
    << "nvcamerasrc ! "
    //<< "rtspsrc location=rtsp://admin:admin12345@192.168.0.64:554/Streaming/Channels/101?transportmode=unicast&profile=Profile_1 latency=0 ! "
    //<< "decodebin ! "
    << "video/x-raw(memory:NVMM), width="<< w <<", height="<< h <<", framerate=120/1 ! " 
    << "nvvidconv ! "
    << "video/x-raw, format=I420, width="<< w <<", height="<< h <<" ! "
    << "appsink name=mysink";

    launch_string = launch_stream.str();

    g_print("Using launch string: %s\n", launch_string.c_str());

    GError *error = NULL;
    gst_pipeline  = (GstPipeline*) gst_parse_launch(launch_string.c_str(), &error);

    if (gst_pipeline == NULL) {
        g_print( "Failed to parse launch: %s\n", error->message);
        return -1;
    }
    if(error) g_error_free(error);

    GstElement *appsink_ = gst_bin_get_by_name(GST_BIN(gst_pipeline), "mysink");
    gst_app_sink_set_callbacks (GST_APP_SINK(appsink_), &callbacks, NULL, NULL);

    GstBus *bus;
    guint bus_watch_id;
    bus = gst_pipeline_get_bus (GST_PIPELINE (gst_pipeline));
    bus_watch_id = gst_bus_add_watch (bus, my_bus_callback, NULL);
    gst_object_unref (bus);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_PLAYING); 

    //sleep(10);
    g_main_loop_run (main_loop);

    gst_element_set_state((GstElement*)gst_pipeline, GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(gst_pipeline));
    g_main_loop_unref(main_loop);

    g_print("going to exit \n");

    return 0;
}

It works well when I open the onboard camera with “nvcamerasrc”. This is the debug info:

nvidia@tegra-ubuntu:~/target/nfsroot/projects/ControlIPCamera$ ./ProcessFrame
Using launch string: nvcamerasrc ! video/x-raw(memory:NVMM), width=1920, height=1080, framerate=120/1 ! nvvidconv ! video/x-raw, format=I420, width=1920, height=1080 ! appsink name=mysink

Available Sensor modes : 
2592 x 1944 FR=30.000000 CF=0x1109208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
2592 x 1458 FR=30.000000 CF=0x1109208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
1280 x 720 FR=120.000000 CF=0x1109208a10 SensorModeType=4 CSIPixelBitDepth=10 DynPixelBitDepth=10
Got state-changed message from mysink
Element mysink changed state from NULL to READY.
Got state-changed message from capsfilter1
Element capsfilter1 changed state from NULL to READY.
Got state-changed message from nvvconv0
Element nvvconv0 changed state from NULL to READY.
Got state-changed message from capsfilter0
Element capsfilter0 changed state from NULL to READY.
Got state-changed message from nvcamerasrc0
Element nvcamerasrc0 changed state from NULL to READY.
Got state-changed message from pipeline0
Element pipeline0 changed state from NULL to READY.
Got state-changed message from capsfilter1
Element capsfilter1 changed state from READY to PAUSED.
Got state-changed message from nvvconv0
Element nvvconv0 changed state from READY to PAUSED.
Got state-changed message from capsfilter0
Element capsfilter0 changed state from READY to PAUSED.
Got stream-status message from src
Got state-changed message from nvcamerasrc0
Element nvcamerasrc0 changed state from READY to PAUSED.
Got state-changed message from pipeline0
Element pipeline0 changed state from READY to PAUSED.
Got new-clock message from pipeline0
Got state-changed message from capsfilter1
Element capsfilter1 changed state from PAUSED to PLAYING.
Got state-changed message from nvvconv0
Element nvvconv0 changed state from PAUSED to PLAYING.
Got state-changed message from capsfilter0
Element capsfilter0 changed state from PAUSED to PLAYING.
Got state-changed message from nvcamerasrc0
Element nvcamerasrc0 changed state from PAUSED to PLAYING.
Got stream-status message from src
Got stream-start message from pipeline0

NvCameraSrc: Trying To Set Default Camera Resolution. Selected 1920x1080 FrameRate = 120.000000 ...

Got state-changed message from mysink
Element mysink changed state from READY to PAUSED.
Got async-done message from pipeline0
Got state-changed message from mysink
Element mysink changed state from PAUSED to PLAYING.
Got state-changed message from pipeline0
map.size = 3110400
Element pipeline0 changed state from PAUSED to PLAYING.
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
spendTime = 1016.859024
framerate is 33
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
spendTime = 1032.271685
framerate is 32
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400

However, when I open my IP camera with “rtspsrc”, the frame rate is only about 11 fps. This is the debug info:

nvidia@tegra-ubuntu:~/target/nfsroot/projects/ControlIPCamera$ ./ProcessFrame
Using launch string: rtspsrc location=rtsp://admin:admin12345@192.168.0.64:554/Streaming/Channels/101?transportmode=unicast&profile=Profile_1 latency=0 ! decodebin ! nvvidconv ! video/x-raw, format=I420, width=1920, height=1080 ! appsink name=mysink
Got state-changed message from mysink
Element mysink changed state from NULL to READY.
Got state-changed message from capsfilter0
Element capsfilter0 changed state from NULL to READY.
Got state-changed message from nvvconv0
Element nvvconv0 changed state from NULL to READY.
Got state-changed message from typefind
Element typefind changed state from NULL to READY.
Got state-changed message from decodebin0
Element decodebin0 changed state from NULL to READY.
Got state-changed message from rtspsrc0
Element rtspsrc0 changed state from NULL to READY.
Got state-changed message from pipeline0
Element pipeline0 changed state from NULL to READY.
Got state-changed message from capsfilter0
Element capsfilter0 changed state from READY to PAUSED.
Got state-changed message from nvvconv0
Element nvvconv0 changed state from READY to PAUSED.
Got state-changed message from typefind
Element typefind changed state from READY to PAUSED.
Got progress message from rtspsrc0
Got state-changed message from rtspsrc0
Element rtspsrc0 changed state from READY to PAUSED.
Got state-changed message from pipeline0
Element pipeline0 changed state from READY to PAUSED.
Got new-clock message from pipeline0
Got state-changed message from capsfilter0
Element capsfilter0 changed state from PAUSED to PLAYING.
Got state-changed message from nvvconv0
Element nvvconv0 changed state from PAUSED to PLAYING.
Got progress message from rtspsrc0
Got state-changed message from rtspsrc0
Element rtspsrc0 changed state from PAUSED to PLAYING.
Got progress message from rtspsrc0
Got progress message from rtspsrc0
Got progress message from rtspsrc0
Got progress message from rtspsrc0
Got state-changed message from manager
Element manager changed state from NULL to READY.
Got state-changed message from manager
Element manager changed state from READY to PAUSED.
Got state-changed message from rtpssrcdemux0
Element rtpssrcdemux0 changed state from NULL to READY.
Got state-changed message from rtpssrcdemux0
Element rtpssrcdemux0 changed state from READY to PAUSED.
Got state-changed message from rtpsession0
Element rtpsession0 changed state from NULL to READY.
Got state-changed message from rtpsession0
Element rtpsession0 changed state from READY to PAUSED.
Got progress message from rtspsrc0
Got state-changed message from udpsink0
Element udpsink0 changed state from NULL to READY.
Got state-changed message from udpsink0
Element udpsink0 changed state from READY to PAUSED.
Got state-changed message from udpsink0
Element udpsink0 changed state from PAUSED to PLAYING.
Got state-changed message from fakesrc0
Element fakesrc0 changed state from NULL to READY.
Got stream-status message from src
Got state-changed message from fakesrc0
Element fakesrc0 changed state from READY to PAUSED.
Got state-changed message from fakesrc0
Element fakesrc0 changed state from PAUSED to PLAYING.
Got progress message from rtspsrc0
Got stream-status message from src
Got state-changed message from rtpssrcdemux0
Element rtpssrcdemux0 changed state from PAUSED to PLAYING.
Got state-changed message from rtpsession0
Element rtpsession0 changed state from PAUSED to PLAYING.
Got state-changed message from manager
Element manager changed state from PAUSED to PLAYING.
Got stream-status message from src
Got state-changed message from udpsrc0
Element udpsrc0 changed state from READY to PAUSED.
Got state-changed message from udpsrc0
Element udpsrc0 changed state from PAUSED to PLAYING.
Got stream-status message from src
Got stream-status message from src
Got state-changed message from udpsrc1
Element udpsrc1 changed state from READY to PAUSED.
Got state-changed message from udpsrc1
Element udpsrc1 changed state from PAUSED to PLAYING.
Got progress message from rtspsrc0
Got stream-status message from src
Got state-changed message from rtpptdemux0
Element rtpptdemux0 changed state from NULL to READY.
Got state-changed message from rtpptdemux0
Element rtpptdemux0 changed state from READY to PAUSED.
Got state-changed message from rtpptdemux0
Element rtpptdemux0 changed state from PAUSED to PLAYING.
Got state-changed message from rtpjitterbuffer0
Element rtpjitterbuffer0 changed state from NULL to READY.
Got stream-status message from src
Got state-changed message from rtpjitterbuffer0
Element rtpjitterbuffer0 changed state from READY to PAUSED.
Got stream-status message from src
Got state-changed message from rtpjitterbuffer0
Element rtpjitterbuffer0 changed state from PAUSED to PLAYING.
Got state-changed message from rtph264depay0
Element rtph264depay0 changed state from NULL to READY.
Got state-changed message from rtph264depay0
Element rtph264depay0 changed state from READY to PAUSED.
Got state-changed message from h264parse0
Element h264parse0 changed state from NULL to READY.
Got state-changed message from h264parse0
Element h264parse0 changed state from READY to PAUSED.
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingGot state-changed message from omxh264dec-omxh264dec0
Element omxh264dec-omxh264dec0 changed state from NULL to READY.
Got state-changed message from omxh264dec-omxh264dec0
Element omxh264dec-omxh264dec0 changed state from READY to PAUSED.
NvMMLiteOpen : Block : BlockType = 261 
TVMR: NvMMLiteTVMRDecBlockOpen: 7818: NvMMLiteBlockOpen 
NvMMLiteBlockCreate : Block : BlockType = 261 
Got stream-status message from src
Got stream-status message from src
TVMR: cbBeginSequence: 1190: BeginSequence  1920x1088, bVPR = 0
TVMR: LowCorner Frequency = 180000 
TVMR: cbBeginSequence: 1583: DecodeBuffers = 5, pnvsi->eCodec = 4, codec = 0 
TVMR: cbBeginSequence: 1654: Display Resolution : (1920x1080) 
TVMR: cbBeginSequence: 1655: Display Aspect Ratio : (1920x1080) 
TVMR: cbBeginSequence: 1697: ColorFormat : 5 
TVMR: cbBeginSequence:1702 ColorSpace = NvColorSpace_YCbCr709_ER
TVMR: cbBeginSequence: 1839: SurfaceLayout = 3
TVMR: cbBeginSequence: 1936: NumOfSurfaces = 9, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1, BitDepthForSurface = 8 LumaBitDepth = 8, ChromaBitDepth = 8, ChromaFormat = 5
TVMR: cbBeginSequence: 1938: BeginSequence  ColorPrimaries = 1, TransferCharacteristics = 1, MatrixCoefficients = 1
Allocating new output: 1920x1088 (x 9), ThumbnailMode = 0
Got state-changed message from decodebin0
Element decodebin0 changed state from READY to PAUSED.
Got stream-start message from pipeline0
Got tag message from mysink
Got state-changed message from mysink
Element mysink changed state from READY to PAUSED.
Got async-done message from pipeline0
Got state-changed message from mysink
Element mysink changed state from PAUSED to PLAYING.
map.size = 3110400
Got state-changed message from omxh264dec-omxh264dec0
Element omxh264dec-omxh264dec0 changed state from PAUSED to PLAYING.
Got state-changed message from capsfilter1
Element capsfilter1 changed state from PAUSED to PLAYING.
Got state-changed message from h264parse0
Element h264parse0 changed state from PAUSED to PLAYING.
Got state-changed message from rtph264depay0
Element rtph264depay0 changed state from PAUSED to PLAYING.
Got state-changed message from typefind
Element typefind changed state from PAUSED to PLAYING.
Got state-changed message from decodebin0
Element decodebin0 changed state from PAUSED to PLAYING.
Got state-changed message from pipeline0
Element pipeline0 changed state from PAUSED to PLAYING.
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
spendTime = 1098.384610
framerate is 13
map.size = 3110400
TVMR: FrameRate = 25 
TVMR: NVDEC LowCorner Freq = (150000 * 1024) 
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
spendTime = 1014.651784
framerate is 11
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
spendTime = 1076.942673
framerate is 12
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
map.size = 3110400
spendTime = 1001.644204
framerate is 11
map.size = 3110400
map.size = 3110400
map.size = 3110400

The result is unacceptable for my real-time program. I hope for a frame rate of about 25 fps or more. How can I solve this? Could you give me more suggestions?

For using OpenCV, we hope other users can share their experience.

We would suggest you try VisionWorks or tegra_multimedia_api.

Hi DaneLLL,
I don’t use OpenCV, and have just changed the pipeline:

launch_stream
        << "rtspsrc location=rtsp://admin:admin12345@192.168.0.64:554/Streaming/Channels/101?transportmode=unicast&profile=Profile_1 latency=0 ! "
        << "decodebin ! "
        << "nvvidconv ! "
        << "video/x-raw, format=I420, width="<< w <<", height="<< h <<" ! "
        << "appsink name=mysink";

launch_string = launch_stream.str();

I wonder whether my pipeline is wrong? I hope for a frame rate of about 25 fps or more. How should I configure the pipeline? Could you give me more suggestions? Thanks in advance.

Hi Holy,
We don’t observe the issue when decoding the clip bourne_ultimatum_trailer.zip - Download The Bourne Ultimatum - High Definition (1080p) Theatrical Trailer - dvdloc8.com

ubuntu@tegra-ubuntu:~$ ./test
Using launch string: filesrc location=Bourne_Trailer.mp4 ! decodebin ! nvvidconv ! video/x-raw, format=I420, width=1920, height=816 ! appsink name=mysink
Inside NvxLiteH264DecoderLowLatencyInitNvxLiteH264DecoderLowLatencyInit set DPB and MjstreamingInside NvxLiteH265DecoderLowLatencyInitNvxLiteH265DecoderLowLatencyInit set DPB and MjstreamingNvMMLiteOpen : Block : BlockType = 261
TVMR: NvMMLiteTVMRDecBlockOpen: 7580: NvMMLiteBlockOpen
NvMMLiteBlockCreate : Block : BlockType = 261
TVMR: cbBeginSequence: 1166: BeginSequence  1920x816, bVPR = 0, fFrameRate = 23.975986
TVMR: LowCorner Frequency = 180000
TVMR: cbBeginSequence: 1545: DecodeBuffers = 2, pnvsi->eCodec = 4, codec = 0
TVMR: cbBeginSequence: 1606: Display Resolution : (1920x816)
TVMR: cbBeginSequence: 1607: Display Aspect Ratio : (1920x816)
TVMR: cbBeginSequence: 1649: ColorFormat : 5
TVMR: cbBeginSequence:1660 ColorSpace = NvColorSpace_YCbCr709
TVMR: cbBeginSequence: 1790: SurfaceLayout = 3
TVMR: cbBeginSequence: 1868: NumOfSurfaces = 6, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1, BitDepthForSurface = 8 LumaBitDepth = 8, ChromaBitDepth = 8, ChromaFormat = 5
Allocating new output: 1920x816 (x 8), ThumbnailMode = 0
TVMR: FrameRate = 23
TVMR: NVDEC LowCorner Freq = (138000 * 1024)
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: FrameRate = 23.976043
TVMR: NvMMLiteTVMRDecDoWork: 6466: NVMMLITE_TVMR: EOS detected
TVMR: TVMRBufferProcessing: 5444: Processing of EOS
TVMR: TVMRBufferProcessing: 5519: Processing of EOS Done
app sink receive eos
TVMR: TVMRFrameStatusReporting: 6067: Closing TVMR Frame Status Thread -------------
TVMR: TVMRVPRFloorSizeSettingThread: 5885: Closing TVMRVPRFloorSizeSettingThread -------------
TVMR: TVMRFrameDelivery: 5917: Closing TVMR Frame Delivery Thread -------------
TVMR: NvMMLiteTVMRDecBlockClose: 7740: Done
going to exit, decode 2136 frames in 90 seconds

It decodes 2136 frames / 90 seconds ~= 23.73 fps. Please refer to the attachment.
test2.cpp (2.95 KB)

Hi,DaneLLL,
Thank you so much. It does work well. Now I can decode my IP camera's RTSP stream in real time (about 26 fps). Thanks for your suggestions again.

Hi Holy_Chen,

I am also facing a long-latency problem with an RTSP camera. Could you share your code for the RTSP pipeline processing?

Thanks.

Hi,
Can you share the code where you hit the long-latency problem? Here is a sample that processes USB camera and board camera pipelines: https://github.com/dusty-nv/jetson-inference/tree/master/util/camera. Hope it can help you.

Hi Holy_Chen,

I solved the problem by referring to [url]https://github.com/dusty-nv/jetson-inference/tree/master/util/camera[/url].

I think the reason for the long latency is that I did the OpenCV frame creation and imshow inside the GStreamer new_sample callback, which blocks the callback.

dusty-nv uses a ring buffer and fetches frames from the ring buffer outside the callback, which avoids blocking the callback.

Thanks.

Hello ShayWang,
I am currently facing the same latency problem.
Could you share the solution you found?
Thanks.