Jetson Nano + GStreamer + OpenCV: using nvjpegenc to encode frames from OpenCV via appsrc

Hi,

I am trying to write a C++ program on my Jetson Nano that does the following:

  1. Receives video from the camera and converts each frame to an OpenCV Mat
  2. For each frame: detects and/or tracks a specific object and draws a bounding box around it.
  3. Feeds the frames with the bounding boxes into a GStreamer pipeline that encodes them to JPEG and sends the JPEGs to some sink, for example a tcpserversink.

So far, steps 1 and 2 work very well, but in step 3 I get a “JPEG parameter struct mismatch” error from GStreamer (more details will follow once I have access to my computer).

For step 1 I wrote the following code, which works fine:

VideoCapture input_video;
input_video.open("nvarguscamerasrc ! video/x-raw(memory:NVMM), width=800, height=450, framerate=60/1, format=NV12 ! nvvidconv ! videoconvert ! appsink emit-signals=true sync=false max-buffers=2 drop=true", CAP_GSTREAMER);
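
For context, the surrounding capture loop looks roughly like the sketch below; detectAndDraw is just a placeholder name for my step-2 code, not a real function:

#include <opencv2/opencv.hpp>

int main() {
    cv::VideoCapture input_video;
    input_video.open("nvarguscamerasrc ! video/x-raw(memory:NVMM), width=800, height=450, framerate=60/1, format=NV12 ! nvvidconv ! videoconvert ! appsink emit-signals=true sync=false max-buffers=2 drop=true", cv::CAP_GSTREAMER);
    if (!input_video.isOpened())
        return 1;                      // the capture pipeline failed to open

    cv::Mat frame;
    while (input_video.read(frame)) {  // each frame arrives as an 8-bit BGR Mat
        // step 2: detectAndDraw(frame);  -- detect/track and draw the bounding box
        // step 3: push the annotated frame to the VideoWriter (shown further down)
    }
    return 0;
}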

Step 3 is done using OpenCV’s VideoWriter class.
Initializing the VideoWriter:

cv::VideoWriter video_writer;
video_writer.open("appsrc ! nvjpegenc ! tcpserversink host=0.0.0.0 port=5000", CAP_GSTREAMER, 0, (double)60, cv::Size(800, 450), true);

For each frame obtained (as a Mat object with the bounding box drawn on it), I push it to the VideoWriter:

video_writer << frame;

where frame is the Mat object holding the current image.
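
A minimal sketch of the output side, with the sanity checks I would add (assuming frame is an 8-bit, 3-channel BGR Mat whose size matches the cv::Size(800, 450) passed to open()):

#include <iostream>
#include <opencv2/opencv.hpp>

void write_frame(cv::VideoWriter &video_writer, const cv::Mat &frame) {
    if (!video_writer.isOpened()) {        // the writer pipeline failed to start
        std::cerr << "VideoWriter is not opened" << std::endl;
        return;
    }
    // the frame must match the size and color format declared in open()
    CV_Assert(frame.size() == cv::Size(800, 450) && frame.type() == CV_8UC3);
    video_writer.write(frame);             // equivalent to video_writer << frame
}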

Hi,
The format of the frames pushed into appsrc is BGR. Please check whether converting to I420 before sending to nvjpegenc works:

video_writer.open("appsrc ! video/x-raw,format=BGR ! videoconvert ! video/x-raw,format=I420 ! nvjpegenc ! tcpserversink host=0.0.0.0 port=5000", CAP_GSTREAMER, 0, (double)60, cv::Size(800, 450), true);

Hi,

What you suggested still did not solve the problem. However, if I use jpegenc instead of nvjpegenc, the pipeline you suggested does work, though a bit slowly, probably because jpegenc is not hardware-accelerated. I want to use nvjpegenc, but I still get the same error.
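
For reference, the writer line that does work for me, i.e. your suggested pipeline with jpegenc in place of nvjpegenc:

video_writer.open("appsrc ! video/x-raw,format=BGR ! videoconvert ! video/x-raw,format=I420 ! jpegenc ! tcpserversink host=0.0.0.0 port=5000", CAP_GSTREAMER, 0, (double)60, cv::Size(800, 450), true);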

Here is what I get when I run my code with nvjpegenc:

Unable to open ofstream to Arduino
nvbuf_utils: Could not get EGL display connection
GST_ARGUS: Creating output stream
CONSUMER: Waiting until producer is connected...
GST_ARGUS: Available Sensor modes :
GST_ARGUS: 3280 x 2464 FR = 21.000000 fps Duration = 47619048 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 3280 x 1848 FR = 28.000001 fps Duration = 35714284 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1920 x 1080 FR = 29.999999 fps Duration = 33333334 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 59.999999 fps Duration = 16666667 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: 1280 x 720 FR = 120.000005 fps Duration = 8333333 ; Analog Gain range min 1.000000, max 10.625000; Exposure Range min 13000, max 683709000;

GST_ARGUS: Running with following settings:
   Camera index = 0 
   Camera mode  = 4 
   Output Stream W = 1280 H = 720 
   seconds to Run    = 0 
   Frame Rate = 120.000005 
GST_ARGUS: PowerService: requested_clock_Hz=4725000
GST_ARGUS: Setup Complete, Starting captures for 0 seconds
GST_ARGUS: Starting repeat capture requests.
CONSUMER: Producer has connected; continuing.
JPEG parameter struct mismatch: library thinks size is 584, caller expects 712
CONSUMER: Done Success
WARNING Argus: 5 client objects still exist during shutdown:
	547703353848 (0x7f9401cc68)
	547708321024 (0x7f94001690)
	547708321184 (0x7f94001730)
	547708324960 (0x7f94001890)
	547708326256 (0x7f9401cbd0)

Hi,
Please check that you include the header files from tegra_multimedia_api/include/libjpeg-8b
and have ‘-lnvjpeg’ in your Makefile.
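
For example, a minimal Makefile sketch; the paths below are the usual locations on a Jetson and are an assumption, so adjust them to your setup:

# assumed install location of tegra_multimedia_api
MM_API   ?= /usr/src/tegra_multimedia_api
# pick up NVIDIA's libjpeg-8b headers instead of the system libjpeg headers
CXXFLAGS += -I$(MM_API)/include/libjpeg-8b
# link against libnvjpeg from the Tegra library directory
LDFLAGS  += -L/usr/lib/aarch64-linux-gnu/tegra -lnvjpeg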