Code to send a Bayer video feed from a TX1 through an H.264 encoder to an RTP sink

I just wanted to share the results of what I’ve been working on for a week or so, and I want to thank all those who helped me get to this point. Basically, this code takes a Bayer (RGGB) video feed, converts it to YUV, compresses it, and sends it via the Real-time Transport Protocol (RTP) to a sink on a remote computer. I really struggled with the nuances, so I will also explain the function of each section of code as best I can. Let me know if you have any questions or suggestions.

Note: replacing all references to 264 with 265 changes the encoding to HEVC (H.265).
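For example, on the TX1 side that means swapping the element factory names when the elements are created, roughly like this (a sketch of the substitution described above; confirm the H.265 elements exist on your install with gst-inspect-1.0):

encoder = gst_element_factory_make ("omxh265enc", "encoder");
parser  = gst_element_factory_make ("h265parse", "parser");
rtp264  = gst_element_factory_make ("rtph265pay", "rtp264"); /* rename rtp264 to rtp265 if you like */

The Ethernet_filter caps likewise change from "video/x-h264" to "video/x-h265".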

Compile line:

g++ -g -Wall clean.cpp -o clean `pkg-config --cflags --libs gstreamer-1.0`

This is the pipeline I built to receive the RTP stream on my host computer (remote).

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H264,payload=96 ! rtph264depay ! h264parse ! queue ! avdec_h264 ! xvimagesink sync=false async=false -e

To display and record on the host (remote). Note that matroskamux writes a Matroska container, so a .mkv extension would be more accurate than .mp4:

gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,encoding-name=H265,payload=96 ! rtph265depay ! h265parse ! tee name=t t. ! queue ! avdec_h265 ! xvimagesink sync=false async=false t. ! queue ! matroskamux ! filesink location=120.mp4 -e

This is the code running on the TX1.
gst/gst.h is required for GStreamer.

#include <gst/gst.h>
//#include <gst/app/gstappsrc.h>
#include <stdio.h>
#include <iostream>
#include <sstream>
//#include <time.h>
#include <signal.h>
#include <string.h>

Use quotes to define an IP address:

#define CLIENT_IP "130.134.235.179"
#define TARGET_BITRATE 15000000
#define WIDTH 1504
#define HEIGHT 1504

loop is the GMainLoop that runs while the pipeline is in operation, and pointers to the GStreamer objects are declared at file scope. Each GstElement and each GstCaps corresponds to something that would be separated by “!” in a gst-launch pipeline. The bus is for monitoring the pipeline and commanding the exit.

static GMainLoop *loop;
GstElement *pipeline, *source, *bayer_filter, *bayer2rgb_conv, *Nvidia_vid_conv, *encoder,*YUV_filter, *Ethernet_filter, *parser, *rtp264, *sink;
GstCaps *source_caps, *bayer_filter_caps, *YUV_filter_caps, *Ethernet_filter_caps;
GstBus *bus;
GstStateChangeReturn ret;
guint bus_watch_id;

This section of code handles “ctrl+c” to exit the pipeline cleanly.

static void sigint_restore (void)
{
  struct sigaction action;

  memset (&action, 0, sizeof (action));
  action.sa_handler = SIG_DFL;

  sigaction (SIGINT, &action, NULL);
}
/* Signal handler for ctrl+c */
void intHandler(int dummy) {	
	//! Emit the EOS signal which tells all the elements to shut down properly:
	printf("Sending EOS signal to shutdown pipeline cleanly\n");
	gst_element_send_event(pipeline, gst_event_new_eos());
	sigint_restore();
	return;
}
static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer data)
{
	GMainLoop *loop = (GMainLoop *) data;
	switch (GST_MESSAGE_TYPE (msg)) {
		case GST_MESSAGE_EOS:
			g_print ("End of stream\n");
			g_main_loop_quit (loop);
			break;
		case GST_MESSAGE_ERROR: {
			gchar  *debug;
			GError *error;
			gst_message_parse_error (msg, &error, &debug);
			g_free (debug);
			g_printerr ("Error: %s\n", error->message);
			g_error_free (error);
			g_main_loop_quit (loop);
			break;
		}
		default:
			break;
	}
	return TRUE;
}
int watcher_make()
{
	/* we add a message handler */
	bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
	bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
	gst_object_unref (bus);	
	return 0;
}

The main function looks like this. signal() installs the handler for “ctrl+c” key events. gst_init() always gets &argc and &argv. Then set up the main loop and create all the elements. The first argument to gst_element_factory_make() is the element type (the factory name) and the second is an arbitrary name for this instance, which shows up in debug output and error messages. Available elements and their properties can be found via gst-inspect-1.0 from the command line.

int main(int argc, char *argv[]) {
	signal(SIGINT, intHandler);
	/* Initialize GStreamer */
	gst_init (&argc, &argv);
	loop = g_main_loop_new (NULL, FALSE);
	/* Create the elements */
	//	source = gst_element_factory_make ("appsrc", "source");
	source   = gst_element_factory_make ("videotestsrc",	"source");
	bayer_filter  = gst_element_factory_make ("capsfilter", "bayer_filter");
	bayer2rgb_conv	 = gst_element_factory_make ("bayer2rgb", "bayer2rgb_conv");
	Nvidia_vid_conv	 = gst_element_factory_make ("nvvidconv", "Nvidia_vid_conv");
	YUV_filter	 = gst_element_factory_make ("capsfilter", "YUV_filter");	
	encoder  = gst_element_factory_make ("omxh264enc", "encoder");
	Ethernet_filter	 = gst_element_factory_make ("capsfilter", "Ethernet_filter");
	parser	 = gst_element_factory_make ("h264parse", "parser");
	rtp264 	 = gst_element_factory_make ("rtph264pay", "rtp264");
	sink 	 = gst_element_factory_make ("udpsink", "sink");

Create the pipeline, check that all the elements were created successfully, then build and link the pipeline.

/* Create the empty pipeline */
	pipeline = gst_pipeline_new ("pipeline");

	if (!pipeline || !source || !bayer_filter || !bayer2rgb_conv || !Nvidia_vid_conv || !YUV_filter || !encoder || !Ethernet_filter || !parser || !rtp264 || !sink) {
		g_printerr ("Not all elements could be created.\n");
		return -1;
	}
	/* Build the pipeline */
	gst_bin_add_many (GST_BIN (pipeline), source, bayer_filter, bayer2rgb_conv, Nvidia_vid_conv, YUV_filter, encoder, Ethernet_filter, parser, rtp264, sink, NULL);
	/* Link the elements together */
	gst_element_link_many (source, bayer_filter, bayer2rgb_conv, Nvidia_vid_conv, YUV_filter, encoder, Ethernet_filter, parser, rtp264, sink, NULL);

Set the caps properties. You have to use gst_caps_from_string() to set video/x-raw(memory:NVMM) properties.

/* Modify the caps properties */
	source_caps = gst_caps_new_simple ("video/x-bayer",
					 "format", G_TYPE_STRING, "rggb",
					 "width", G_TYPE_INT, WIDTH,
					 "height", G_TYPE_INT, HEIGHT,
					 "framerate", GST_TYPE_FRACTION, 120, 1,
					 NULL);
	bayer_filter_caps = gst_caps_new_simple ("video/x-bayer",
					 "format", G_TYPE_STRING, "rggb",
					 "width", G_TYPE_INT, WIDTH,
					 "height", G_TYPE_INT, HEIGHT,
					 "framerate", GST_TYPE_FRACTION, 60, 1,
					 NULL);
	YUV_filter_caps = gst_caps_from_string("video/x-raw(memory:NVMM),format=I420");
	Ethernet_filter_caps = gst_caps_new_simple ("video/x-h264",
		"stream-format", G_TYPE_STRING, "byte-stream",
		NULL);

To actually apply the caps defined above, use g_object_set(). The first argument is the object instance, the second is the name of one of that object's properties (ref: gst-inspect-1.0), the third is the value to assign (here, the caps variable from above), and the last is always a NULL to indicate the end of the list of property/value pairs.

/* videotestsrc has no "caps" property; its output is constrained by the
	   bayer_filter capsfilter below.  With appsrc you would instead do:
	   g_object_set (G_OBJECT (source), "caps", source_caps, NULL); */
	g_object_set (G_OBJECT (bayer_filter), "caps", bayer_filter_caps, NULL);
	g_object_set (G_OBJECT (YUV_filter), "caps", YUV_filter_caps, NULL);
	g_object_set (G_OBJECT (Ethernet_filter), "caps", Ethernet_filter_caps, NULL);
	g_object_set (G_OBJECT (encoder), "bitrate", TARGET_BITRATE, "control-rate", 2, NULL); /* control-rate 2 = constant bitrate */
	g_object_set (G_OBJECT (rtp264), "pt", 96, "config-interval", 1, NULL);
	g_object_set (G_OBJECT (sink), "host", CLIENT_IP, "port", 5000, "sync", FALSE, "async", FALSE, NULL);

watcher_make() attaches the bus watch so that pipeline messages (end-of-stream, errors) are delivered to bus_call.

/* Add function to watch bus */
	if(watcher_make() != 0)
		return -1;  	
	/* Start playing */
	ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
	if (ret == GST_STATE_CHANGE_FAILURE) {
		g_printerr ("Unable to set the pipeline to the playing state.\n");
		gst_object_unref (pipeline);
		return -1;
	}
	g_main_loop_run(loop);

Deallocate memory before exiting for a graceful shutdown.

/* Free resources */
	gst_caps_unref (source_caps);
	gst_caps_unref (bayer_filter_caps);
	gst_caps_unref (YUV_filter_caps);
	gst_caps_unref (Ethernet_filter_caps);
	gst_element_set_state (pipeline, GST_STATE_NULL);
	gst_object_unref (GST_OBJECT (pipeline));
	g_main_loop_unref (loop);

	return 0;
}

For completeness, the entire, uninterrupted code follows:

#include <gst/gst.h>
//#include <gst/app/gstappsrc.h>
#include <stdio.h>
#include <iostream>
#include <sstream>
//#include <time.h>
#include <signal.h>
#include <string.h>

#define CLIENT_IP "130.134.235.179"
#define TARGET_BITRATE 15000000
#define WIDTH 1504
#define HEIGHT 1504

static GMainLoop *loop;
GstElement *pipeline, *source, *bayer_filter, *bayer2rgb_conv, *Nvidia_vid_conv, *encoder,*YUV_filter, *Ethernet_filter, *parser, *rtp264, *sink;
GstCaps *source_caps, *bayer_filter_caps, *YUV_filter_caps, *Ethernet_filter_caps;
GstBus *bus;

GstStateChangeReturn ret;
guint bus_watch_id;

static void sigint_restore (void)
{
  struct sigaction action;

  memset (&action, 0, sizeof (action));
  action.sa_handler = SIG_DFL;

  sigaction (SIGINT, &action, NULL);
}

/* Signal handler for ctrl+c */
void intHandler(int dummy) {	
	//! Emit the EOS signal which tells all the elements to shut down properly:
	printf("Sending EOS signal to shutdown pipeline cleanly\n");
	gst_element_send_event(pipeline, gst_event_new_eos());
	sigint_restore();
	return;
}

static gboolean bus_call (GstBus *bus, GstMessage *msg, gpointer data)
{
	GMainLoop *loop = (GMainLoop *) data;

	switch (GST_MESSAGE_TYPE (msg)) {

		case GST_MESSAGE_EOS:
			g_print ("End of stream\n");
			g_main_loop_quit (loop);
			break;

		case GST_MESSAGE_ERROR: {
			gchar  *debug;
			GError *error;

			gst_message_parse_error (msg, &error, &debug);
			g_free (debug);

			g_printerr ("Error: %s\n", error->message);
			g_error_free (error);

			g_main_loop_quit (loop);
			break;
		}

		default:
			break;
	}

	return TRUE;
}
int watcher_make()
{
	/* we add a message handler */
	bus = gst_pipeline_get_bus (GST_PIPELINE (pipeline));
	bus_watch_id = gst_bus_add_watch (bus, bus_call, loop);
	gst_object_unref (bus);	
	return 0;
}
int main(int argc, char *argv[]) {
	signal(SIGINT, intHandler);

	/* Initialize GStreamer */
	gst_init (&argc, &argv);
	loop = g_main_loop_new (NULL, FALSE);

	/* Create the elements */
	//	source = gst_element_factory_make ("appsrc", "source");
	source   = gst_element_factory_make ("videotestsrc",	"source");
	bayer_filter  = gst_element_factory_make ("capsfilter", "bayer_filter");
	bayer2rgb_conv	 = gst_element_factory_make ("bayer2rgb", "bayer2rgb_conv");
	Nvidia_vid_conv	 = gst_element_factory_make ("nvvidconv", "Nvidia_vid_conv");
	YUV_filter	 = gst_element_factory_make ("capsfilter", "YUV_filter");	
	encoder  = gst_element_factory_make ("omxh264enc", "encoder");
	Ethernet_filter	 = gst_element_factory_make ("capsfilter", "Ethernet_filter");
	parser	 = gst_element_factory_make ("h264parse", "parser");
	rtp264 	 = gst_element_factory_make ("rtph264pay", "rtp264");
	sink 	 = gst_element_factory_make ("udpsink", "sink");	

	/* Create the empty pipeline */
	pipeline = gst_pipeline_new ("pipeline");

	if (!pipeline || !source || !bayer_filter || !bayer2rgb_conv || !Nvidia_vid_conv || !YUV_filter || !encoder || !Ethernet_filter || !parser || !rtp264 || !sink) {
		g_printerr ("Not all elements could be created.\n");
		return -1;
	}

	/* Build the pipeline */
	gst_bin_add_many (GST_BIN (pipeline), source, bayer_filter, bayer2rgb_conv, Nvidia_vid_conv, YUV_filter, encoder, Ethernet_filter, parser, rtp264, sink, NULL);

	/* Link the elements together */
	gst_element_link_many (source, bayer_filter, bayer2rgb_conv, Nvidia_vid_conv, YUV_filter, encoder, Ethernet_filter, parser, rtp264, sink, NULL);	

	/* Modify the caps properties */
	source_caps = gst_caps_new_simple ("video/x-bayer",
					 "format", G_TYPE_STRING, "rggb",
					 "width", G_TYPE_INT, WIDTH,
					 "height", G_TYPE_INT, HEIGHT,
					 "framerate", GST_TYPE_FRACTION, 120, 1,
					 NULL);

	bayer_filter_caps = gst_caps_new_simple ("video/x-bayer",
					 "format", G_TYPE_STRING, "rggb",
					 "width", G_TYPE_INT, WIDTH,
					 "height", G_TYPE_INT, HEIGHT,
					 "framerate", GST_TYPE_FRACTION, 60, 1,
					 NULL);
	
	YUV_filter_caps = gst_caps_from_string("video/x-raw(memory:NVMM),format=I420");

	Ethernet_filter_caps = gst_caps_new_simple ("video/x-h264",
		"stream-format", G_TYPE_STRING, "byte-stream",
		NULL);

	/* videotestsrc has no "caps" property; its output is constrained by the
	   bayer_filter capsfilter below.  With appsrc you would instead do:
	   g_object_set (G_OBJECT (source), "caps", source_caps, NULL); */
	g_object_set (G_OBJECT (bayer_filter), "caps", bayer_filter_caps, NULL);
	g_object_set (G_OBJECT (YUV_filter), "caps", YUV_filter_caps, NULL);
	g_object_set (G_OBJECT (Ethernet_filter), "caps", Ethernet_filter_caps, NULL);
	g_object_set (G_OBJECT (encoder), "bitrate", TARGET_BITRATE, "control-rate", 2, NULL); /* control-rate 2 = constant bitrate */
	g_object_set (G_OBJECT (rtp264), "pt", 96, "config-interval", 1, NULL);
	g_object_set (G_OBJECT (sink), "host", CLIENT_IP, "port", 5000, "sync", FALSE, "async", FALSE, NULL);
	/* Add function to watch bus */
	if(watcher_make() != 0)
		return -1;  	

	/* Start playing */
	ret = gst_element_set_state (pipeline, GST_STATE_PLAYING);
	if (ret == GST_STATE_CHANGE_FAILURE) {
	g_printerr ("Unable to set the pipeline to the playing state.\n");
	gst_object_unref (pipeline);
	return -1;
	}
	g_main_loop_run(loop);

	/* Free resources */
	gst_caps_unref (source_caps);
	gst_caps_unref (bayer_filter_caps);
	gst_caps_unref (YUV_filter_caps);
	gst_caps_unref (Ethernet_filter_caps);
	gst_element_set_state (pipeline, GST_STATE_NULL);
	gst_object_unref (GST_OBJECT (pipeline));
	g_main_loop_unref (loop);

	return 0;
}

Thanks for posting this! I’ve just been using gst-launch scripts, but I’ll want to make a nice executable later. This will save me some time.

This is a very helpful post. Thanks a lot! After several days of research, your use case seems the closest to my own. I’m wondering if you might have a look at my use case before I dive in.

I am using a USB 3.0 camera and CUDA to do some image processing on the live stream. I am using OpenGL to display the processed image buffer on a TX1, and I’d also like to send the processed image buffer over Wi-Fi UDP using H.264/H.265 to a laptop. I’ve been wrestling a lot with how to do this in a way that A) makes sure I use the hardware acceleration for H.264/H.265 encoding provided by the TX1, and B) doesn’t require me to write a GStreamer plugin (things get complicated because I am already using UVC, OpenGL, CUDA, etc.). This seems like a common use case for a TX1, but I’ve found the documentation a bit murky.

You commented out a line containing “appsrc”. I know how to prepare an external buffer for GStreamer, but would you please let me know how to modify the pipeline caps so that this buffer properly goes through NVMM and the hardware encoder? I think I get it, but any warnings about pitfalls would be appreciated. Ideally, I just want to use GStreamer to take care of video encoding and streaming over Wi-Fi UDP. Lastly, I’d modify the code to avoid a GMainLoop, like in the example code linked below:

[url]https://gist.github.com/floe/e35100f091315b86a5bf[/url]
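For concreteness, the path I have in mind is roughly this (my own untested sketch; the RGBA format, resolution, and framerate are placeholders for whatever my CUDA output actually is):

appsrc ! video/x-raw,format=RGBA,width=1920,height=1080,framerate=30/1 ! nvvidconv ! video/x-raw(memory:NVMM),format=I420 ! omxh264enc ! h264parse ! rtph264pay pt=96 ! udpsink host=<laptop IP> port=5000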

For some reason your comment wasn’t visible when I left my original response. But now I can see it…

You have an image buffer you want to encode and send over wifi. I would expect the pipeline would look a lot like mine once you get the image buffer into a format compatible with nvvidconv. Is your image buffer in RGB?
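If it is, I’d expect the appsrc caps to look something like this (my guess, untested; check gst-inspect-1.0 nvvidconv for the raw formats it accepts):

source_caps = gst_caps_from_string ("video/x-raw,format=RGBA,width=1504,height=1504,framerate=120/1");
g_object_set (G_OBJECT (source), "caps", source_caps, NULL);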
I haven’t started on the appsrc element yet, but I have to incorporate it with Point Grey’s camera control software, FlyCapture2. I am not sure how involved that will be, but I’ll let you know what I come up with. How do you control your USB camera? Is the GitHub code functional?

I see. I will be working on this use case over the coming days. I will let you know what I find. Please consider sharing as well.

Hi Dan,

I’m sure you’re following the other thread you posted regarding getting an appsrc program going, but just in case, I responded to [url]https://devtalk.nvidia.com/default/topic/1002805/jetson-tx1/mjstreamingerror-internal-data-flow-error-/?offset=9#5133272[/url].

Incidentally, I’m using libuvc to grab frames from my camera. I then process them in an OpenGL display loop using CUDA kernels. It is the output of these kernels that I wish to 1) stream over UDP and 2) render to a display using OpenGL. The latter I have down, but getting this GStreamer H.264 hardware encoding and streaming to work is being a pain. Did you get yours working? I can’t even get a toy case to work.

I am returning to this task after a couple of weeks working on another project. It looks like I integrated the Point Grey FlyCapture2 API into my videotestsrc code. It compiles and captures frames, but I haven’t figured out how to get the frames into my GStreamer pipeline.
fc2_gst.cpp (10.4 KB)

Hi Dan,

To get frames into the gst pipeline, the appsrc element should be properly configured.
That means your application needs to set “blocksize”, and “is-live” if required (i.e., you need to choose the fps originator: the gstreamer thread or the camera capture thread).
From your code it looks like you want 120 fps paced by gstreamer, but the framerate originator is actually the camera (an fps configuration should exist for this camera).
Thus you may set the appsrc property “is-live” to TRUE, set the appsrc property “do-timestamp” to TRUE (meaning it’s up to gstreamer to set correct timestamps on the buffers), and create a while-loop thread with code like this:

FlyCapture2::Image frame;
GstBuffer *buffer;
GstFlowReturn ret;
int size = width * height; /* match to "blocksize" of appsrc element */

while (1) {
	grabFrame (&frame);
	buffer = gst_buffer_new_allocate (NULL, size, NULL);
	gst_buffer_fill (buffer, 0, frame.GetData(), size);
	g_signal_emit_by_name (source /* your appsrc element */, "push-buffer", buffer, &ret);
	gst_buffer_unref (buffer);
	/* break when you need, or if (ret != GST_FLOW_OK) */
}
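The appsrc property setup mentioned above might look like this (a sketch; the “blocksize” here assumes one byte per pixel, so adjust it to match your actual frame size):

g_object_set (G_OBJECT (source),
	"is-live", TRUE,      /* the camera thread originates the framerate */
	"do-timestamp", TRUE, /* GStreamer timestamps each pushed buffer */
	"blocksize", width * height,
	NULL);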

Hello Dan,

Thanks for your topic. I have a question: can I run your code on the Jetson Nano, or must it be changed? I’m new to NVIDIA.

Thanks a lot.

Dan, I love you! Thank you!
