SOLVED: Hardware-accelerated video encoding from a C++ buffer with GStreamer omxh264enc and filesink

First, let me clear my throat.

$ cat /etc/nv_tegra_release
# R28 (release), REVISION: 1.0, GCID: 9379712, BOARD: t186ref, EABI: aarch64, DATE: Thu Jul 20 07:59:31 UTC 2017
7f8fb47183cbd3d4cacc5eb50fc7869aacdefe40 */usr/lib/aarch64-linux-gnu/libv4l/plugins/libv4l2_nvvideocodec.so
c33af0b5f9a88ab8ff62b31c16c635aa2744902d */usr/lib/aarch64-linux-gnu/libv4l/plugins/libv4l2_nvvidconv.so
678e42e35687f11c9b5b602a539a56cc3de1188f */usr/lib/aarch64-linux-gnu/tegra/libnvomx.so
ccecdd04fb3ef95308a25a96ccf6670310400ba1 */usr/lib/aarch64-linux-gnu/tegra/libnveglstreamproducer.so
1653e5d266b7b030dc882d226b03f0c0157b4f3a */usr/lib/aarch64-linux-gnu/tegra/libnvtx_helper.so
fc646aa1d227d10ee3c338373e721ffade0b36d5 */usr/lib/aarch64-linux-gnu/tegra/libnvddk_vic.so
c255aeebc742731b2a1e796816178b3b8ffa7dea */usr/lib/aarch64-linux-gnu/tegra/libglx.so
6cc4acdeee4908f9c4ff4f4dcfedef71be190354 */usr/lib/aarch64-linux-gnu/tegra/libargus_socketserver.so
5b66cf6e49430ca8918835a8ea287b51c8b3b941 */usr/lib/aarch64-linux-gnu/tegra/libnvmmlite.so
9f8b91a4b08d160d5f473b43e9874a5c24c66e88 */usr/lib/aarch64-linux-gnu/tegra/libnvddk_2d_v2.so
a3a2931e3fe2e5d40f6783d7d6ce3639ac88f0ed */usr/lib/aarch64-linux-gnu/tegra/libnvwinsys.so
7259ca958e6e595bfd3f50b914b9f51b12419ba8 */usr/lib/aarch64-linux-gnu/tegra/libargus.so
1a599b8a1f7e5abbbc8b3e51d3f48bcc5124f51a */usr/lib/aarch64-linux-gnu/tegra/libnvmm.so
97cf051cc8ac5aecf158bc3c85feb83b89fefdd9 */usr/lib/aarch64-linux-gnu/tegra/libnvjpeg.so
3c48f3b81b1b7b333df5b261dc920736194e1f95 */usr/lib/aarch64-linux-gnu/tegra/libnvexif.so
cbe774108f73fe79b48ae7357e08e9413344dabc */usr/lib/aarch64-linux-gnu/tegra/libnvdc.so
8a9db15bf96f8c89967e96f1f55a116f0ad8853c */usr/lib/aarch64-linux-gnu/tegra/libnvavp.so
429e26afbfaf88b8ace2e0fa35207a2499cec9e1 */usr/lib/aarch64-linux-gnu/tegra/libnvtestresults.so
3f48c67f1c1650df51586078d1d8d3bd3740b025 */usr/lib/aarch64-linux-gnu/tegra/libargus_socketclient.so
4afd017ec9e9a16138da168c29a70f6bef4bd868 */usr/lib/aarch64-linux-gnu/tegra/libnvmm_utils.so
4336787797e9727d6fead71a027a5b5a10105a21 */usr/lib/aarch64-linux-gnu/tegra/libnvfnet.so
0eeac3a25c46c2095db087f69fcb2da8e7c51855 */usr/lib/aarch64-linux-gnu/tegra/libnvll.so
1f2fc2642f5cd373b5db26921d85014b6abf840d */usr/lib/aarch64-linux-gnu/tegra/libnvcameratools.so
d5c04359d52a3d594fa091a18653426262a7197a */usr/lib/aarch64-linux-gnu/tegra/libnvapputil.so
8f47a0da6cc1e75b4ebd2ae33f83503249b6d5ee */usr/lib/aarch64-linux-gnu/tegra/libnveglstream_camconsumer.so
0667aa3962b8eda69023ac17830a2efc016cad4f */usr/lib/aarch64-linux-gnu/tegra/libnvrm.so
8653db228561f903a452d4cab1cc632ca3315881 */usr/lib/aarch64-linux-gnu/tegra/libnvcam_imageencoder.so
fd427c65f562573a5826fd29fc5410f0290ad52f */usr/lib/aarch64-linux-gnu/tegra/libtegrav4l2.so
183da5b0281e0ee120545e2eaa99f56a0bb89d02 */usr/lib/aarch64-linux-gnu/tegra/libnvparser.so
beb1786a7d0e9464e98bdf3dda5d11c994069b8a */usr/lib/aarch64-linux-gnu/tegra/libnvtvmr.so
07c1e569a35cb39c77728ecbb7212f6339c8fd68 */usr/lib/aarch64-linux-gnu/tegra/libnvrm_gpu.so
ae214a66a4fe6ef66c15ea40a0a03dadb8055f72 */usr/lib/aarch64-linux-gnu/tegra/libnvtnr.so
ad4f99d3c3a6daa5829678a0defd1b2345b1c3b1 */usr/lib/aarch64-linux-gnu/tegra/libnvcamerautils.so
cb725c103def5f9c8f0e25205b3b39eab4642721 */usr/lib/aarch64-linux-gnu/tegra/libnvidia-egl-wayland.so
9673606cfb805c3e1563fcdf1256cfb6c95fecc9 */usr/lib/aarch64-linux-gnu/tegra/libnvfnetstoredefog.so
f213ecec058176a1830e0621907f28176f57ff7e */usr/lib/aarch64-linux-gnu/tegra/libnvodm_imager.so
a54283f9ed83ef15d6bd433d97e6a53e73176219 */usr/lib/aarch64-linux-gnu/tegra/libnvmmlite_utils.so
532626aba510a1b8d586c04b23011ad7f48ff351 */usr/lib/aarch64-linux-gnu/tegra/libnvcolorutil.so
cc9f715b1fd1b9719ff845f1b9c07c3f3162fe2e */usr/lib/aarch64-linux-gnu/tegra/libnvmmlite_video.so
24237d67a163325fa3bf5194c5934bb944f71b58 */usr/lib/aarch64-linux-gnu/tegra/libnvmmlite_image.so
24a0ba50281234b4fe3205032a39eb88b89d2fd5 */usr/lib/aarch64-linux-gnu/tegra/libnvmm_contentpipe.so
cde93d596b8976787dc3f5f5bff1a3ec49dc8a13 */usr/lib/aarch64-linux-gnu/tegra/libnvimp.so
340c90216c7662a2ae6df0d9f5db961c9c7b3752 */usr/lib/aarch64-linux-gnu/tegra/libnvos.so
331481e2895586a29de020f40a1a288e2fc8d58b */usr/lib/aarch64-linux-gnu/tegra/libnvrm_graphics.so
56d4dd97a4073b605a4c906caaee0224affda2a4 */usr/lib/aarch64-linux-gnu/tegra/libnvmedia.so
7c627627fbc26a280c5f395ba04ab01891f4341f */usr/lib/aarch64-linux-gnu/tegra/libnvfnetstorehdfx.so
0c3deb2a856368700fcc226110238e3299054b1b */usr/lib/aarch64-linux-gnu/tegra/libnvomxilclient.so
4f20b8cc95d69177ce108423cf5bac116e58a8c9 */usr/lib/aarch64-linux-gnu/tegra/libnvcamlog.so
1afa41bd35fc74e4f978875c0a6db0a8997201e5 */usr/lib/aarch64-linux-gnu/tegra/libnvmm_parser.so
d95121ac07e17d56500763b74d2fb29159fea85d */usr/lib/aarch64-linux-gnu/tegra/libscf.so
65acd5f0844c6dc12b71cf6fa46baf0d6c8e9a70 */usr/lib/aarch64-linux-gnu/tegra/libnvosd.so
3ef04ac64cac4cbe8f5c25414d2b71373d3a99a0 */usr/lib/xorg/modules/drivers/nvidia_drv.so
c255aeebc742731b2a1e796816178b3b8ffa7dea */usr/lib/xorg/modules/extensions/libglx.so

In short: “What takes RGB data as input and outputs H264 video, using hardware encoding?”

Next, I would like to list a few assumptions I have about encoding video, and see if the kind community would mind confirming or denying them.

  • There are four main ways to encode H264 on the Jetson:
  • Tegra Multimedia API - using Video4Linux2 as the data source
  • Gstreamer 1.0 command line, specifying input and output files and capability/format strings
  • Gstreamer 1.0 appsink/appsrc, using C++ code to interface with gstreamer
  • OpenMax - interfacing directly with the OpenMAX hardware driver and delivering data

Are all of the above accurate?

If you feel I have a grasp on my surroundings, please help me find a solution to this issue:

  • I am using C++ to write my program.
  • I have a pointer to a data block of known length containing formatted RGB data.
  • I would like to pass this pointer to any of the above methods of encoding.
  • The source data at the pointer can be a single frame, or multiple frames in sequence.
  • The output would not be returned to the program, but written to a file.

For argument's sake, let's say the RGB data is ARGB8888, meaning four bytes per pixel, one byte per color channel (0-255). This could easily be any other format that works as input for the encoder.
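
To put rough numbers on that (my own arithmetic; num_frames is just a placeholder for however many frames the buffer holds):

	// Buffer sizes for ARGB8888 frames at 1280x720 (plain arithmetic)
	const size_t bytes_per_pixel = 4;                         // A, R, G, B
	const size_t frame_bytes = 1280 * 720 * bytes_per_pixel;  // 3,686,400 bytes per frame
	const size_t buffer_bytes = frame_bytes * num_frames;     // frames in sequence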

The most promising approach I have found is GStreamer appsrc reading data from a C program buffer, but I am very unclear on how to implement it. I have checked the documentation on that (not NVIDIA's realm) and fail to see what I am missing. Frankly, it starts talking about things I don't understand.

I would like to use the Tegra Multimedia API (high level), but I am unsure how to eliminate V4L2 as the input and provide my own data instead. I see no function in the docs that would lead me to believe I can insert data this way.

For clarity: I am able to use GStreamer on the command line with the omxh264enc element to produce a correct H264 video, using a different test file as input.
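
For reference, my working command-line test is shaped roughly like this (reconstructed from memory, so treat the caps as a sketch):

gst-launch-1.0 videotestsrc num-buffers=120 ! 'video/x-raw, format=(string)I420, width=(int)1280, height=(int)720' ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! filesink location=test.h264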

Would any experienced heads care to clear up some of my misunderstandings?

Additionally, this is just a test case. I will work from any solution I find into delivering source data frame-by-frame to the H264 encoder.

– Below is some code I am working with to achieve my solution, but I am failing :'(
– These are snippets of a larger program I am writing.

-- Unsure how to get rid of V4L2, as it seems that's all the Tegra MMAPI understands.
	int colorWidth = 1280;
	int colorHeight = 720;

	context_t encoderctx;
	int ret = 0;
	int error = 0;
	bool eos = false;

	//Give it what it needs to breathe, man
	memset(&encoderctx, 0, sizeof(context_t));

	encoderctx.in_file_path = "sourceframes.data";
	encoderctx.width = colorWidth;
	encoderctx.height = colorHeight;
	//V4L2 ARGB8888 is the fourcc 'BA24' (don't ask why); this is the same as v4l2_fourcc('B','A','2','4')
	unsigned int formatInput = ( 
				   (  __u32)('B')        | 
				   ( (__u32)('A') << 8 ) | 
				   ( (__u32)('2') << 16) | 
				   ( (__u32)('4') << 24) );
	encoderctx.encoder_pixfmt = formatInput;
	//encoderctx.encoder_pixfmt = V4L2_PIX_FMT_H264;
	encoderctx.out_file_path = "video.h264";
	encoderctx.fps_n = 1;
	encoderctx.fps_d = 1;
	encoderctx.bitrate = 4 * 1024 * 1024;    

	encoderctx.in_file = new ifstream(encoderctx.in_file_path);
	encoderctx.out_file = new ofstream(encoderctx.out_file_path);
	encoderctx.enc = NvVideoEncoder::createVideoEncoder("enc0");

	// Set encoder capture plane format
	ret = encoderctx.enc->setCapturePlaneFormat(	encoderctx.encoder_pixfmt, 
							encoderctx.width, encoderctx.height, 
							2 * 1024 * 1024 );
	// Set encoder output plane format ('YM12' is V4L2_PIX_FMT_YUV420M,
	// i.e. multi-planar YUV 4:2:0)
	unsigned int outputFormat = ( 
				(  __u32)('Y')        | 
				( (__u32)('M') << 8 ) | 
				( (__u32)('1') << 16) | 
				( (__u32)('2') << 24) );
	ret = encoderctx.enc->setOutputPlaneFormat(	outputFormat , 
							encoderctx.width, encoderctx.height );

	ret = encoderctx.enc->setBitrate(encoderctx.bitrate);
	//if (encoderctx.encoder_pixfmt == V4L2_PIX_FMT_H264) {
	//	ret = encoderctx.enc->setProfile(V4L2_MPEG_VIDEO_H264_PROFILE_HIGH);
	//}
	//else {
	ret = encoderctx.enc->setProfile(V4L2_MPEG_VIDEO_H265_PROFILE_MAIN);
	//}

	//if (encoderctx.encoder_pixfmt == V4L2_PIX_FMT_H264) {
	//	ret = encoderctx.enc->setLevel(V4L2_MPEG_VIDEO_H264_LEVEL_5_0);
	//}

	ret = encoderctx.enc->setFrameRate(encoderctx.fps_n, encoderctx.fps_d);

	// Query, Export and Map the output plane buffers so that we can read
	// raw data into the buffers
	ret = encoderctx.enc->output_plane.setupPlane(V4L2_MEMORY_MMAP, 10, true, false);
	// Query, Export and Map the output plane buffers so that we can write
	// encoded data from the buffers
	ret = encoderctx.enc->capture_plane.setupPlane(V4L2_MEMORY_MMAP, 10, true, false);

	// output plane STREAMON
	ret = encoderctx.enc->output_plane.setStreamStatus(true);

	// capture plane STREAMON
	ret = encoderctx.enc->capture_plane.setStreamStatus(true);

	encoderctx.enc->capture_plane.setDQThreadCallback(encoder_capture_plane_dq_callback);

	// startDQThread starts a thread internally which calls the
	// encoder_capture_plane_dq_callback whenever a buffer is dequeued
	// on the plane
	encoderctx.enc->capture_plane.startDQThread(&encoderctx);

	// Enqueue all the empty capture plane buffers
	for (uint32_t i = 0; i < encoderctx.enc->capture_plane.getNumBuffers(); i++) {
		struct v4l2_buffer v4l2_buf;
		struct v4l2_plane planes[MAX_PLANES];
	
		memset(&v4l2_buf, 0, sizeof(v4l2_buf));
		memset(planes, 0, MAX_PLANES * sizeof(struct v4l2_plane));
		v4l2_buf.index = i;
		v4l2_buf.m.planes = planes;
		///! Inject data here?
		ret = encoderctx.enc->capture_plane.qBuffer(v4l2_buf, NULL);
		if (ret < 0) {
			printf("Error while queueing buffer at capture plane\n");
			abort(&encoderctx);
			goto cleanup;
		}
	}

	// Read video frame and queue all the output plane buffers
	for (	uint32_t i = 0; 
		i < encoderctx.enc->output_plane.getNumBuffers() && !encoderctx.got_error; 
		i++ ) {
		
		struct v4l2_buffer v4l2_buf;
		struct v4l2_plane planes[MAX_PLANES];
		NvBuffer *buffer = encoderctx.enc->output_plane.getNthBuffer(i);
		memset(&v4l2_buf, 0, sizeof(v4l2_buf));
		memset(planes, 0, MAX_PLANES * sizeof(struct v4l2_plane));
		v4l2_buf.index = i;
		v4l2_buf.m.planes = planes;
		if (read_video_frame(encoderctx.in_file, *buffer) < 0) {
			printf("Could not read complete frame from input file\n");
			v4l2_buf.m.planes[0].bytesused = 0;
		}

		ret = encoderctx.enc->output_plane.qBuffer(v4l2_buf, NULL);
		if (ret < 0) {
			printf("Error while queueing buffer at output plane\n");
			abort(&encoderctx);
			goto cleanup;
		}

		if (v4l2_buf.m.planes[0].bytesused == 0) {
			printf("File read complete.\n");
			eos = true;
			break;
		}
	}
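
For completeness, my capture-plane callback is adapted from the MMAPI video_encode sample; it writes each dequeued chunk of encoded bitstream to the output file and re-queues the buffer:

	static bool encoder_capture_plane_dq_callback(struct v4l2_buffer *v4l2_buf,
			NvBuffer *buffer, NvBuffer *shared_buffer, void *arg)
	{
		context_t *ctx = (context_t *)arg;
		if (!v4l2_buf)
			return false; // error on the capture plane
		ctx->out_file->write((char *)buffer->planes[0].data,
				buffer->planes[0].bytesused);
		if (buffer->planes[0].bytesused == 0)
			return false; // a zero-size buffer signals EOS
		if (ctx->enc->capture_plane.qBuffer(*v4l2_buf, NULL) < 0)
			return false;
		return true;
	}

My best guess for feeding my own pointer instead of read_video_frame() is a helper like the one below. I am inferring the NvBuffer plane fields (data, fmt.stride, fmt.bytesperpixel) from how the sample's read function uses them, so treat it as an unverified sketch:

	// Hypothetical: copy one raw frame from an application buffer into an
	// output-plane NvBuffer, one row at a time (the stride may exceed the row width)
	static void fill_buffer_from_memory(NvBuffer &buffer, const uint8_t *src)
	{
		for (uint32_t p = 0; p < buffer.n_planes; p++) {
			NvBuffer::NvBufferPlane &plane = buffer.planes[p];
			uint32_t row_bytes = plane.fmt.bytesperpixel * plane.fmt.width;
			uint8_t *dst = plane.data;
			for (uint32_t row = 0; row < plane.fmt.height; row++) {
				memcpy(dst, src, row_bytes);
				dst += plane.fmt.stride;
				src += row_bytes;
			}
			plane.bytesused = plane.fmt.stride * plane.fmt.height;
		}
	}
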
-- Gstreamer plugin with appsrc, looks simple, but it's not to me. :''(
	app = new _App();

	app->src = (GstAppSrc*)gst_element_factory_make("appsrc", "source");
	app->encoder = gst_element_factory_make("omxh264enc", "encoder");
	app->sink = (GstAppSink*)gst_element_factory_make("filesink", "sink");

	if (!app->pipeline || !app->src || !app->encoder || !app->sink)
		return;

	app->bus = gst_pipeline_get_bus(GST_PIPELINE(app->pipeline));
	g_assert(app->bus);
	gst_bus_add_watch(app->bus, (GstBusFunc)BusMessage, this);

	gst_bin_add_many(GST_BIN(app->pipeline), (GstElement*)app->src, app->encoder, app->sink, NULL);

	// SETUP ELEMENTS

	g_object_set(app->src,
		"stream-type", 0,
		"format", GST_FORMAT_BUFFERS,
		"is-live", true,
		"block", true,
		NULL);

	g_object_set(app->src, "caps", gst_caps_new_simple("video/x-h264",
		NULL), NULL);

	g_signal_connect(app->src, "need-data", G_CALLBACK(StartFeed), this);
	g_signal_connect(app->src, "enough-data", G_CALLBACK(StopFeed), this);

	g_object_set(app->sink,
		"location", GenerateFileName().c_str(),
		"buffer-mode", 0,
		NULL);

	// PLAY

	GstStateChangeReturn ret = gst_element_set_state(app->pipeline, GST_STATE_PLAYING);

	if (ret == GST_STATE_CHANGE_FAILURE)
	{
		gst_object_unref(app->pipeline);
		return;
	}

I would be happy to PayPal anyone some beer money for an outright solution. This may be as simple as “have you seen the blurpaderp method on the doodanger class in the Tegra MMAPI?”

Thanks for reading my first nvidia post. :^)

https://gstreamer.freedesktop.org/data/doc/gstreamer/head/qt-gstreamer/html/examples_2appsink-src_2main_8cpp-example.html

This will be my digest and tinker for tonight. It looks simple enough…

Will post answers if I find any.

I have made some more searches over in GStreamer land, and apparently I have sniffed something out that will help me.

http://gstreamer-devel.966125.n4.nabble.com/appsrc-usage-push-and-pull-mode-td4662768.html

This guy asks about the push and pull methods of interfacing with appsrc.
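
The short version, as I understand it: in push mode the application calls gst_app_src_push_buffer() whenever it has data ready; in pull mode the application waits for appsrc to fire the need-data signal and feeds it from a callback.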

Hi techtruth,
On TX2, we support

  • Tegra Multimedia API - using Video4Linux2 as the data source
  • Gstreamer 1.0 command line, specifying input and output files and capability/format strings
  • Gstreamer 1.0 appsink/appsrc, using C++ code to interface with gstreamer

and don’t support OpenMax.

The input to the encoder has to be YUV420 (I420 or NV12). You can use nvvidconv in GStreamer, or NvVideoConverter in the V4L2-based Multimedia API, to do the conversion.
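
For example, a pipeline of roughly this shape converts RGBA to NV12 in NVMM memory before encoding (illustrative; confirm that nvvidconv advertises RGBA input on your release):

gst-launch-1.0 videotestsrc num-buffers=120 ! 'video/x-raw, format=(string)RGBA, width=(int)1280, height=(int)720' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! omxh264enc ! filesink location=out.h264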

Thanks DaneLLL. You have confirmed my thought that I should use appsink/appsrc to produce H264-encoded video to a file from a C++ program.

You have also saved me some time in figuring out which input format I should use. I would have been stuffing RGB into it and wondering what's wrong. haha. YUV420 (I420) or NV12 it is. I will likely use NV12, because I sense special magic afoot in how the encoder handles data in that format.
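
For my own notes, the two layouts differ only in how the chroma is stored (standard 4:2:0 facts, nothing Jetson-specific):

	// I420: three planes: Y (w*h bytes), then U (w/2 * h/2), then V (w/2 * h/2)
	// NV12: two planes:   Y (w*h bytes), then one interleaved UVUV... plane
	// Either way, one frame is w * h * 3 / 2 bytes.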

I will also look into nvvidconv. Thank you for sharing these.

I have started to adapt the appsrc/appsink code from the above mailing list link. I will post some generic results to maybe make this a bit easier for the next guy.

Is there any resource or docs that you can think of that I perhaps missed on my search?

Here are two samples of using appsink:
https://devtalk.nvidia.com/default/topic/1011376/jetson-tx1/gstreamer-decode-live-video-stream-with-the-delay-difference-between-gst-launch-1-0-command-and-appsink-callback/post/5160929/#5160929
NVMM memory - Jetson TX1 - NVIDIA Developer Forums

For saving H264 stream to a file, you can simply use filesink.

Ok, thanks to all those who gave info, and those struggling with me.

I believe I have a very bad demo program put together. I say very bad because it only works half the time. :-D So I am still fighting bugs. But I do successfully transform a series of I420-formatted frames into an H264 video. So yay!

I will update this if I can fix the bugs; otherwise, I think this is a good start for anyone on this same path. Just doing my part to help the internet here. :-)

The program below outputs an H264-encoded, QuickTime-muxed file (written as output.h264). It takes an I420-encoded source video as input. You can generate an I420 test video by running this command:

gst-launch-1.0 videotestsrc num-buffers=200 ! 'video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, framerate=(fraction)1/1' ! filesink location=test.yuv -e
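
(Each I420 frame at 1280x720 is 1280 * 720 * 3 / 2 = 1,382,400 bytes; that is the magic number used as the per-frame read size in the code below.)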

Compile the source code below with the following g++ line:

g++ -std=c++11 -I/usr/include/gstreamer-1.0 -I/usr/lib/aarch64-linux-gnu/gstreamer-1.0/include -I/usr/include/glib-2.0 -I/usr/lib/aarch64-linux-gnu/glib-2.0/include  appsrc.cpp -pthread -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0 -lgstapp-1.0 -lgstbase-1.0

Source File: appsrc.cpp

#include <stdio.h>
#include <string.h>
#include <fstream>
#include <unistd.h>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

typedef struct {
	GstPipeline *pipeline;
	GstAppSrc  *src;
	GstElement *filter1;
	GstElement *encoder;
	GstElement *filter2;
	GstElement *parser;
	GstElement *qtmux;
	GstElement *sink;

	GstClockTime timestamp;
	guint sourceid;
} gst_app_t;

static gst_app_t gst_app;

int main()
{
	gst_app_t *app = &gst_app;
	GstStateChangeReturn state_ret;
	gst_init(NULL, NULL); //Initialize Gstreamer
	app->timestamp = 0; //Set timestamp to 0

	//Create pipeline, and pipeline elements
	app->pipeline = (GstPipeline*)gst_pipeline_new("mypipeline");
	app->src    =   (GstAppSrc*)gst_element_factory_make("appsrc", "mysrc");
	app->filter1 =  gst_element_factory_make ("capsfilter", "myfilter1");
	app->encoder =  gst_element_factory_make ("omxh264enc", "myomx");
	app->filter2 =  gst_element_factory_make ("capsfilter", "myfilter2");
	app->parser =   gst_element_factory_make("h264parse"  , "myparser");
	app->qtmux =    gst_element_factory_make("qtmux"      , "mymux");
	app->sink =     gst_element_factory_make ("filesink"  , NULL);
	
	if(	!app->pipeline || 
		!app->src      || !app->filter1 || 
		!app->encoder  || !app->filter2 || 
		!app->parser   || !app->qtmux    || 
		!app->sink    )  {
		printf("Error creating pipeline elements!\n");
		exit(2);
	}

	//Attach elements to pipeline
	gst_bin_add_many(
		GST_BIN(app->pipeline), 
		(GstElement*)app->src,
		app->filter1,	
		app->encoder,
		app->filter2,	
		app->parser,
		app->qtmux,
		app->sink,
		NULL);

	//Set pipeline element attributes
	g_object_set (app->src, "format", GST_FORMAT_TIME, NULL);
	GstCaps *filtercaps1 = gst_caps_new_simple ("video/x-raw",
		"format", G_TYPE_STRING, "I420",
		"width", G_TYPE_INT, 1280,
		"height", G_TYPE_INT, 720,
		"framerate", GST_TYPE_FRACTION, 1, 1,
		NULL);
	g_object_set (G_OBJECT (app->filter1), "caps", filtercaps1, NULL);
	GstCaps *filtercaps2 = gst_caps_new_simple ("video/x-h264",
		"stream-format", G_TYPE_STRING, "byte-stream",
		NULL);
	g_object_set (G_OBJECT (app->filter2), "caps", filtercaps2, NULL);
	g_object_set (G_OBJECT (app->sink), "location", "output.h264", NULL);

	//Link elements together
	g_assert( gst_element_link_many(
		(GstElement*)app->src, 
		app->filter1,
		app->encoder,
		app->filter2,
		app->parser,
		app->qtmux,
		app->sink,
		NULL ) );

	//Play the pipeline
	state_ret = gst_element_set_state((GstElement*)app->pipeline, GST_STATE_PLAYING);
	g_assert(state_ret == GST_STATE_CHANGE_ASYNC);

	//Get a pointer to the test input
	FILE *testfile = fopen("test.yuv", "rb");	
	g_assert(testfile != NULL);

	//Push the data from buffer to gstpipeline 100 times
	for(int i = 0; i < 100; i++) {
		char* filebuffer = (char*)malloc (1382400); //Allocate one I420 frame: 1280*720*3/2 bytes
		if (filebuffer == NULL) {printf("Memory error\n"); exit (2);} //Errorcheck
		size_t bytesread = fread(filebuffer, 1 , (1382400), testfile); //Read to filebuffer
		//printf("File Read: %zu bytes\n", bytesread);

		GstBuffer *pushbuffer; //Actual databuffer
		GstFlowReturn ret; //Return value
		pushbuffer = gst_buffer_new_wrapped (filebuffer, 1382400); //Wrap the data

		//Set frame timestamp
		GST_BUFFER_PTS      (pushbuffer) = app->timestamp;
		GST_BUFFER_DTS      (pushbuffer) = app->timestamp;	
		GST_BUFFER_DURATION (pushbuffer) = gst_util_uint64_scale_int (1, GST_SECOND, 1); //One second per frame, matching the 1/1 framerate caps
		app->timestamp += GST_BUFFER_DURATION (pushbuffer);
		//printf("Frame is at %lu\n", app->timestamp);

		ret = gst_app_src_push_buffer( app->src, pushbuffer); //Push data into pipeline

		g_assert(ret ==  GST_FLOW_OK);
	}
	usleep(100000);
	
	//Declare end of stream
	gst_app_src_end_of_stream (GST_APP_SRC (app->src));
	printf("End Program.\n");

	return 0;
}

Again, there are probably logical errors above.

Thanks!

Well, I'm not sure what the bugs are, but they seem to be around the caps set at the end of the run. When I don't get the following bit of output, my video won't play. When I do, it works fine.

0:00:00.399322176 28488       0x4ba800 INFO               GST_EVENT gstevent.c:679:gst_event_new_caps: creating caps event video/quicktime, variant=(string)apple, streamheader=(buffer)< 000006436d6f6f760000006c6d76686400000000d5f20babd5f20bab000007080002bf200001000001000000000000000000000000010000000000000000000000000000000100000000000000000000000000004000000000000000000000000000000000000000000000000000000000000002000005927472616b0000005c746b686400000007d5f20babd5f20bab00000001000000000002bf20000000000000000000000000000000000001000000000000000000000000000000010000000000000000000000000000400000000500000002d00000000004f16d646961000000206d64686400000000d5f20babd5f20bab000003e8000186a0000000000000002d68646c72000000006d686c72766964650000000000000000000000000c566964656f48616e646c65720000049c6d696e6600000014766d68640000000100408000800080000000002168646c720000000064686c72616c6973000000000000000000000000000000002464696e660000001c6472656600000000000000010000000c616c6973000000010000043b7374626c0000009b7374736400000000000000010000008b61766331000000000000000100000000000000000000020000000200050002d0004800000048000000000000000100000000000000000000000000000000000000000000000000000000000000000018ffff000000216176634301424034ffe1000a6742403495a014016e4001000468ce3c80000000146274727400000000000a64d000089f450000001873747473000000000000000100000064000003e800000020737473730000000000000004000000010000001f0000003d0000005b0000001c737473630000000000000001000000010000000100000001000001a47374737a00000000000000000000006400014b8a000111e1000111ca0001113b00011149000111ec0001124c000111220001111400011136000110d0000111810001112c00011130000111520001117a000111a7000111d100011108000110f6000111c300011129000111dd000111a3000111690001116e00011186000111090001114b000110d500014c8b0001117c000111630001111b000111b2000111bf0001113c0001110f000110fe000111560001119b000111a2000111ba0001109a0001117d000110d80001111a0001118f000111930001117e0001115d000111dd0001111c00011279000111490001120e0001127c0001115d000111870001116700014c9a0001113d0001110b000110fc0001118a000111d30001119c0001113800011233000111630001108300011153000111e50001116e00011114000111a500011183000110db0001115e000111f70001114b000111ad00011130000111f8000111a2000111440001114c0001116e0001115e0001110b00014c160001113f0001125e000110a9000110d3000110e20001115a000111390001120b0001111d000001a07374636f00000000000000640000002400014bae00025d8f00036f5900048094000591dd0006a3c90007b6150008c7370009d84b000ae981000bfa51000d0bd2000e1cfe000f2e2e00103f80001150fa001262a1001374720014857a001596700016a8330017b95c0018cb390019dcdc001aee45001bffb3001d1139001e2242001f338d00204462002190ed0022a2690023b3cc0024c4e70025d6990026e8580027f99400290aa3002a1ba1002b2cf7002c3e92002d5034002e61ee002f728800308405003194dd0032a5f70033b7860034c9190035da970036ebf40037fdd100390eed003a2166003b32af003c44bd003d5739003e6896003f7a1d00408b840041d81e0042e95b0043fa6600450b6200461cec00472ebf0048405b00495193004a63c6004b7529004c85ac004d96ff004ea8e4004fba520050cb660051dd0b0052ee8e0053ff69005510c7005622be00573409005845b6005956e6005a68de005b7a80005c8bc4005d9d10005eae7e005fbfdc0060d0e700621cfd00632e3c0064409a0065514300666216006772f8006884520069958b006aa7960000003d75647461000000356d657461000000000000002168646c72000000006d686c726d6469720000000000000000000000000000000008696c73740000003d75647461000000356d657461000000000000002168646c72000000006d686c726d6469720000000000000000000000000000000008696c7374 >

Very handy for this sort of thing is

export GST_DEBUG=3

Adding usleep(100000); after the gst_app_src_end_of_stream() line seems to have fixed the bug. It makes sense that I need to let the pipeline finish processing, I suppose. Here is the full corrected listing:

#include <stdio.h>
#include <string.h>
#include <fstream>
#include <unistd.h>
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

typedef struct {
	GstPipeline *pipeline;
	GstAppSrc  *src;
	GstElement *filter1;
	GstElement *encoder;
	GstElement *filter2;
	GstElement *parser;
	GstElement *qtmux;
	GstElement *sink;

	GstClockTime timestamp;
	guint sourceid;
} gst_app_t;

static gst_app_t gst_app;

int main()
{
	gst_app_t *app = &gst_app;
	GstStateChangeReturn state_ret;
	gst_init(NULL, NULL); //Initialize Gstreamer
	app->timestamp = 0; //Set timestamp to 0

	//Create pipeline, and pipeline elements
	app->pipeline = (GstPipeline*)gst_pipeline_new("mypipeline");
	app->src    =   (GstAppSrc*)gst_element_factory_make("appsrc", "mysrc");
	app->filter1 =  gst_element_factory_make ("capsfilter", "myfilter1");
	app->encoder =  gst_element_factory_make ("omxh264enc", "myomx");
	app->filter2 =  gst_element_factory_make ("capsfilter", "myfilter2");
	app->parser =   gst_element_factory_make("h264parse"  , "myparser");
	app->qtmux =    gst_element_factory_make("qtmux"      , "mymux");
	app->sink =     gst_element_factory_make ("filesink"  , NULL);
	
	if(	!app->pipeline || 
		!app->src      || !app->filter1 || 
		!app->encoder  || !app->filter2 || 
		!app->parser   || !app->qtmux    || 
		!app->sink    )  {
		printf("Error creating pipeline elements!\n");
		exit(2);
	}

	//Attach elements to pipeline
	gst_bin_add_many(
		GST_BIN(app->pipeline), 
		(GstElement*)app->src,
		app->filter1,	
		app->encoder,
		app->filter2,	
		app->parser,
		app->qtmux,
		app->sink,
		NULL);

	//Set pipeline element attributes
	g_object_set (app->src, "format", GST_FORMAT_TIME, NULL);
	GstCaps *filtercaps1 = gst_caps_new_simple ("video/x-raw",
		"format", G_TYPE_STRING, "I420",
		"width", G_TYPE_INT, 1280,
		"height", G_TYPE_INT, 720,
		"framerate", GST_TYPE_FRACTION, 1, 1,
		NULL);
	g_object_set (G_OBJECT (app->filter1), "caps", filtercaps1, NULL);
	GstCaps *filtercaps2 = gst_caps_new_simple ("video/x-h264",
		"stream-format", G_TYPE_STRING, "byte-stream",
		NULL);
	g_object_set (G_OBJECT (app->filter2), "caps", filtercaps2, NULL);
	g_object_set (G_OBJECT (app->sink), "location", "output.h264", NULL);

	//Link elements together
	g_assert( gst_element_link_many(
		(GstElement*)app->src, 
		app->filter1,
		app->encoder,
		app->filter2,
		app->parser,
		app->qtmux,
		app->sink,
		NULL ) );

	//Play the pipeline
	state_ret = gst_element_set_state((GstElement*)app->pipeline, GST_STATE_PLAYING);
	g_assert(state_ret == GST_STATE_CHANGE_ASYNC);

	//Get a pointer to the test input
	FILE *testfile = fopen("test.yuv", "rb");	
	g_assert(testfile != NULL);

	//Push the data from buffer to gstpipeline 100 times
	for(int i = 0; i < 100; i++) {
		char* filebuffer = (char*)malloc (1382400); //Allocate one I420 frame: 1280*720*3/2 bytes
		if (filebuffer == NULL) {printf("Memory error\n"); exit (2);} //Errorcheck
		size_t bytesread = fread(filebuffer, 1 , (1382400), testfile); //Read to filebuffer
		//printf("File Read: %zu bytes\n", bytesread);

		GstBuffer *pushbuffer; //Actual databuffer
		GstFlowReturn ret; //Return value
		pushbuffer = gst_buffer_new_wrapped (filebuffer, 1382400); //Wrap the data

		//Set frame timestamp
		GST_BUFFER_PTS      (pushbuffer) = app->timestamp;
		GST_BUFFER_DTS      (pushbuffer) = app->timestamp;	
		GST_BUFFER_DURATION (pushbuffer) = gst_util_uint64_scale_int (1, GST_SECOND, 1); //One second per frame, matching the 1/1 framerate caps
		app->timestamp += GST_BUFFER_DURATION (pushbuffer);
		//printf("Frame is at %lu\n", app->timestamp);

		ret = gst_app_src_push_buffer( app->src, pushbuffer); //Push data into pipeline

		g_assert(ret ==  GST_FLOW_OK);
	}
	
	//Declare end of stream
	gst_app_src_end_of_stream (GST_APP_SRC (app->src));
	printf("End Program.\n");
	usleep(100000); //Give the pipeline a moment to flush the EOS before exiting

	return 0;
}

Thank you for sharing the code.

Thanks for the code you shared. I was able to build it with a slightly modified command:

g++ -std=c++11 -I/usr/include/gstreamer-1.0 -I/usr/lib/aarch64-linux-gnu/gstreamer-1.0/include -I/usr/include/glib-2.0 -I/usr/lib/aarch64-linux-gnu/glib-2.0/include  appsrc.cpp -pthread -lgstreamer-1.0 -lgobject-2.0 -lglib-2.0 -lgstapp-1.0 -lgstbase-1.0 -ldrm

(basically, only -ldrm added).
But I am getting an empty output.h264 on exit. Do you have an idea what might be going wrong?

I'm not sure why you would need to add -ldrm if you have not modified the code above… Sorry if I misunderstand you.

GStreamer will create a blank file at the filesink location before it has any data to process. Once you push frames into the pipeline, everything should "just work".

Did you generate the test.yuv file? If you are missing that, I bet it would make a fuss about it.

I figured it out - it was all my fault. BTW, you can use pkg-config in the build command:

g++ --std=c++14 -Wall $(pkg-config --cflags gstreamer-1.0) appsrc.cpp $(pkg-config --libs gstreamer-1.0) -lgstapp-1.0

I like that much better.

Oh cool, glad you got it figured out! :)

I do like pkg-config, but tend to just hand-spin the lib and include flags. A handy tool for sure! I definitely don't know all the little features it has hidden away.

What ended up being your problem? I remember running into the video files being empty before.

Thank you for this example, very helpful!

I got this working just fine, but I have two questions. When I run it with GST_DEBUG turned on, I see the following warning. Should I be concerned, or is that configuration file not generally needed?

0:00:00.039275758  7942       0x7ef180 WARN                     omx gstomx.c:2836:plugin_init: Failed to load configuration file: Valid key file could not be found in search dirs (searched in: /home/nvidia/.config:/etc/xdg as per GST_OMX_CONFIG_DIR environment variable, the xdg user config directory (or XDG_CONFIG_HOME) and the system config directory (or XDG_CONFIG_DIRS)

Also, I have tried replacing all instances of “264” with “265”, but then I get an assertion failure on gst_element_link_many (without any other useful debug messages). What more is needed to set up H265 encoding correctly, and/or how can I debug this?

Thank you!

I'll quickly eat my own question: this was a muxing problem. I had to change qtmux to matroskamux. Works!
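
For anyone following along, the element swaps that worked for me look roughly like this (a sketch; element availability may vary by release):

	// H265 variants of the element creation and the encoder output caps
	app->encoder = gst_element_factory_make("omxh265enc", "myomx");
	app->parser  = gst_element_factory_make("h265parse", "myparser");
	app->qtmux   = gst_element_factory_make("matroskamux", "mymux"); // qtmux would not link
	GstCaps *filtercaps2 = gst_caps_new_simple("video/x-h265",
		"stream-format", G_TYPE_STRING, "byte-stream", NULL);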

Hi @techtruth,

Thank you for sharing your code, it is very helpful.

Regards,

Is there a better way than usleep to wait for the GStreamer pipeline to finish?

Please try replacing the usleep with the code below:

	// Wait for the EOS message to reach the end of the pipeline
	GstBus *bus = gst_pipeline_get_bus(GST_PIPELINE(app->pipeline));
	GstMessage *msg = gst_bus_poll(bus, GST_MESSAGE_EOS, GST_CLOCK_TIME_NONE);
	if (msg)
		gst_message_unref(msg);
	gst_object_unref(bus);