How to capture video using gst-launch-1.0?

I’m trying to get GStreamer working on my TX2. I’m following this (slightly old) tutorial.

My TX2 has no display, so I’d like to capture images and video to disk, for viewing offline. This works to capture a JPEG image:

gst-launch-1.0 -e nvcamerasrc ! "video/x-raw(memory:NVMM), width=1920, height=1080, format=I420" ! nvjpegenc ! multifilesink location=snapshot.jpg

I tried the following to capture video, but the resulting file showed up as a zero-length video:

gst-launch-1.0 nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)60/1' ! nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)I420' ! multifilesink location=test.mp4

I also tried this, which resulted in a larger file size, but still a zero-length video:

gst-launch-1.0 -e nvcamerasrc ! "video/x-raw(memory:NVMM), width=1920, height=1080, format=I420" ! omxh264enc ! multifilesink location=test.mp4

What am I doing wrong?

I used to record audio+video with

$ apt-get install gstreamer1.0-plugins-bad
$ gst-launch-1.0 nvcamerasrc num-buffers=300 ! omxh264enc ! queue ! mux. alsasrc num-buffers=1000 device="hw:2,0" ! voaacenc ! queue ! qtmux name=mux ! filesink location=b.mp4

Thanks Andrey, I tried this, but got the following result:

Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0: Could not open audio device for recording.
Additional debug info:
gstalsasrc.c(744): gst_alsasrc_open (): /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0:
Recording open error on device 'hw:2,0': No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...

Probably because my carrier board doesn’t have any audio device in the device tree.

Can you tell me where I can find a reference for all the pipeline gibberish? There doesn’t seem to be anything on the GStreamer site (or maybe I’m just blind).

Just remove the audio part from the pipeline?
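For example, dropping the alsasrc/voaacenc branch from the earlier pipeline might look like this (an untested sketch; qtmux still provides the MP4 container, and num-buffers=300 is kept from the original to stop capture after 300 frames):

```shell
# Video-only variant of the earlier audio+video pipeline (sketch).
# No audio device is needed; qtmux writes a proper MP4 container.
gst-launch-1.0 -e nvcamerasrc num-buffers=300 ! omxh264enc ! queue ! qtmux ! filesink location=b.mp4
```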

This works for me:

gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc ! 'video/x-h264, stream-format=(string)byte-stream' ! filesink location=test.h264 -e

I can play it with:

mplayer -fps 30 ./test.h264

Obviously you need to apt-get install mplayer to use that tool for playback. There are also GStreamer hardware-accelerated decoder pipelines you could use instead.

Yes, I basically tried exactly that, and ended up with a 50MB file that shows up as a zero-length video in VLC.

Do you know of any way to record video that’s not MPlayer-specific? Something that can play in VLC, for example? I’d like to be able to publish videos without requiring users to install special tools…

Thanks for all your help!

Please use qtmux to get MP4 files.

gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc ! qtmux ! filesink location=test.mp4 -e

Do you know of any way to record video that’s not MPlayer-specific?

This is not mplayer-specific.

The actual problem you’re seeing is that you’re generating a “raw H264 stream” not a “fully muxed MP4 stream.”
The internet is full of discussions about this situation, and there are various solutions, either to get media players to play them, or to re-mux them into full MP4 containers.

The raw files can be played in VLC just fine: "Play raw H264 files using VLC" by Pete Houston on Medium
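For reference, VLC can be told to use its H.264 demuxer explicitly via the `--demux` option (the approach that article describes; the default assumed frame rate may not match your capture):

```shell
# Play a raw H.264 elementary stream in VLC by forcing the demuxer.
vlc --demux h264 test.h264
```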

If you already have files you want to re-mux into MP4, you can use tools such as ffmpeg to do it: ffmpeg - How to wrap H264 into a mp4 container? - Stack Overflow
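As a concrete example, re-muxing with ffmpeg is a stream copy (no re-encode), so it is fast; the `-framerate 30` input option is an assumption matching the 30 fps capture above:

```shell
# Wrap an existing raw H.264 elementary stream into an MP4 container.
# -c copy copies the compressed stream as-is, without re-encoding.
ffmpeg -framerate 30 -i test.h264 -c copy test.mp4
```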

Or, as DaneLLL said, use qtmux to generate mp4-muxed data while recording new files. (Little known fact: MP4 is actually the QuickTime container format, standardized! Hence the name “qtmux” for “quicktime mux”)


@DaneLLL: This was exactly what I was looking for. Worked like a charm.

@snarky: Thanks for that, super informative! I’m trying to learn more about GStreamer pipeline options, so anything helps!

…Looks like there’s no option to mark this thread “Solved”.

Hi, I know this is marked solved, but I tried out what @DaneLLL wrote.
How can I capture MP4 from a sensor that does not have the H264 encoder?

gst-launch-1.0 nvarguscamerasrc sensor-id=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! omxh264enc ! qtmux ! filesink location=test.mp4 -e

Here is my error message:
WARNING: erroneous pipeline: could not link nvarguscamerasrc0 to omxh264enc-omxh264enc0, nvarguscamerasrc0 can't handle caps video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1

Running the pipeline with gst-launch-1.0 -v will tell you info about each element in your stream, and gst-inspect of each element is the doc for that element. You have to match the output of each element to the input of the next to make them talk. 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' sets up the output from the camera, and if you inspect omxh264enc you might find it does not accept NVMM, or there is something else it does not like.

Start by removing things to get it to work, and then add things/controls back to meet your requirements.

Get it to work to a display first. You might need a videoconvert between the camera and omxh264enc, or use inspect to find another encoder. gst-inspect-1.0 is your friend.
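Putting those suggestions together, one possible sketch is to let nvvidconv copy frames out of NVMM memory before a software encoder. The caps and element choices here are assumptions to adapt (nvarguscamerasrc typically outputs NV12 rather than I420, which may be why the original caps failed to link), not a verified pipeline:

```shell
# nvvidconv converts from NVMM to system memory; x264enc is a software
# H.264 encoder that accepts plain video/x-raw I420 buffers.
gst-launch-1.0 -e nvarguscamerasrc sensor-id=1 \
  ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)NV12, framerate=(fraction)30/1' \
  ! nvvidconv ! 'video/x-raw, format=(string)I420' \
  ! x264enc ! qtmux ! filesink location=test.mp4
```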

Go to the NVIDIA documentation and search for "create video".

Hi,
Please take a look at
Capturing a video MP4 from Xavier AGX direct sensor (RAW)