Thanks Andrey, I tried this, but got the following result:
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0: Could not open audio device for recording.
Additional debug info:
gstalsasrc.c(744): gst_alsasrc_open (): /GstPipeline:pipeline0/GstAlsaSrc:alsasrc0:
Recording open error on device 'hw:2,0': No such file or directory
Setting pipeline to NULL ...
Freeing pipeline ...
Probably because my carrier board doesn’t have any audio device in the device tree.
Can you tell me where I can find a reference for all the pipeline gibberish? There doesn’t seem to be anything on the GStreamer site (or maybe I’m just blind).
Obviously you need to apt-get install mplayer to use that tool for playback. There are also gstreamer hardware accelerated decoder pipelines you could use instead.
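For example, a hardware-accelerated playback pipeline on Jetson might look something like the sketch below. The element names omxh264dec and nvoverlaysink are assumptions based on typical JetPack installs, not something confirmed in this thread; check gst-inspect-1.0 on your board to see what is actually available.

```shell
# Sketch: hardware-accelerated playback of an MP4 file on Jetson.
# omxh264dec / nvoverlaysink are assumed element names; verify with gst-inspect-1.0.
gst-launch-1.0 filesrc location=test.mp4 ! qtdemux ! h264parse ! \
    omxh264dec ! nvoverlaysink
```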
Yes, I basically tried exactly that, and ended up with a 50MB file that shows up as a zero-length video in VLC.

Do you know of any way to record video that’s not MPlayer-specific? Something that can play in VLC, for example? I’d like to be able to publish videos without requiring users to install special tools…
Do you know of any way to record video that’s not MPlayer-specific?
This is not mplayer-specific.
The actual problem you’re seeing is that you’re generating a “raw H264 stream” not a “fully muxed MP4 stream.”
The internet is full of discussions about this situation, and there are various solutions, either to get media players to play them, or to re-mux them into full MP4 containers.
Or, as DaneLLL said, use qtmux to generate mp4-muxed data while recording new files. (Little known fact: MP4 is actually the QuickTime container format, standardized! Hence the name “qtmux” for “quicktime mux”)
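A minimal recording sketch along those lines, assuming the camera and encoder elements mentioned in this thread: note the -e flag, which makes gst-launch-1.0 forward EOS to the pipeline on Ctrl-C so qtmux can finalize the MP4 (write the moov atom). Without it, the resulting file is often exactly the kind of zero-length-in-VLC recording described above.

```shell
# Sketch: record the camera to a properly muxed MP4 (adjust caps to your sensor).
# -e: send EOS on Ctrl-C so qtmux finalizes the file before the pipeline exits.
gst-launch-1.0 -e nvarguscamerasrc ! \
    'video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1' ! \
    omxh264enc ! h264parse ! qtmux ! filesink location=test.mp4
```

Leaving format= out of the caps lets the camera and encoder negotiate one they both support.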
Here is my error message:
WARNING: erroneous pipeline: could not link nvarguscamerasrc0 to omxh264enc-omxh264enc0, nvarguscamerasrc0 can’t handle caps video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1
Running the pipeline with gst-launch-1.0 -vvv will print information about each element in your stream, and gst-inspect-1.0 on each element is the documentation for that element. You have to match the output caps of each element to the input caps of the next to make them talk. The caps string 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' sets up the output from the camera, and if you inspect omxh264enc you might find it does not accept NVMM memory, or that there is something else in those caps it does not like.
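To see which caps an element actually accepts and produces, inspect its pad templates, for example:

```shell
# Print the encoder's documentation, including its pad templates
# (element name taken from the pipeline in this thread).
gst-inspect-1.0 omxh264enc
# In the "Pad Templates" section: the SINK template lists the video/x-raw
# formats and memory types it accepts; the SRC template lists what it outputs.
```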
Start removing things until it works, then add elements and controls back to meet your requirements.
Get it working to a display first. You might need a videoconvert between the camera and omxh264enc, or use gst-inspect-1.0 to find another encoder. gst-inspect-1.0 is your friend.
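A stripped-down display-only pipeline to start from might look like this. The nvoverlaysink element and the NV12 format are assumptions, but nvarguscamerasrc typically outputs NV12 rather than I420 in NVMM memory, which may be exactly what the caps error above is complaining about.

```shell
# Minimal camera-to-display sketch; note format=NV12, not I420.
gst-launch-1.0 nvarguscamerasrc ! \
    'video/x-raw(memory:NVMM), width=1920, height=1080, format=NV12, framerate=30/1' ! \
    nvoverlaysink
```

Once this displays correctly, add the encoder and muxer back one element at a time.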
Go to the NVIDIA documentation and search for “create video”.