Using GStreamer with a USB camera to implement RTMP push streaming

Hi,
I want to use a USB camera on the Jetson Nano. Since the Nano does not support FFmpeg for hardware video encoding, only GStreamer can be used. How do I use it?

In addition, gst-launch-1.0 reports a "no such element or plugin" error. How can I solve this?
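As an aside, a "no such element or plugin" error can usually be diagnosed with gst-inspect-1.0. A minimal sketch (the element names below are examples; rtmpsink ships in the "bad" plugin set on Ubuntu-based L4T):

```shell
# Check whether a given element is registered at all:
gst-inspect-1.0 nvv4l2h264enc
gst-inspect-1.0 rtmpsink

# rtmpsink lives in the "bad" plugin set; install it if it is missing:
sudo apt-get install gstreamer1.0-plugins-bad

# A stale plugin registry can also cause "no such element";
# removing the cache forces GStreamer to rebuild it on next run:
rm -rf ~/.cache/gstreamer-1.0
```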

Thanks

My usage scenario is: GStreamer (v4l2 capture, encoding, RTMP) for monitoring.

Hi,
A USB camera is a v4l2 source; you can refer to
https://devtalk.nvidia.com/default/topic/1057681/jetson-tx1/logitech-c930e-on-jetson-tx1-very-slow-and-choppy-video/post/5363417/#5363417

References for constructing an RTMP pipeline:
https://devtalk.nvidia.com/default/topic/1068199/deepstream-sdk/does-sink-support-rtmpsink-/post/5410416/#5410416
https://devtalk.nvidia.com/default/topic/1068740/deepstream-sdk/how-to-modify-deepstream_sink_bin-c-to-support-rtmp-i-get-error-/post/5413928/#5413928

Hi,
How do I link the camera to nvv4l2h264enc?

echopet@echopet-desktop:~$ gst-launch-1.0 v4l2src device='/dev/video0' ! 'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1' ! videoconvert ! queue ! nvv4l2h264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.128/live/35' -e
WARNING: erroneous pipeline: could not link queue0 to nvv4l2h264enc0

echopet@echopet-desktop:~$ gst-launch-1.0 v4l2src device='/dev/video0' ! 'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1' ! videoconvert ! nvv4l2h264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.128/live/35' -e
WARNING: erroneous pipeline: could not link videoconvert0 to nvv4l2h264enc0

echopet@echopet-desktop:~$ gst-launch-1.0 v4l2src device='/dev/video0' ! 'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1' ! nvv4l2h264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.128/live/35' -e
WARNING: erroneous pipeline: could not link v4l2src0 to nvv4l2h264enc0, nvv4l2h264enc0 can’t handle caps video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1
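The failures above are caps-negotiation errors: nvv4l2h264enc advertises which input formats its sink pad accepts, and plain YUY2 in system memory is not among them. One way to see this for yourself (assuming gstreamer1.0-tools is installed) is:

```shell
# Print the encoder description; the "SINK template" section lists the
# caps (memory type and pixel formats) the encoder can accept as input.
gst-inspect-1.0 nvv4l2h264enc | grep -A 12 'SINK template'
```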

Device found:

name  : RMONCAM FHD 1080P
class : Video/Source
caps  : video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)5/1;
        video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)8/1;
        video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)10/1;
        video/x-raw, format=(string)YUY2, width=(int)960, height=(int)540, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1;
        video/x-raw, format=(string)YUY2, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)20/1;
        video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)1280, height=(int)960, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)1024, height=(int)768, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)800, height=(int)600, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)352, height=(int)288, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)320, height=(int)240, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
        image/jpeg, width=(int)160, height=(int)120, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1;
properties:
	udev-probed = true
	device.bus_path = platform-70090000.xusb-usb-0:2.2:1.0
	sysfs.path = /sys/devices/70090000.xusb/usb1/1-2/1-2.2/1-2.2:1.0/video4linux/video0
	device.bus = usb
	device.subsystem = video4linux
	device.vendor.id = 0202
	device.vendor.name = Generic
	device.product.id = 0201
	device.product.name = "RMONCAM\ FHD\ 1080P"
	device.serial = Generic_RMONCAM_FHD_1080P_200901010001
	device.capabilities = :capture:
	device.api = v4l2
	device.path = /dev/video0
	v4l2.device.driver = uvcvideo
	v4l2.device.card = "RMONCAM\ FHD\ 1080P"
	v4l2.device.bus_info = usb-70090000.xusb-2.2
	v4l2.device.version = 264588 (0x0004098c)
	v4l2.device.capabilities = 2216689665 (0x84200001)
	v4l2.device.device_caps = 69206017 (0x04200001)
gst-launch-1.0 v4l2src ! ...
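A capability listing like the one above (including the suggested `gst-launch-1.0 v4l2src ! ...` launch line) can be produced with GStreamer's device monitor:

```shell
# Enumerate video capture devices and the caps each one supports:
gst-device-monitor-1.0 Video/Source
```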

Hi,
Please replace videoconvert with nvvidconv. The hardware encoder takes its input in NVMM buffers, and nvvidconv can be used for the conversion. Please check the GStreamer user guide for more examples:
https://developer.nvidia.com/embedded/dlc/l4t-accelerated-gstreamer-guide-32-2
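A minimal sketch of the suggested change, with nvvidconv in place of videoconvert (camera caps and RTMP URL are carried over from the pipelines above; the explicit NVMM output caps are optional but make the conversion visible):

```shell
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \
  nvvidconv ! 'video/x-raw(memory:NVMM), format=(string)NV12' ! \
  nvv4l2h264enc ! h264parse ! flvmux ! \
  rtmpsink location='rtmp://192.168.1.128/live/35'
```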

Thanks.
I have successfully pushed the stream to the RTMP server. How do I check GPU usage and GPU encoding performance?

gst-launch-1.0 v4l2src device='/dev/video0' ! 'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1' ! nvvidconv ! nvv4l2h264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.128:1935/live/35'

Hi,
For checking GPU usage, you can run tegrastats:
https://docs.nvidia.com/jetson/l4t/index.html#page/Tegra%2520Linux%2520Driver%2520Package%2520Development%2520Guide%2FAppendixTegraStats.html%23
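A typical invocation (the exact field names in the output vary by L4T release):

```shell
# Print one stats line per second; look for GR3D_FREQ (GPU load/clock)
# and the NVENC field (encoder clock in MHz, shown only while encoding).
sudo tegrastats --interval 1000
```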

One clarification: the encoding is not done on the GPU. There is an independent hardware engine, NVENC.

How do I judge the performance of the NVENC video encoder? At present my RTMP streaming rate is slow, and I'm not sure whether the encoding or the streaming is the cause.
Also, regarding the nvv4l2h264enc 'iframeinterval' property: how is its value calculated?

Hi,
You may run 'sudo jetson_clocks' to run the CPU at maximum clocks.

tegrastats also shows the clock frequency of NVENC. You may enable 'maxperf-enable' on nvv4l2h264enc and check the clock.

> maxperf-enable      : Enable or Disable Max Performance mode
                        flags: readable, writable, changeable only in NULL or READY state
                        Boolean. Default: false

You can also try 'io-mode=2' on v4l2src.
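Putting those suggestions together, a hedged variant of the working pipeline might look like this (io-mode=2 selects mmap I/O on v4l2src; maxperf-enable keeps the encoder clocked up):

```shell
# Lock CPU and engine clocks at maximum first:
sudo jetson_clocks

gst-launch-1.0 v4l2src device=/dev/video0 io-mode=2 ! \
  'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \
  nvvidconv ! nvv4l2h264enc maxperf-enable=1 ! \
  h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.128:1935/live/35'
```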

It seems that this is not the reason. I see that the input format of nvv4l2h264enc is format: { (string)I420, (string)NV12 }, but my camera outputs YUYV 4:2:2. What can I do in this case?

The camera supports:
1. Motion-JPEG
2. YUYV 4:2:2

Can I connect a videoconvert before nvvidconv? How do I hook this up?
gst-launch-1.0 v4l2src device='/dev/video0' ! 'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1' ! nvvidconv ! nvv4l2h264enc ! h264parse ! flvmux ! rtmpsink location='rtmp://192.168.1.128:1935/live/35'

Hi,
You don't need the videoconvert plugin. The nvvidconv plugin is the hardware converter and performs the YUYV-to-NV12 conversion.
For checking system load, you can run 'sudo tegrastats'.
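For completeness, here is a sketch that also sets common nvv4l2h264enc tuning properties (the values are illustrative, not required). iframeinterval is expressed in frames, so it is framerate multiplied by the desired seconds between I-frames: at 30 fps, iframeinterval=30 gives one I-frame per second.

```shell
gst-launch-1.0 v4l2src device=/dev/video0 ! \
  'video/x-raw, format=(string)YUY2, width=(int)640, height=(int)480, framerate=(fraction)30/1' ! \
  nvvidconv ! \
  nvv4l2h264enc bitrate=2000000 iframeinterval=30 insert-sps-pps=1 ! \
  h264parse ! flvmux streamable=true ! \
  rtmpsink location='rtmp://192.168.1.128:1935/live/35'
```

insert-sps-pps=1 repeats the SPS/PPS headers in the stream, which helps RTMP players that join mid-stream.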