Can't get GStreamer RTSP working on TX2

I can’t seem to get GStreamer to play an RTSP stream from an IP camera on the TX2 running L4T 28.2. I have verified that the camera works with VLC 2.2.5.1. The GStreamer version is 1.8.3. The pipelines I have tried use playbin and rtspsrc. Here are the respective terminal outputs:

tegra@tegra-ubuntu:~$ gst-launch-1.0 playbin uri=rtsp://192.168.2.119/554
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.2.119/554/
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

It pauses here for about three minutes. Watching the system monitor, CPU and memory usage climb steadily until the CPU reaches about 105%, and then I get the following error:
(gst-launch-1.0:2645): GStreamer-WARNING **: failed to create thread: Error creating thread: Resource temporarily unavailable

nvidia@tegra-ubuntu:~$ gst-launch-1.0 rtspsrc location=rtsp://192.168.2.119/554 latency=0 ! decodebin ! nvvidconv ! videoconvert ! xvimagesink sync=false
Setting pipeline to PAUSED …
Pipeline is live and does not need PREROLL …
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.2.119/554
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

It pauses here for about three minutes. Watching the system monitor, CPU and memory usage climb steadily until the CPU reaches about 105%, and then I get the following error:
ERROR: from element /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1: Internal data flow error.
Additional debug info:
gstbasesrc.c(2948): gst_base_src_loop (): /GstPipeline:pipeline0/GstRTSPSrc:rtspsrc0/GstUDPSrc:udpsrc1:
streaming task paused, reason not-linked (-1)
Execution ended after 0:00:00.213841344
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …
nvidia@tegra-ubuntu:~$

tegra@tegra-ubuntu:~$ gst-launch-1.0 rtspsrc location=rtsp://192.168.2.119/554/h264 ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink
Setting pipeline to PAUSED …
Using winsys: x11
Pipeline is live and does not need PREROLL …
Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Progress: (open) Opening Stream
Progress: (connect) Connecting to rtsp://192.168.2.119/554/h264
Progress: (open) Retrieving server options
Progress: (open) Retrieving media info
Progress: (request) SETUP stream 0
Progress: (open) Opened Stream
Setting pipeline to PLAYING …
New clock: GstSystemClock
Progress: (request) Sending PLAY request
Progress: (request) Sending PLAY request
Progress: (request) Sent PLAY request

It pauses here for about three minutes. Watching the system monitor, CPU and memory usage climb steadily until the CPU reaches about 105%, and then I get the following error:
(gst-launch-1.0:2461): GStreamer-WARNING **: failed to create thread: Error creating thread: Resource temporarily unavailable
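
In case it helps with diagnosing this, the next things I plan to try are checking what the camera actually advertises and watching the rtspsrc negotiation in detail. Something along these lines should show the offered streams and the SDP exchange (the :554 port form is a guess on my part, since I’m not sure whether 554 is the port or a path on this camera, and gst-discoverer-1.0 comes from gstreamer1.0-plugins-base-apps, which may need to be installed first):

$ gst-discoverer-1.0 rtsp://192.168.2.119:554
$ GST_DEBUG=rtspsrc:5 gst-launch-1.0 rtspsrc location=rtsp://192.168.2.119:554 ! fakesink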

Hi,
Please refer to [url]https://devtalk.nvidia.com/default/topic/1018689/jetson-tx2/vlc-playing-gstreamer-flow/post/5187270/#5187270[/url]

Here are the errors I get when I try to play that with VLC:

nvidia@tegra-ubuntu:~/Documents$ vlc rtsp://127.0.0.1:8554/test
VLC media player 2.2.5.1 Umbrella (revision 2.2.5.1-14-g05b653355c)
[0000000000415160] core libvlc: Running vlc with the default interface. Use ‘cvlc’ to use vlc without interface.
[0000000000528b60] [cli] lua interface: Listening on host “*console”.
VLC media player 2.2.5.1 Umbrella
Command Line Interface initialized. Type `help’ for help.

[0000007f80008ee0] vdpau_avcodec generic error: unsupported codec 1211250229 or profile 1
Failed to open VDPAU backend libvdpau_nvidia.so: cannot open shared object file: No such file or directory
[0000007f9c0009c0] core input error: ES_OUT_RESET_PCR called
[hevc @ 0x7f8c059970] Could not find ref with POC 14

At this point the VLC window opens, but it only shows colorful, blocky vertical bars, like the pattern old TVs would display when there was no signal. The terminal running the test-launch script itself shows no errors and says “stream ready at rtsp://…”.
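
For completeness, I also plan to cross-check that local test stream with GStreamer itself rather than VLC, with something like:

$ gst-launch-1.0 playbin uri=rtsp://127.0.0.1:8554/test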

To be clear about my intention with this, I’m ultimately trying to feed this IP video input into OpenCV to run object detection on the TX2. I initially used this as a guide: How to Capture and Display Camera Video with Python on Jetson TX2. It didn’t work, so I have been trying everything I can think of to troubleshoot it and have reflashed many times. Currently I have a clean install flashed from JetPack 3.2, with the exception of VLC 2.2.5.1 built from source.
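
For reference, the kind of capture pipeline I am ultimately aiming for (adapted from that guide) would decode the RTSP stream on the TX2 and hand BGR frames to OpenCV through an appsink. This is only a sketch; the :554 port form, the /h264 path and the H.264 codec are all assumptions about this particular camera:

$ gst-launch-1.0 rtspsrc location=rtsp://192.168.2.119:554/h264 latency=200 ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! 'video/x-raw, format=BGRx' ! videoconvert ! 'video/x-raw, format=BGR' ! fakesink

If that ever runs, the plan is to swap fakesink for appsink and pass the same description (minus gst-launch-1.0) to cv2.VideoCapture with the GStreamer backend.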

You may also refer to
[url]https://devtalk.nvidia.com/default/topic/1014789/jetson-tx1/-the-cpu-usage-cannot-down-use-cuda-decode-/post/5188538/#5188538[/url]
[url]https://devtalk.nvidia.com/default/topic/1027423/jetson-tx2/gstreamer-issue-on-tx2/post/5225972/#5225972[/url]

Neither worked; see below for the output from both attempts:

From https://devtalk.nvidia.com/default/topic/1014789/jetson-tx1/-the-cpu-usage-cannot-down-use-cuda-decode-/2

Attempt to compile test-mp4.c:

nvidia@tegra-ubuntu:/usr/include/gstreamer-1.0/gst$ sudo gcc test-mp4.c -o test-mp4
[sudo] password for nvidia:
test-mp4.c:20:21: fatal error: gst/gst.h: No such file or directory
compilation terminated.
nvidia@tegra-ubuntu:/usr/include/gstreamer-1.0/gst$

After the compile failure, I checked that libgstreamer1.0-dev is installed; it was. I tried the compile again afterwards and it failed the same way. This is odd, because the previous test-launch.c script (https://devtalk.nvidia.com/default/topic/1018689/jetson-tx2/vlc-playing-gstreamer-flow/post/5187270/#5187270) compiled properly, even with both scripts in the same folder.

nvidia@tegra-ubuntu:/usr/include/gstreamer-1.0/gst$ sudo apt-get install libgstreamer1.0-dev
Reading package lists… Done
Building dependency tree
Reading state information… Done
libgstreamer1.0-dev is already the newest version (1.8.3-1~ubuntu0.1).
0 upgraded, 0 newly installed, 0 to remove and 293 not upgraded.
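
Since gst/gst.h actually lives under /usr/include/gstreamer-1.0 rather than directly under /usr/include, my guess is that the compile needs the pkg-config flags, so my next attempt will be something along these lines (assuming the example only needs core GStreamer; it may also need other modules):

$ gcc test-mp4.c -o test-mp4 $(pkg-config --cflags --libs gstreamer-1.0)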


From https://devtalk.nvidia.com/default/topic/1027423/jetson-tx2/gstreamer-issue-on-tx2/post/5225972/#5225972

On the server terminal:

nvidia@tegra-ubuntu:~/Documents$ gst-launch-1.0 videotestsrc ! nvvidconv ! omxh264enc ! 'video/x-h264,stream-format=byte-stream' ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5000
Setting pipeline to PAUSED …
Pipeline is PREROLLING …
Framerate set to : 30 at NvxVideoEncoderSetParameterNvMMLiteOpen : Block : BlockType = 4
===== MSENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
===== MSENC blits (mode: 1) into tiled surfaces =====
Pipeline is PREROLLED …
Setting pipeline to PLAYING …
New clock: GstSystemClock
^Chandling interrupt.
Interrupt: Stopping pipeline …
Execution ended after 0:03:47.716345890
Setting pipeline to PAUSED …
Setting pipeline to READY …
Setting pipeline to NULL …
Freeing pipeline …


On the client terminal:

nvidia@tegra-ubuntu:~$ export DISPLAY=:0
nvidia@tegra-ubuntu:~$ gst-launch-1.0 udpsrc port=5000 ! 'application/x-rtp,encoding-name=H264,payload=96' ! tee name=t t. ! queue ! filesink location=test.mpg t. ! queue ! rtph264depay ! h264parse ! omxh264dec ! nveglglessink
Setting pipeline to PAUSED …

Using winsys: x11
Pipeline is live and does not need PREROLL …
Got context from element ‘eglglessink0’: gst.egl.EGLDisplay=context, display=(GstEGLDisplay)NULL;
Setting pipeline to PLAYING …
New clock: GstSystemClock
NvMMLiteOpen : Block : BlockType = 261
TVMR: NvMMLiteTVMRDecBlockOpen: 7647: NvMMLiteBlockOpen
NvMMLiteBlockCreate : Block : BlockType = 261
TVMR: cbBeginSequence: 1179: BeginSequence 320x240, bVPR = 0
TVMR: LowCorner Frequency = 100000
TVMR: cbBeginSequence: 1529: DecodeBuffers = 17, pnvsi->eCodec = 4, codec = 0
TVMR: cbBeginSequence: 1600: Display Resolution : (320x240)
TVMR: cbBeginSequence: 1601: Display Aspect Ratio : (320x240)
TVMR: cbBeginSequence: 1669: ColorFormat : 5
TVMR: cbBeginSequence:1683 ColorSpace = NvColorSpace_YCbCr601
TVMR: cbBeginSequence: 1809: SurfaceLayout = 3
TVMR: cbBeginSequence: 1902: NumOfSurfaces = 24, InteraceStream = 0, InterlaceEnabled = 0, bSecure = 0, MVC = 0 Semiplanar = 1, bReinit = 1, BitDepthForSurface = 8 LumaBitDepth = 8, ChromaBitDepth = 8, ChromaFormat = 5
TVMR: cbBeginSequence: 1904: BeginSequence ColorPrimaries = 2, TransferCharacteristics = 2, MatrixCoefficients = 2
Allocating new output: 320x240 (x 24), ThumbnailMode = 0
OPENMAX: HandleNewStreamFormat: 3464: Send OMX_EventPortSettingsChanged : nFrameWidth = 320, nFrameHeight = 240
TVMR: FrameRate = 13333
TVMR: NVDEC LowCorner Freq = (576000 * 1024)
—> TVMR: Video-conferencing detected !!!
TVMR: FrameRate = 29.991482
TVMR: FrameRate = 30.001920
TVMR: FrameRate = 30.001470
TVMR: FrameRate = 29.999670
TVMR: FrameRate = 30.001380
TVMR: FrameRate = 29.998320
TVMR: FrameRate = 30.001650
TVMR: FrameRate = 29.998770

The same video output pops up as in my last post, with the old-TV-style colorful, blocky vertical lines.

It looks like it ran successfully. You can compare with

$ gst-launch-1.0 videotestsrc ! nvoverlaysink

In my case, when I execute your steps above, they both return the TV palette with white noise.

However, I am rather interested in these steps myself, as I want to test network broadcast and found the steps on the forum.
Update: though it appears that you are using an IP camera, while I tested with a CSI MIPI camera.

DaneLLL,

$ gst-launch-1.0 videotestsrc ! nvoverlaysink

That displays the same TV palette as before. How do I get it to display the live video stream from the IP camera?

If you are using an IP camera, it should have some web service, I guess.
What model of IP camera do you use?
As far as I understand, an IP camera has a dedicated IP address and streams more or less continuously; it won’t require a call from the Jetson to start the webcast. Moreover, IP cameras may have some API.
On the other hand, if you connect a camera to the Jetson and start a broadcast with RTSP, it will use the Jetson’s IP address and, to some extent, act as a sort of IP camera itself.
Could you clarify which interface you are using to connect the camera to the Jetson? Is it Ethernet (wired TCP/IP)? Wi-Fi? USB? CSI?

That’s my point: I have the RTSP address and it works with VLC, but I can’t get it to work with GStreamer.

It appears to me that the camera is not connected to the Jetson by any means other than the TCP/IP network.
As I understand it, you are trying to access and control the camera with GStreamer somehow.

In my opinion, you may also refer to the GStreamer forums, especially to discussions related to IP camera use.
For example: http://gstreamer-devel.966125.n4.nabble.com/Streaming-directly-from-an-IP-camera-td4683357.html

It appears that if you change the / to : in "192.168.2.119/554", the 554 becomes a port instead of a folder.
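
For example, the two forms would be read roughly like this (the "stream1" path is only a placeholder; the real path is camera-specific):

rtsp://192.168.2.119:554/stream1 (554 as the RTSP port, "stream1" as the path)
rtsp://192.168.2.119/554 (554 as a path, with the default RTSP port implied)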

The interface with the IP camera is Ethernet. I’m trying to get this guide to work: How to Capture and Display Camera Video with Python on Jetson TX2. My ultimate goal is to get the live feed into OpenCV to run object detection. I’ve posted on the GStreamer forums previously and only got one reply, which was to remove anything NVIDIA-related from the TX2.

I’ve tried both "/" and ":"; they both give the same result.

Thank you for sharing the link.
I have previously gotten a USB camera running with python-opencv.
However, I have not tried an IP camera with it.

What if you run the GStreamer pipeline from a regular PC?
Does it give different errors?
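
On a regular PC the NVIDIA-specific elements (omxh264dec, nvvidconv, nveglglessink) will not exist, so a software-only pipeline would be needed; something like this, I suppose (the latency value is just a guess):

$ gst-launch-1.0 rtspsrc location=rtsp://192.168.2.119:554 latency=200 ! decodebin ! videoconvert ! autovideosink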

I actually haven’t tried it with a normal PC. I’ll give that a try next.

Please check if the thread below helps:
https://devtalk.nvidia.com/default/topic/1004914/