How to run an RTSP camera in DeepStream on Nano

I am testing DeepStream on the Nano and have run into an RTSP problem; can anybody help me?
I have a HikVision camera that I want to use in DeepStream, but it fails. The configuration is:

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
camera-width=1920
camera-height=1080
camera-fps-n=30
camera-v4l2-dev-node=0
#uri=file://…/…/samples/streams/sample_1080p_h264.mp4
uri='rtsp://admin:123456@10.110.128.146:554/h264/main/ch1/main/av_stream latency=2 ! rtph264depay ! h264parse ! omxh264dec ! nvvidconv ! video/x-raw, width=(int)1920, height=(int)1080, format=(string)BGRx ! videoconvert ! appsink'

error:
ERROR from src_elem0: Resource not found.
Debug info: gstrtspsrc.c(7456): gst_rtspsrc_retrieve_sdp (): /GstPipeline:pipeline/GstBin:multi_src_bin/GstBin:src_sub_bin0/GstRTSPSrc:src_elem0:
No valid RTSP URL was provided

I also tested the following configuration, with the same problem:
uri=rtsp://admin:123456@10.110.128.146:554/Streaming/channels/1

I tested the same URL in OpenCV and it works.
Can you tell me how to set up the RTSP camera?

thanks.

Hi,

The first thing you should check is whether the simplest RTSP test case works.

Try this:

$ gst-launch-1.0 rtspsrc location=RTSP_URL ! rtph264depay ! queue ! h264parse ! nvv4l2decoder ! nvvideoconvert ! "video/x-raw(memory:NVMM),format=RGBA" ! nvegltransform ! nveglglessink sync=False

Hi,
I confirm it works with your gst-launch-1.0 command,
but it fails in DeepStream with the same URL in the config file.

Hi,
Your uri should be

uri=rtsp://admin:123456@10.110.128.146:554/h264/main/ch1/main/av_stream
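A side note on why the first configuration failed: the uri key expects a bare URL, and rtspsrc rejects anything else with "No valid RTSP URL was provided". A quick sanity check, sketched in Python (the is_valid_rtsp_uri helper is hypothetical, not part of DeepStream):

```python
from urllib.parse import urlparse

def is_valid_rtsp_uri(uri: str) -> bool:
    """Rough check that a config 'uri' value is a bare RTSP URL,
    not a GStreamer pipeline description."""
    if "!" in uri or " " in uri:  # pipeline syntax, never valid in a URL here
        return False
    parsed = urlparse(uri)
    return parsed.scheme == "rtsp" and bool(parsed.hostname)

# A bare URL passes:
print(is_valid_rtsp_uri("rtsp://admin:123456@10.110.128.146:554/h264/main/ch1/main/av_stream"))  # True
# A pipeline-style value, like the one in the first post, does not:
print(is_valid_rtsp_uri("rtsp://camera/stream latency=2 ! rtph264depay ! appsink"))  # False
```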

deepstream-app -c deepstream_app_config_yoloV3_tiny.txt

(deepstream-app:9513): GStreamer-CRITICAL **: 15:42:14.642: passed '0' as denominator for `GstFraction'

(deepstream-app:9513): GStreamer-WARNING **: 15:42:14.645: Name 'src_cap_filter' is not unique in bin 'src_sub_bin1', not adding
Creating LL OSD context new
Deserialize yoloLayerV3 plugin: yolo_17
Deserialize yoloLayerV3 plugin: yolo_24

Runtime commands:
h: Print this help
q: Quit

    p: Pause
    r: Resume

NOTE: To expand a source in the 2D tiled display and view object details, left-click on the source.
To go back to the tiled display, right-click anywhere on the window.

** INFO: <bus_callback:163>: Pipeline ready

Creating LL OSD context new
Opening in BLOCKING MODE
NvMMLiteOpen : Block : BlockType = 261
NVMEDIA: Reading vendor.tegra.display-size : status: 6
NvMMLiteBlockCreate : Block : BlockType = 261
** INFO: <bus_callback:149>: Pipeline running

ERROR from tiled_display_tiler: GstNvTiler: FATAL ERROR; NvTiler::Composite failed
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvtiler/gstnvtiler.cpp(665): gst_nvmultistreamtiler_transform (): /GstPipeline:pipeline/GstBin:tiled_display_bin/GstNvMultiStreamTiler:tiled_display_tiler
Quitting
0:00:04.961899604 9513 0x198c8990 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary_gie_classifier> error: Internal data stream error.
0:00:04.962003199 9513 0x198c8990 WARN nvinfer gstnvinfer.cpp:1830:gst_nvinfer_output_loop:<primary_gie_classifier> error: streaming stopped, reason error (-5)
ERROR from primary_gie_classifier: Internal data stream error.
Debug info: /dvs/git/dirty/git-master_linux/deepstream/sdk/src/gst-plugins/gst-nvinfer/gstnvinfer.cpp(1830): gst_nvinfer_output_loop (): /GstPipeline:pipeline/GstBin:primary_gie_bin/GstNvInfer:primary_gie_classifier:
streaming stopped, reason error (-5)
App run failed

I did as you said, but it still fails: "ERROR from tiled_display_tiler: GstNvTiler: FATAL ERROR; NvTiler::Composite failed"

Hi Mike,
Please try modifying config_infer_primary_nano.txt. I don't see deepstream_app_config_yoloV3_tiny.txt in DeepStream SDK 4.0; it is probably not a config file that follows the expected format.

As hunterjm commented in another post, the RTSP stream opened after disabling [tiled-display] and setting type=2 (EglSink) for the sink,
but it can only open one stream.

I tried but it doesn’t work

Hi,
The following steps are verified on DS4.0 on Jetson Nano. Please refer to them and adapt them to your use case.

1 Start rtsp server:

sudo apt-get install libgstrtspserver-1.0-dev libgstreamer1.0-dev
gcc test-launch.c -o test-launch $(pkg-config --cflags --libs gstreamer-1.0 gstreamer-rtsp-server-1.0)
(download test-launch.c from: https://github.com/GStreamer/gst-rtsp-server/blob/master/examples/test-launch.c)
./test-launch "filesrc location=sample_1080p_h264.mp4 ! qtdemux ! rtph264pay name=pay0 pt=96 "
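Before pointing deepstream-app at the local server, it can help to confirm something is listening on the RTSP port (test-launch defaults to 8554). A minimal Python sketch; it only checks TCP reachability, not that the endpoint actually speaks RTSP:

```python
import socket

def rtsp_server_reachable(host: str, port: int = 8554, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# With test-launch running, this should print True:
print(rtsp_server_reachable("127.0.0.1", 8554))
```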

2 Modify source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt

[tiled-display]
rows=1
columns=1

[source0]
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
#uri=file://../../streams/sample_720p.mp4
uri=rtsp://127.0.0.1:8554/test
num-sources=1

[streammux]
batch-size=1

[primary-gie]
batch-size=1

3 Run

$ sudo jetson_clocks
$ deepstream-app -c source8_1080p_dec_infer-resnet_tracker_tiled_display_fp16_nano.txt

It still does not work…

Hi,
There is a known issue with Hikvision IP cameras:
[Jetson Nano] deepstream-app crash after IR cut filter on IP camera - DeepStream SDK - NVIDIA Developer Forums
Is your camera from HikVision, by any chance?

Yes, HikVision and Dahua.

But it works with gst-launch-1.0.

So what should I do to start the RTSP stream in deepstream-app?

Thanks

Yes, our camera is from HikVision.

I had the same issue with my Wansview IP camera but managed to fix it by disabling tiled-display and the tracker, and by putting the correct stream width and height in the [streammux] options.

You’ll find my config file in this thread, maybe it will work for you.
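The edits described above (disable [tiled-display] and [tracker], set the real stream resolution in [streammux]) can also be scripted; a minimal sketch, assuming Python's configparser can parse the INI-style deepstream-app config and that the camera's resolution is known:

```python
import configparser
import io

def apply_rtsp_fix(config_text: str, cam_width: int, cam_height: int) -> str:
    """Disable [tiled-display] and [tracker], and set [streammux]
    width/height to the camera's actual stream resolution."""
    cfg = configparser.ConfigParser()
    cfg.read_string(config_text)
    for section in ("tiled-display", "tracker"):
        if cfg.has_section(section):
            cfg.set(section, "enable", "0")
    cfg.set("streammux", "width", str(cam_width))
    cfg.set("streammux", "height", str(cam_height))
    out = io.StringIO()
    cfg.write(out, space_around_delimiters=False)
    return out.getvalue()

# Tiny stand-in for a deepstream-app config:
sample = """\
[tiled-display]
enable=1

[streammux]
width=1920
height=1080
"""
print(apply_rtsp_fix(sample, 1280, 720))
```

Note that configparser rewrites the file without comments, so editing the real config by hand keeps the inline documentation.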

Hello,

We have the same problem on HikVision cameras when we want to use the tiler ("ERROR from tiled_display_tiler: GstNvTiler: FATAL ERROR; NvTiler::Composite failed"). We have tested two different models:

Working OK
Model: DS-2CD2510F
Firmware Version V5.3.0 build 150327
Encoding Version V1.0 build 150327

Not working
Model: DS-2CD2142FWD-I
Firmware Version V5.4.4 build 161125
Encoding Version V7.3 build 161122

We have also tested AVIGILON 12W-H3-4MH-DP1-B cameras and the result is the same.

A strange behavior is that tiler works with the segmentation pipeline, provided we use nvsegvisual.
We get the same error if we remove nvsegvisual from the pipeline.

Any help is welcome.

Tiled display doesn’t work for me either, and output generally only works when OSD is enabled.

edit: using an Axis P1428-e camera and AGX Xavier
It might have something to do with resolution: when I set the camera’s output to 1920x1080 instead of 1280x720 there is no error, although I still get a black screen (0 FPS).

It works! Thanks

Is it possible to use multiple RTSP sources?

Hi andrea_vighi,

Please share your config file so that other users can refer to it.
For multiple RTSP sources, how many sources do you run in your use case?

My config file for RTSP!

# Copyright (c) 2019 NVIDIA Corporation.  All rights reserved.
#
# NVIDIA Corporation and its licensors retain all intellectual property
# and proprietary rights in and to this software, related documentation
# and any modifications thereto.  Any use, reproduction, disclosure or
# distribution of this software and related documentation without an express
# license agreement from NVIDIA Corporation is strictly prohibited.

[application]
enable-perf-measurement=1
perf-measurement-interval-sec=5
#gie-kitti-output-dir=streamscl

[tiled-display]
enable=0
rows=1
columns=1
width=1920
height=1080
gpu-id=0
#(0): nvbuf-mem-default - Default memory allocated, specific to particular platform
#(1): nvbuf-mem-cuda-pinned - Allocate Pinned/Host cuda memory applicable for Tesla
#(2): nvbuf-mem-cuda-device - Allocate Device cuda memory applicable for Tesla
#(3): nvbuf-mem-cuda-unified - Allocate Unified cuda memory applicable for Tesla
#(4): nvbuf-mem-surface-array - Allocate Surface Array memory, applicable for Jetson
nvbuf-memory-type=0

[source0]
enable=1
#Type - 1=CameraV4L2 2=URI 3=MultiURI 4=RTSP
type=4
#uri=file://../../streams/sample_1080p_h264.mp4
uri=rtsp://192.168.30.10:554/live/ch0
num-sources=1
#drop-frame-interval=2
gpu-id=0
# (0): memtype_device   - Memory type Device
# (1): memtype_pinned   - Memory type Host Pinned
# (2): memtype_unified  - Memory type Unified
cudadec-memtype=0

[sink0]
enable=1
#Type - 1=FakeSink 2=EglSink 3=File
type=2
sync=0
source-id=0
gpu-id=0
nvbuf-memory-type=0

[sink1]
enable=0
type=3
#1=mp4 2=mkv
container=1
#1=h264 2=h265
codec=1
sync=0
#iframeinterval=10
bitrate=2000000
output-file=out.mp4
source-id=0

[sink2]
enable=0
#Type - 1=FakeSink 2=EglSink 3=File 4=RTSPStreaming
type=4
#1=h264 2=h265
codec=1
sync=0
bitrate=4000000
# set below properties in case of RTSPStreaming
rtsp-port=8554
udp-port=5400

[osd]
enable=1
gpu-id=0
border-width=1
text-size=15
text-color=1;1;1;1;
text-bg-color=0.3;0.3;0.3;1
font=Serif
show-clock=0
clock-x-offset=800
clock-y-offset=820
clock-text-size=12
clock-color=1;0;0;0
nvbuf-memory-type=0

[streammux]
gpu-id=0
##Boolean property to inform muxer that sources are live
live-source=1
batch-size=1
##time out in usec, to wait after the first buffer is available
##to push the batch even if the complete batch is not formed
batched-push-timeout=40000
## Set muxer output width and height
width=1920
height=1080
##Enable to maintain aspect ratio wrt source, and allow black borders, works
##along with width, height properties
enable-padding=0
nvbuf-memory-type=0

# config-file property is mandatory for any gie section.
# Other properties are optional and if set will override the properties set in
# the infer config file.
[primary-gie]
enable=1
gpu-id=0
model-engine-file=../../models/Primary_Detector_Nano/resnet10.caffemodel_b8_fp16.engine
batch-size=1
#Required by the app for OSD, not a plugin property
bbox-border-color0=1;0;0;1
bbox-border-color1=0;1;1;1
bbox-border-color2=0;0;1;1
bbox-border-color3=0;1;0;1
interval=0
gie-unique-id=1
nvbuf-memory-type=0
config-file=config_infer_primary.txt

[tracker]
enable=0
tracker-width=640
tracker-height=368
#ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_iou.so
#ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_nvdcf.so
ll-lib-file=/opt/nvidia/deepstream/deepstream-4.0/lib/libnvds_mot_klt.so
#ll-config-file required for DCF/IOU only
#ll-config-file=tracker_config.yml
#ll-config-file=iou_config.txt
gpu-id=0
#enable-batch-process applicable to DCF only
enable-batch-process=1

[secondary-gie0]
enable=0
model-engine-file=../../models/Secondary_VehicleTypes/resnet18.caffemodel_b16_int8.engine
gpu-id=0
batch-size=16
gie-unique-id=4
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_vehicletypes.txt

[secondary-gie1]
enable=0
model-engine-file=../../models/Secondary_CarColor/resnet18.caffemodel_b16_int8.engine
batch-size=16
gpu-id=0
gie-unique-id=5
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_carcolor.txt

[secondary-gie2]
enable=0
model-engine-file=../../models/Secondary_CarMake/resnet18.caffemodel_b16_int8.engine
batch-size=16
gpu-id=0
gie-unique-id=6
operate-on-gie-id=1
operate-on-class-ids=0;
config-file=config_infer_secondary_carmake.txt

[tests]
file-loop=0

I would like to know if it is possible to start only one deepstream-app process and set multiple RTSP sources in the config file (as is done with type=3 [MultiURI] .mp4 files).
Currently I launch multiple deepstream-app processes to get multiple RTSP sources.

Thanks
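For reference, a single deepstream-app process can read several RTSP streams from one config file: add one [sourceN] group per stream and set the batch sizes (and, if the tiler is enabled, rows/columns) to match the source count. A sketch based on the config above; the second camera address is a placeholder:

```ini
[source0]
enable=1
type=4
uri=rtsp://192.168.30.10:554/live/ch0
num-sources=1

[source1]
enable=1
type=4
uri=rtsp://192.168.30.11:554/live/ch0
num-sources=1

[streammux]
batch-size=2

[primary-gie]
batch-size=2
```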
