Enabling camera on Jetson TX1 board

Just got my TX1 development system.

I see the camera on the board, but there is no /dev/video0.

How do you enable/use the camera?

Thanks,
K

I am also curious how to enable the camera. I have the camera on the board as well, but no /dev/video0 either.

I too am curious about enabling the camera.

Thanks
Rick

I'm experiencing the same issue. I can't find any documentation on taking a picture with the TX1, and I only get a /dev/video0 when I plug in a USB webcam.

Help us, NVIDIA-Wan Kenobi, you're our only hope!

At the time of launch the V4L2 driver wasn't ready (which is what would create the /dev/video* node). However, it will be coming in the next L4T update (R23.2). For now, you can access the camera through GStreamer and/or the nvgstcapture sample app.

(sorry for the delay, moving this post to the Jetson TX1 board.)
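
If it helps while waiting for the V4L2 node, the nvgstcapture app can be run straight from a terminal. The exact binary name and options vary by L4T release, so treat this as a rough sketch and check --help on your image:

nvgstcapture-1.0            # opens a live preview from the CSI camera
nvgstcapture-1.0 --help     # lists the capture/encode options and runtime key commands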

Regarding the nvgstcapture sample app, all I found on my Jetson board was a precompiled .so binary.

Is there source code to be found anywhere?

We are planning to release the source to nvgstcapture/nvgstplayer in the next L4T version.

To get started today, here's an example gstreamer-1.0 pipeline using the CSI camera element 'nvcamerasrc':

gst-launch-1.0 nvcamerasrc fpsRange="30.0 30.0" ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvvidconv flip-method=2 ! 'video/x-raw(memory:NVMM), format=(string)I420' ! nvoverlaysink -e

edit 12/17: 'video/xraw(memory:NVMM)' needed to be 'video/x-raw(memory:NVMM)'

I get this error message
Invalid FPSRange Input
WARNING: erroneous pipeline: could not link nvcamerasrc0 to nvtee0

Dusty - I just tried your example GStreamer pipeline and got an "Invalid FPSRange Input" error. I'll read through some docs to see why that wouldn't work, but I also tried one value instead of two, and smaller values, each attempt with no success. Any quick input would be appreciated!
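
Not sure if this applies to the R23.1 build, but one way to see exactly what the element expects for that property (its name, type, and the string format of the range) is to inspect it, assuming the plugin is present on your system:

gst-inspect-1.0 nvcamerasrc | grep -i -A 2 fps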

Hello,
Try this simple pipeline to get camera preview in screen:
gst-launch-1.0 nvcamerasrc ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvtee ! nvoverlaysink -e

br
ChenJian
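
Along the same lines, since the original question was about taking a picture: once nvcamerasrc works for preview, a pipeline roughly like the one below should dump a single JPEG to disk. This is an untested sketch; nvvidconv is used to copy the frame out of NVMM memory so the standard jpegenc element can read it.

gst-launch-1.0 nvcamerasrc num-buffers=1 ! 'video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, format=(string)I420, framerate=(fraction)30/1' ! nvvidconv ! 'video/x-raw, format=(string)I420' ! jpegenc ! filesink location=capture.jpg -e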

@dusty_nv,

Is the next update for L4T (R23.2) going to be 64-bit? I know that ROS MoveIt! has problems on 32-bit systems and the TX1 uses a 64-bit CPU, so if you guys make a 64-bit L4T update, that would be supremely awesome.

It will be included in the release after next, with the next release coming this month.

So probably a month or two from now?

  1. Will release R23.2 come out this month?
  2. Does gst-launch-1.0 support sensors other than the ov5693?
  3. Which ones?
  4. How can support for another sensor be added?

When I try to add another sensor driver, GStreamer exits with an error.
If support for the ov5693 is hardcoded, please say so.

Well, January has ended and we still haven't gotten any updates regarding JetPack L4T (R23.2). Any updates about that one?

Sorry, the update was pushed out to include critical kernel, DVFS and perf fixes. Current ETA is late Feb/March.

You can investigate some of the drivers in http://developer.download.nvidia.com/embedded/L4T/r23_Release_v1.0/source/gstomx1_src.tbz2
or, for V4L2, the kernel source under drivers/media/platform/soc_camera.
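
Once a sensor driver does register with V4L2, one way to confirm it from userspace is with the v4l-utils package (sudo apt-get install v4l-utils), for example:

v4l2-ctl --list-devices                      # shows which /dev/video* nodes exist and the driver behind each
v4l2-ctl -d /dev/video0 --list-formats-ext   # lists the resolutions and pixel formats the driver exposes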

And now it is March. Any ETA on R23.2 or R23.3?

R23.2 came out last week.
https://developer.nvidia.com/embedded/linux-tegra

Ah, good! Didn't see it in the embedded software downloads area.

I have flashed my TX1 to the latest OS (R23.2) and I still cannot see the CSI-2 camera that comes with the dev kit under /dev/video0.

In fact, I can't use it in any C++ OpenCV application with the suggested GStreamer method without it throwing an error saying something about appsink.

Am I missing something here? I'm 99% sure it's on my (the user's) end.
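
For anyone else hitting the appsink error: OpenCV's VideoCapture only accepts a GStreamer pipeline string if OpenCV itself was built with GStreamer support, and the pipeline has to terminate in an appsink element that hands frames to OpenCV. A rough, unverified sketch of the kind of string people pass to cv::VideoCapture on the TX1 (element names assume the R23.x nvcamerasrc/nvvidconv plugins):

nvcamerasrc ! video/x-raw(memory:NVMM), width=(int)1280, height=(int)720, format=(string)I420, framerate=(fraction)30/1 ! nvvidconv ! video/x-raw, format=(string)BGRx ! videoconvert ! video/x-raw, format=(string)BGR ! appsink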