Example notebooks for "Getting started with AI on Jetson Nano"

Hi!

I’ve just enrolled in the course and I’d like to follow the examples mentioned (e.g. classification_interactive.ipynb, usb_camera.ipynb, etc.). Since my Jetson Nano is already set up with the Jetson Nano Developer Kit SD Card Image instead of the DLI Jetson Nano SD Card Image, I do not have them. Is it possible to download only the course examples? I’m not keen on either resetting my Nano or using a second SD card and copying the files over… I am aware that I have to install the necessary components myself.

Hi svenw, the SD card image from the DLI course is the intended installation method for the notebooks, as it comes with a number of required dependencies pre-installed that aren’t included in the base image and that aren’t trivial to install standalone.

If you don’t wish to use a separate image, you could check out the updates to Hello AI World, which can be installed on top:
https://devtalk.nvidia.com/default/topic/1057006/jetson-nano/hello-ai-world-now-supports-python-and-onboard-training-with-pytorch-/

Great, I’ll have a look at the examples. Thanks a lot!

I am using the “SparkFun JetBot AI Kit Powered by NVIDIA Jetson Nano” (SparkFun JetBot AI Kit v3.0 Powered by Jetson Nano - KIT-18486 - SparkFun Electronics), which comes with a pre-flashed microSD card. In my case, adding the example code and the missing dependencies is more instructive. One option is to download the Deep Learning Institute (DLI) Jetson Nano SD Card Image that is linked from the course, then unzip it and mount the .img file:

jetbot@jetbot:~/Downloads$ sudo mount -o loop,offset=12582912 dlinano_v1-0-0_image_20GB.img ~/Downloads/img/

Note that the mount point must exist (an empty ~/Downloads/img directory in my case). I used parted to find the offset value within the .img file, as sketched below. The example code is only 1.1 MB and can be found in the nvdli-nano sub-directory of the image’s /home/dlinano.
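For reference, here is a rough sketch of how the offset can be located; the parted output layout and the exact byte value will depend on the image you downloaded (12582912 was simply the value for my copy):

# Print the image's partition table in bytes; the Start value of the
# rootfs (APP) partition is the offset to pass to mount.
sudo parted dlinano_v1-0-0_image_20GB.img unit B print

# Then mount the partition read-only at that offset:
mkdir -p ~/Downloads/img
sudo mount -o loop,ro,offset=12582912 dlinano_v1-0-0_image_20GB.img ~/Downloads/img/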

I did have to clone and install JetCam (GitHub - NVIDIA-AI-IOT/jetcam: Easy to use Python camera interface for NVIDIA Jetson). With JetCam in place, at least the camera notebook works without any problems. However, I have not yet worked through the remainder of the course.
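In case it helps anyone else, the JetCam install boiled down to the repository’s standard setup.py install (steps from memory, so check the project’s README for the current instructions):

# Clone JetCam and install it system-wide so the camera notebooks can import it
git clone https://github.com/NVIDIA-AI-IOT/jetcam
cd jetcam
sudo python3 setup.py install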

It’s kind of ridiculous that we have to jump through these kinds of hoops as beginners just to get the correct notebooks to work through the NVIDIA AI course. The courses look useful at first glance, but the documentation leaves much to be desired as an efficient way to get up and running with these technologies.
It’s taken me a few days and flashing a few different SD images to finally get at these notebooks which are referenced in this course.
https://courses.nvidia.com/courses/course-v1:DLI+C-RX-02+V1/courseware/b2e02e999d9247eb8e33e893ca052206/26aa9f8bdaa948d9b068a8275c89e546/?child=first

This tech is going to be obsolete by the time I can get a decent copy flashed to an SD card.

I agree with what onlinexdjw1 says. I followed a couple of YouTubers like JetsonHacks and Paul McWhorter (TopTechBoy) to get the Nano up and running. I was able to get my headless setup working and then remote JupyterLab.

Then I went to the getting started course and found that I should have flashed the dlinano image, which means that all the configs I set up on my 128 GB SD card might need to be scrapped.

Having two SD card images seems dumb. There is no simple post or article explaining what the difference between the two images is (JetPack, dlinano, etc.); that would help us decide which one to use. I don’t want two. That will complicate things.

So why doesn’t JetPack include the courseware and an integrated set of dependencies so I could at least run the starting notebooks? A single image that everyone can get going with?

There is a Tower of Babel of threads and posts all over this forum and Stack Overflow that makes this seem really knotted up as a developer learning path.

NVIDIA: how about saving data science people some heartache and getting all of this ported to Anaconda? That would save us a lot of wasted time and effort.

@nvidia: it would be nice if the dlinano files/packages were included in the online apt repo. It would alleviate many problems related to locale, SD card size, password confusion, and security.

Update:

I installed Dusty_nv’s inference package (jetson-inference), and it has a lot of good material for getting going with object recognition and other deep learning topics.
By the way, the inference install works flawlessly and everything sets up correctly. You can run all the Python demos on the command line with no problems.
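For anyone starting from scratch, the build I ran followed the usual from-source steps for jetson-inference; treat the following as a rough sketch and check the project’s build instructions for the current dependency list:

# Install build prerequisites (package list may vary by JetPack release)
sudo apt-get update
sudo apt-get install -y git cmake libpython3-dev python3-numpy

# Clone the repo with its submodules, then build and install it
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
mkdir build && cd build
cmake ../
make -j$(nproc)
sudo make install
sudo ldconfig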

Now I boot only the standard distribution SD card image, even though I made a separate dlinano image.
I have already run through all the dlinano image material.

Where can I find the dlinano jupyter notebook sources?
I want to replicate those on my standard distribution.

Great stuff, Dustin.

Thanks!

Hello NVIDIA? I bought your device because it looked so easy, but it’s a damn NIGHTMARE! I’ve been at it for 2 days straight with SD images and git clones, problem after problem trying to figure out how to get it all running, not to mention the lack of video1 in the /dev folder with your JetCard SD image. What’s with that? A driver problem on a board you made TWO CSI ports for? I’m so confused by the coursework due to the lack of updates to the content on your sites as you change images and remove JupyterLab from other updated versions. Christ, I don’t want to mess with files and folders, I just want to code with the assets you apparently had in the coursework, BUT NOPE, they’re not there! classification_interactive.ipynb, WHERE ART THOU!???

Just give us an updated image with everything ready for jupyterlab!

Hi @Asmodev, please see my reply to your other post here:

I think there is some confusion: you are using the JetCard image to follow the AI Fundamentals course, whereas just the standard NVIDIA JetPack SD card image should be used. Each tutorial from the course then has a downloadable container that contains everything for that tutorial.
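For reference, launching a course container looks roughly like this; the image tag (and the exact volume/device mappings) depend on your JetPack/L4T version, so treat the values below as placeholders and copy the actual command from the course setup page:

# Run the DLI course container headless on the host network, mounting a data
# directory and passing through the USB camera (tag below is a placeholder)
sudo docker run --runtime nvidia -it --rm --network host \
    --volume ~/nvdli-data:/nvdli-nano/data \
    --device /dev/video0 \
    nvcr.io/nvidia/dli/dli-nano-ai:<tag-for-your-jetpack-version>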

Dusty,

Are there instructions for setting up headless JupyterLab, like the one installed inside the DLI containers, in a new venv on the Nano without using the containers?

Hi @shirvonp, you should basically be able to follow the same steps that I run in the Dockerfile for l4t-ml to install JupyterLab:

https://github.com/dusty-nv/jetson-containers/blob/6333e058495e689fcceebb6bc04eb72b5ab43893/Dockerfile.ml#L153

Basically run the commands that I do in the RUN statements. For the apt commands, run those with sudo.
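A minimal sketch of doing that natively, adapted loosely from those RUN steps (the exact package versions and extra build dependencies pinned in the Dockerfile may differ from what is shown here):

# Install pip and Node.js (Node is needed for some JupyterLab extensions)
sudo apt-get update
sudo apt-get install -y python3-pip nodejs npm

# Install JupyterLab via pip
python3 -m pip install --upgrade pip setuptools wheel
python3 -m pip install jupyter jupyterlab

# Start JupyterLab for headless/remote access on port 8888
jupyter lab --ip=0.0.0.0 --port=8888 --no-browser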