Python: Working with images in memory instead of files

So I have played a bit with the Python examples (like detectnet-console.py) and they seem to be working fine.

I have an additional question: I would like to use an image that is already loaded into memory instead of going via the file system.

I have an image already in memory, received via MQTT (sent as bytes) and decoded with OpenCV:

import numpy as np
import cv2

# convert the raw MQTT payload (bytes) to a uint8 array
nparr = np.frombuffer(msg.payload, np.uint8)
# decode the compressed image bytes into a BGR image
image = cv2.imdecode(nparr, cv2.IMREAD_COLOR)
image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)

But instead of using it directly, I currently have to save it to disk first, load it again, and then run the detections:

cv2.imwrite(cnbr+'.jpg', image)
# load an image (into shared CPU/GPU memory)
img, width, height = jetson.utils.loadImageRGBA(cnbr+'.jpg')
detections = net.Detect(img, width, height, "box,labels,conf")

This works, but it seems unnecessary since I already have the image in memory to start with. How can I do this directly? I tried cudaFromNumpy(…), but I could not make it work; maybe my numpy ndarray had the wrong format.

Finally, I would like to create a resulting image in the same format as the original, but with the detections drawn on it. Currently I am using

jetson.utils.saveImageRGBA("myoutput.jpg", img, width, height)

but this writes to a file.

Hi

There is a cudaFromNumpy() function in the jetson.utils module that can be used in this case.
Please refer to the dev talk thread below for more detail:
https://devtalk.nvidia.com/default/topic/1063558/getting-image-bits-to-gpu-for-inference-detectnet-/?offset=7
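
As a minimal sketch with the same legacy detectNet API you are already using (the RGBA float32 conversion mirrors what loadImageRGBA() produces; cuda_img, bgr and rgba are just illustrative names):

import numpy as np
import cv2
import jetson.utils

# decode the MQTT payload into a BGR image with OpenCV
nparr = np.frombuffer(msg.payload, np.uint8)
bgr = cv2.imdecode(nparr, cv2.IMREAD_COLOR)

# detectNet expects an RGBA float image, so convert before uploading to CUDA
rgba = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGBA).astype(np.float32)
cuda_img = jetson.utils.cudaFromNumpy(rgba)

# run detection directly on the in-memory image (no file round-trip)
height, width = rgba.shape[0], rgba.shape[1]
detections = net.Detect(cuda_img, width, height, "box,labels,conf")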

You can try using cudaToNumpy() to convert the detection image back to a numpy array:
https://github.com/dusty-nv/jetson-utils/blob/798c416c175d509571859c9290257bd5cce1fd63/python/examples/cuda-to-numpy.py
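
As a rough sketch (assuming the cuda_img/width/height names from above and the older 4-argument form of cudaToNumpy() used in the linked example), the overlay drawn by Detect() can be pulled back to the CPU and JPEG-encoded in memory, e.g. for publishing back over MQTT:

# make sure the GPU has finished drawing the overlay before reading it back
jetson.utils.cudaDeviceSynchronize()

# map the CUDA image back into a numpy array (height x width x 4 channels)
array = jetson.utils.cudaToNumpy(cuda_img, width, height, 4)

# convert back to BGR uint8 and JPEG-encode in memory instead of saving a file
bgr_out = cv2.cvtColor(array.astype(np.uint8), cv2.COLOR_RGBA2BGR)
ok, jpeg = cv2.imencode('.jpg', bgr_out)
payload = jpeg.tobytes()   # e.g. publish this back over MQTT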

Jetson inference examples:
https://github.com/dusty-nv/jetson-inference/tree/master/python/examples

Thanks

Thank you!!! Tried that and it works brilliantly, very good!