How can I see the GPU memory used?

Hi,

I want to monitor GPU memory usage on the TX2, but tegrastats only shows RAM, and top only shows CPU memory. Is it correct to calculate it as GPU MEMORY USED = RAM - (RES - SHR)?

Check with cudaMemGetInfo().

Here is some sample code:

https://devtalk.nvidia.com/default/topic/974063/jetson-tx1/caffe-failed-with-py-faster-rcnn-demo-py-on-tx1/post/5013784/#5013784

Hi, ShaneCCC

When I ran this demo, the GPU memory usage shown was:

GPU memory usage: used = 5258.76, free = 2591.95 MB, total = 7850.71 MB
GPU memory usage: used = 5260.53, free = 2590.18 MB, total = 7850.71 MB
GPU memory usage: used = 5260.52, free = 2590.19 MB, total = 7850.71 MB
GPU memory usage: used = 5261.14, free = 2589.57 MB, total = 7850.71 MB
GPU memory usage: used = 5261.41, free = 2589.3 MB, total = 7850.71 MB
GPU memory usage: used = 5261.41, free = 2589.3 MB, total = 7850.71 MB
GPU memory usage: used = 5261.41, free = 2589.3 MB, total = 7850.71 MB
GPU memory usage: used = 5261.37, free = 2589.34 MB, total = 7850.71 MB
GPU memory usage: used = 5261.43, free = 2589.28 MB, total = 7850.71 MB
GPU memory usage: used = 5261.62, free = 2589.09 MB, total = 7850.71 MB
GPU memory usage: used = 5261.62, free = 2589.09 MB, total = 7850.71 MB
GPU memory usage: used = 5262.12, free = 2588.59 MB, total = 7850.71 MB
GPU memory usage: used = 5263.9, free = 2586.8 MB, total = 7850.71 MB
GPU memory usage: used = 5263.93, free = 2586.77 MB, total = 7850.71 MB
GPU memory usage: used = 5263.98, free = 2586.73 MB, total = 7850.71 MB
GPU memory usage: used = 5263.98, free = 2586.73 MB, total = 7850.71 MB
GPU memory usage: used = 5264.23, free = 2586.47 MB, total = 7850.71 MB
GPU memory usage: used = 5264.74, free = 2585.97 MB, total = 7850.71 MB
GPU memory usage: used = 5265, free = 2585.71 MB, total = 7850.71 MB
GPU memory usage: used = 5265.07, free = 2585.63 MB, total = 7850.71 MB
GPU memory usage: used = 5265.54, free = 2585.16 MB, total = 7850.71 MB
GPU memory usage: used = 5267.22, free = 2583.48 MB, total = 7850.71 MB

And when I ran ./tegrastats, it showed:

RAM 4454/7851MB (lfb 519x4MB) cpu [38%@1114,0%@345,0%@345,38%@1112,40%@1113,41%@1113] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 519x4MB) cpu [37%@498,0%@345,0%@345,37%@499,35%@498,37%@499] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 519x4MB) cpu [43%@960,0%@345,0%@345,39%@959,35%@959,34%@959] EMC 28%@1866 APE 150 NVDEC 1203 GR3D 82%@1300
RAM 4455/7851MB (lfb 519x4MB) cpu [35%@960,0%@346,0%@345,37%@1266,36%@1266,38%@1267] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 519x4MB) cpu [39%@670,0%@345,0%@345,36%@653,37%@652,36%@652] EMC 30%@1866 APE 150 NVDEC 1203 GR3D 96%@1300
RAM 4455/7851MB (lfb 519x4MB) cpu [38%@1116,0%@345,0%@345,35%@1113,34%@1113,35%@1113] EMC 30%@1866 APE 150 NVDEC 1203 GR3D 99%@1236
RAM 4455/7851MB (lfb 519x4MB) cpu [36%@806,0%@345,0%@345,37%@806,35%@806,37%@806] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 519x4MB) cpu [41%@806,0%@345,0%@345,36%@762,39%@806,40%@805] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 517x4MB) cpu [39%@1259,0%@345,0%@345,37%@1142,40%@1140,41%@1140] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 517x4MB) cpu [36%@499,0%@345,0%@345,36%@499,44%@499,37%@499] EMC 28%@1866 APE 150 NVDEC 1203 GR3D 91%@1300
RAM 4455/7851MB (lfb 517x4MB) cpu [38%@806,0%@345,0%@345,35%@806,42%@806,36%@806] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 82%@1300
RAM 4456/7851MB (lfb 517x4MB) cpu [37%@1421,0%@345,0%@345,39%@1420,38%@1420,35%@1420] EMC 30%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 517x4MB) cpu [43%@959,8%@345,6%@345,42%@960,47%@960,46%@958] EMC 31%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4456/7851MB (lfb 517x4MB) cpu [38%@806,0%@345,0%@345,36%@805,35%@806,35%@806] EMC 30%@1866 APE 150 NVDEC 1203 GR3D 82%@1300
RAM 4456/7851MB (lfb 517x4MB) cpu [40%@499,0%@345,0%@345,37%@499,33%@499,34%@499] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4455/7851MB (lfb 516x4MB) cpu [47%@959,5%@345,5%@346,46%@1113,48%@1113,50%@1114] EMC 31%@1866 APE 150 NVDEC 1203 GR3D 94%@1300
RAM 4458/7851MB (lfb 516x4MB) cpu [40%@959,1%@1677,2%@1674,45%@959,37%@959,36%@960] EMC 33%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4458/7851MB (lfb 516x4MB) cpu [35%@652,17%@1574,53%@1574,35%@653,34%@652,38%@752] EMC 34%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4458/7851MB (lfb 516x4MB) cpu [33%@498,6%@345,1%@345,37%@499,31%@499,28%@499] EMC 32%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4459/7851MB (lfb 516x4MB) cpu [38%@1114,8%@1333,25%@1829,43%@1108,35%@1113,37%@1112] EMC 31%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4459/7851MB (lfb 516x4MB) cpu [34%@653,9%@499,1%@499,31%@652,39%@653,31%@655] EMC 30%@1866 APE 150 NVDEC 1203 GR3D 99%@1300
RAM 4459/7851MB (lfb 516x4MB) cpu [42%@1881,0%@960,0%@960,39%@1881,36%@1881,37%@1880] EMC 29%@1866 APE 150 NVDEC 1203 GR3D 88%@1300

Why is the GPU memory used greater than the RAM used? And when the GPU free memory seems almost exhausted, why can I still start another process without it crashing?

What's the demo you use to check the GPU memory?

#include <iostream>
#include <cstdlib>
#include <unistd.h>
#include <cuda_runtime.h>

int main()
{
    // Poll GPU memory usage once per second via the CUDA runtime API.
    size_t free_byte;
    size_t total_byte;

    while (true)
    {
        cudaError_t cuda_status = cudaMemGetInfo(&free_byte, &total_byte);

        if (cudaSuccess != cuda_status) {
            std::cout << "Error: cudaMemGetInfo failed, " << cudaGetErrorString(cuda_status) << std::endl;
            exit(1);
        }

        // Used memory is simply total minus free, reported in MB below.
        double free_db = (double)free_byte;
        double total_db = (double)total_byte;
        double used_db = total_db - free_db;

        std::cout << "GPU memory usage: used = " << used_db/1024.0/1024.0 << ", free = "
                  << free_db/1024.0/1024.0 << " MB, total = " << total_db/1024.0/1024.0 << " MB" << std::endl;
        sleep(1);
    }

    return 0;
}
nvcc test.cu -o test

There is no separate GPU memory on Tegra; the CPU and GPU share the same system memory.
Both tools read /proc/meminfo, but tegrastats does some extra processing of that data, as shown below, so its figure is smaller:

usedMemory = mem->totalRAMkB - mem->freeRAMkB - mem->buffersRAMkB - mem->cachedRAMkB
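
For reference, here is a minimal sketch (my own illustration, not the tegrastats source) that reproduces that figure by reading the standard MemTotal, MemFree, Buffers and Cached fields from /proc/meminfo:

#include <fstream>
#include <iostream>
#include <sstream>
#include <string>

// Read one field (reported in kB) from /proc/meminfo, e.g. "MemTotal:".
static long read_meminfo_kb(const std::string &key)
{
    std::ifstream meminfo("/proc/meminfo");
    std::string line;
    while (std::getline(meminfo, line)) {
        if (line.compare(0, key.size(), key) == 0) {
            std::istringstream value(line.substr(key.size()));
            long kb = 0;
            value >> kb;
            return kb;
        }
    }
    return 0;
}

int main()
{
    long total   = read_meminfo_kb("MemTotal:");
    long free_kb = read_meminfo_kb("MemFree:");
    long buffers = read_meminfo_kb("Buffers:");
    long cached  = read_meminfo_kb("Cached:");

    // Same subtraction tegrastats applies: buffers and page cache are not
    // counted as "used", so this number is smaller than total - free.
    long used = total - free_kb - buffers - cached;

    std::cout << "RAM " << used / 1024 << "/" << total / 1024 << "MB" << std::endl;
    return 0;
}

Since the cudaMemGetInfo() demo only computes total minus free without that subtraction, its "used" value would be expected to come out larger than the RAM figure tegrastats prints, which matches the numbers above.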

Could someone please explain why the output of the code above differs from the output shown in the thread "cudaMemGetInfo() how does it work?!?" on the CUDA Programming and Performance forum (NVIDIA Developer Forums)?

In that thread the author also used cudaMemGetInfo(), but the reported memory amounts are totally different.

Can the GPU use memory that is not being used by the CPU?

For example: CPU usage 5 GB, GPU usage 3 GB.
Is it possible to use it like this?