Don’t worry, I figured it out (it seems obvious now). The moving-average scale factors in my BatchNorm layers were set to zero. I changed them to non-zero values from Python, and now it works.
import caffe

# load the caffe model (deploy prototxt + trained weights)
net_model = caffe.Net(deploy_file_path, caffemodel_path, caffe.TEST)
# set the moving-average scale factor of every BatchNorm layer to 1 (or any non-zero value)
for name, layer in zip(net_model._layer_names, net_model.layers):
    if layer.type == "BatchNorm":
        # blob[2] of a BatchNorm layer holds the moving-average scale factor
        net_model.params[name][2].data[0] = 1
# then save it back
net_model.save(caffemodel_path)
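For context, here is a small sketch of why a zero factor breaks inference. At test time Caffe divides the stored mean and variance blobs by this scale factor (guarding against division by zero by substituting 0), so a zero factor collapses the statistics to zero and the normalization blows up. This is a plain-Python illustration of that arithmetic, not Caffe's actual implementation:

```python
import math

def bn_inference(x, mean_blob, var_blob, scale_factor, eps=1e-5):
    # mirror Caffe's guard: a zero scale factor zeroes the statistics
    s = 0.0 if scale_factor == 0 else 1.0 / scale_factor
    mean = mean_blob * s
    var = var_blob * s
    return (x - mean) / math.sqrt(var + eps)

# zero factor: statistics collapse to 0, output is divided by sqrt(eps) and explodes
print(bn_inference(2.0, mean_blob=10.0, var_blob=4.0, scale_factor=0.0))
# non-zero factor: normal, well-scaled output
print(bn_inference(2.0, mean_blob=10.0, var_blob=4.0, scale_factor=1.0))
```

With a zero factor the output is roughly x / sqrt(eps), hundreds of times too large, which matches the garbage outputs this bug produces.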
If you need any help with this issue, let me know.