We have a TensorFlow model frozen as a pb file and want to accelerate its inference speed, so we tried the uff-converter-tf tool in the TensorRT 3.0 RC SDK.
uff-converter-tf generates too many custom nodes, and all of them come from the TensorFlow frozen pb file. We even updated the converter.py file to make it go further, but we are still stuck.
def convert_tf2uff_field(cls, code, val):
    if isinstance(val, AttrValue):
        val = getattr(val, code)
    if code == 'i':
        # Some int attributes arrive wrapped in a message with an 'i' field.
        if hasattr(val, 'i'):
            return int(val.i)
        else:
            return int(val)
    elif code == 'f':
        return float(val)
    elif code == 's':
        return str(val)
    elif code == 'b':
        return bool(val)
    elif code == 'type':
        return TensorFlowToUFFConverter.convert_tf2numpy_dtype(val.type)
    elif code == 'list':
        fields = val.list.ListFields()
        if len(fields) == 0:
            return None
        elif len(fields) > 1:
            raise ValueError("Invalid list field")
        else:
            field_desc, field_value = fields[0]
            sub_code = field_desc.name
            # Look up the per-element code, not the outer 'list' code:
            # indexing with `code` here would always yield 'dim_orders'.
            uff_code = {'i': 'i', 'f': 'f', 's': 's', 'b': 'b',
                        'type': 'dtype', 'list': 'dim_orders'}[sub_code]
            return uff.List(uff_code,
                            [cls.convert_tf2uff_field(sub_code, v)
                             for v in field_value])
    elif code == 'shape':
        shp = val.dim
        if hasattr(shp, 'unknown_rank') and shp.unknown_rank:
            raise ValueError(
                "Unsupported: shape attribute with unknown rank")
        return uff.List('i', [dim.size for dim in shp])
    else:
        print(val)
        raise TypeError("Unsupported field type: " + code)
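The 'list' branch above is the subtle part: the element type code recovered from ListFields() must drive both the UFF type lookup and the per-element conversion. A minimal self-contained sketch of that logic (with a hypothetical UffList stand-in, since the real uff and TensorFlow classes are not assumed here):

```python
# Hypothetical stand-in for uff.List, for illustration only.
class UffList:
    def __init__(self, code, values):
        self.code, self.values = code, values

def convert_field(code, val):
    # Simplified scalar conversion, mirroring the 'i'/'f'/'b' branches.
    if code == 'i':
        return int(val)
    elif code == 'f':
        return float(val)
    elif code == 'b':
        return bool(val)
    raise TypeError("Unsupported field type: " + code)

def convert_list(sub_code, field_value):
    # Mirror of the 'list' branch: the element code selects the UFF
    # element type and is passed down to each element's conversion.
    uff_code = {'i': 'i', 'f': 'f', 'b': 'b'}[sub_code]
    return UffList(uff_code, [convert_field(sub_code, v) for v in field_value])

lst = convert_list('i', ['1', '2', '3'])
print(lst.code, lst.values)  # i [1, 2, 3]
```

If the outer 'list' code were used for the lookup instead, every list attribute would be tagged the same way regardless of what its elements are, which is the kind of silent mismatch that later surfaces as unsupported custom nodes.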
We are stuck at: split_3's outputs feed Mul[0-63], but the tool raises
UFFException('.../split_3:63' input doesn't exist)
The Mul[63] input should connect to output index 63 of split_3. What should we do to make the pb-to-UFF conversion succeed?
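For context on that error: in a TensorFlow GraphDef, an input string like 'split_3:63' means output index 63 of the op named split_3 (a bare op name implies output 0). A converter that looks up nodes by the full 'name:index' string will report that 'split_3:63' does not exist; it has to split the name first. A minimal sketch of that parsing (the helper name is ours, not part of the SDK):

```python
def split_tensor_name(name):
    """Split a TensorFlow tensor name such as 'split_3:63' into
    (op_name, output_index). A bare op name means output index 0."""
    if ':' in name:
        op_name, idx = name.rsplit(':', 1)
        return op_name, int(idx)
    return name, 0

print(split_tensor_name('split_3:63'))  # ('split_3', 63)
print(split_tensor_name('Mul'))         # ('Mul', 0)
```

Whether TensorRT 3.0's UFF format can then represent a 64-output Split node is a separate question, but the lookup itself should be done on the op name, with the index carried alongside.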