Building Tensorflow 2.0 and Tensorflow Lite Python Runtime

Hi,

I have successfully built Tensorflow 2.0 natively for the Jetson Nano. (By the way, I am preparing a GitHub project to share my build scripts and binaries; it should arrive next week.)

Now I would also like to build the associated Tensorflow Lite runtime for Python 3.6 (not provided by Google), but I am hitting an error that looks like a system error, not something directly linked to the Tensorflow source code itself. I am using the latest version of JetPack, the Ubuntu packages are up to date, and I am using Python 3.6 as provided in the JetPack.

While trying to debug the origin of the problem, I put together this small test, which reproduces exactly the same error.

Any idea what could cause this error between Python's subprocess.check_call() and the make command?

(virtual-env) fab@JetsonNano:/opt/local/tmp/tensorflow/tensorflow/lite/tools/pip_package$ make
make docker-image -- build docker image
make docker-shell -- run shell inside the docker image
make docker-build -- build wheel inside the docker image
make clean        -- remove built wheel files

(virtual-env) fab@JetsonNano:/opt/local/tmp/tensorflow/tensorflow/lite/tools/pip_package$ python
Python 3.6.8 (default, Oct  7 2019, 12:59:55) 
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import subprocess
>>> subprocess.check_call(["echo", "Hello World"])
Hello World
0
>>> subprocess.check_call(["make", "clean"])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python3.6/subprocess.py", line 306, in check_call
    retcode = call(*popenargs, **kwargs)
  File "/usr/lib/python3.6/subprocess.py", line 287, in call
    with Popen(*popenargs, **kwargs) as p:
  File "/usr/lib/python3.6/subprocess.py", line 729, in __init__
    restore_signals, start_new_session)
  File "/usr/lib/python3.6/subprocess.py", line 1364, in _execute_child
    raise child_exception_type(errno_num, err_msg, err_filename)
OSError: [Errno 8] Exec format error: 'make'
>>> exit()

(virtual-env) fab@JetsonNano:/opt/local/tmp/tensorflow/tensorflow/lite/tools/pip_package$ make --version
GNU Make 4.1
Built for aarch64-unknown-linux-gnu
Copyright (C) 1988-2014 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
(virtual-env) fab@JetsonNano:/opt/local/tmp/tensorflow/tensorflow/lite/tools/pip_package$
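
For reference, here is the small check I plan to run next inside the same virtual-env, to see which make Python actually resolves and whether that file is a proper aarch64 ELF binary or a script with a shebang (just a diagnostic sketch, nothing Tensorflow-specific):

import shutil

# Which "make" does Python's PATH lookup resolve inside the virtual-env?
make_path = shutil.which("make")
print("make resolved to:", make_path)

if make_path is None:
    print("no make found on PATH at all")
else:
    # Peek at the first bytes: a native binary starts with b'\x7fELF',
    # and a script needs a '#!' shebang to be exec()'d at all.
    with open(make_path, "rb") as f:
        head = f.read(4)
    print("first bytes:", head)
    if head.startswith(b"\x7fELF"):
        print("native ELF binary")
    elif head.startswith(b"#!"):
        print("script with a shebang")
    else:
        print("neither ELF nor shebang -> exec() fails with 'Exec format error'")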

Hi,

Could you check if this comment helps?
python - Flask CLI throws 'OSError: [Errno 8] Exec format error' when run through docker-compose - Stack Overflow

Thanks.

@JetFab Any progress on the GitHub repository? It would be awesome if you could share your build of TF2, especially if you have a wheel!

I've been trying to get it to build for a while now. It takes more than 24 hours to compile on the Nano, and it's a bit of a pain to get all of the dependencies set up correctly (e.g. compiling bazel, installing keras_preprocessing, etc.). I launch a build, wait 24 hours, see why it failed, adjust one little thing, then wait another 24 hours. Plus, the bazel build uses about 10 GB of cache space, which is a lot of resources on a Nano. It's a nightmare!

Hello,

So far I am still unable to build the TF Lite 2.0 runtime for Python 3.6 (aarch64), but I will first focus on finishing the tests with tf.lite in TF 2.0.

@avejidah, you're right: it took ~1.5 days to build everything, and especially TF 2.0 natively on the Nano, and it took me lots of tries to find the right settings. I'll do my best to push at least the build scripts, the Bazel binary and the TF wheel package to GitHub by the end of the week. The test Jupyter notebook might take a few more days to be ready (and we'll see about the TF Lite Python runtime after that).

Awesome, thanks @JetFab. And, for what it's worth, I finally got TF2 to build on the Nano.

Hi,

As promised, here are my scripts for building Tensorflow 2.0 GPU for the Jetson Nano (aarch64).

Lots of the steps to configure the Nano board are borrowed from JetsonHacks ;-) (other tips for building Tensorflow were gathered from this forum too).

  • A Tensorflow 2.0 GPU wheel package for Python 3.6 is available in the release section (along with a Bazel binary).

  • The Jupyter notebook is still a work in progress: bad results with tf.lite and the TPU, but that's for another forum… ;-D I still need to experiment with TensorRT and Tensorflow too.

  • Otherwise, I am still unable to build the Tensorflow Lite 2.0 runtime for Python 3.6. Any help is welcome!

  • Tensorflow is quite heavy on RAM, so don't forget to set up a swapfile (if you are not installing everything via my scripts).

I'm pretty impressed by the Jetson Nano board (and quite surprised that training a small neural network on this inference board appears to run on the GPU!).

Anyway, here is the repo. (Note: I have not rerun the scripts before uploading them; they should work OK. Just let me know whether or not they work for you if you test them.)

https://github.com/fdasilva59/DeepNano

Enjoy!

A few more details on my attempts to build the Tensorflow 2.0 Lite runtime:

Replacing

subprocess.check_call(make_args(quiet=False))

with

os.system('make SHELL=/bin/bash BUILD_WITH_NNAPI=false -C /opt/local/tmp/tensorflow/tensorflow/lite/tools/pip_package/../../../..  TARGET=aarch64 -f tensorflow/lite/tools/make/Makefile -j 1')

seems to bypass the OSError, but the compilation then fails a bit later (traceback below).
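
Side note: an equivalent I still want to try, which stays within subprocess instead of switching to os.system (just a sketch; I am assuming the make_args() helper above returns the full make command as a list of strings, which is what check_call() was being given):

import subprocess

# Same workaround idea, but keeping subprocess's return-code checking:
# join the argument list and run it through an explicit bash shell.
cmd = " ".join(make_args(quiet=False))
subprocess.check_call(cmd, shell=True, executable="/bin/bash")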

File "setup.py", line 122, in run
    out = super(CustomBuildExt, self).run()
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 339, in run
    self.build_extensions()
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 448, in build_extensions
    self._build_extensions_serial()
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 473, in _build_extensions_serial
    self.build_extension(ext)
  File "/usr/lib/python3.6/distutils/command/build_ext.py", line 558, in build_extension
    target_lang=language)
  File "/usr/lib/python3.6/distutils/ccompiler.py", line 717, in link_shared_object
    extra_preargs, extra_postargs, build_temp, target_lang)
  File "/usr/lib/python3.6/distutils/unixccompiler.py", line 170, in link
    libraries)
  File "/usr/lib/python3.6/distutils/ccompiler.py", line 1089, in gen_lib_options
    lib_opts.append(compiler.library_dir_option(dir))
  File "/usr/lib/python3.6/distutils/unixccompiler.py", line 218, in library_dir_option
    return "-L" + dir
TypeError: must be str, not int
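
To pinpoint where that int comes from, the first thing I will probably try is to dump every library_dirs entry and its type, since the traceback shows "-L" + dir blowing up inside distutils (a rough diagnostic sketch based only on the traceback above; dump_library_dirs is just a throwaway helper I would drop into setup.py):

# Throwaway helper: print each library_dirs entry and its Python type,
# to find the int that breaks '"-L" + dir' in distutils.unixccompiler.
def dump_library_dirs(extensions):
    for ext in extensions:
        for d in list(ext.library_dirs) + list(ext.runtime_library_dirs):
            print(ext.name, "->", repr(d), type(d))

# e.g. call dump_library_dirs(ext_modules) right before setup() in setup.py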

I have not debugged this last error in detail yet, though. (To be continued…)