NOTE: IREE's Python API is currently being reworked. Some of these instructions may be in a state of flux as they document the end state.
You should have already completed the Python getting started guide.
IREE's integrations with parts of TensorFlow are mediated by standalone binaries that can be built individually or installed from a distribution. These binaries are `iree-import-tf`, `iree-import-tflite`, and `iree-import-xla`, and they are configured in the `iree_tf_compiler` BUILD file. You have a few options for how to obtain these binaries.
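As a rough sketch of how one of these tools is used once you have it (the paths here are placeholders and the exact flag set may differ between versions; consult the tool's `--help` output), `iree-import-tf` converts a TensorFlow SavedModel into MLIR that the IREE compiler can consume:

```shell
# Illustrative only: convert a TensorFlow SavedModel directory into MLIR
# for consumption by the IREE compiler. The paths are placeholders.
iree-import-tf /path/to/saved_model -o /tmp/iree_input.mlir
```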
TensorFlow only supports the Bazel build system. If building any parts of TensorFlow yourself, you must have a working bazel command on your path. See the relevant “OS with Bazel” getting started doc for more information.
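A quick way to confirm that a working Bazel is on your path:

```shell
# Verify that Bazel is installed and reachable from the shell.
bazel --version
```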
Warning:
Building the TensorFlow integration binaries takes a very long time, especially on smaller machines (IREE developers who work on these typically use machines with 96 cores).
For example, to run TensorFlow-based tests, you can build `iree-import-tf`:
```shell
cd integrations/tensorflow
bazel build //iree_tf_compiler:iree-import-tf
```
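Bazel places build outputs under the `bazel-bin` convenience symlink, so after the build completes you can sanity-check the tool roughly as follows (this assumes the tool exposes the usual `--help` flag):

```shell
# The built importer lands under Bazel's bazel-bin output tree.
./bazel-bin/iree_tf_compiler/iree-import-tf --help
```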
TODO(laurenzo): Document how this works
The following CMake flags control IREE's TensorFlow integrations:
- `-DIREE_BUILD_TENSORFLOW_COMPILER=ON`: build the TensorFlow integration.
- `-DIREE_BUILD_TFLITE_COMPILER=ON`: build the TFLite integration.
- `-DIREE_BUILD_XLA_COMPILER=ON`: build the XLA integration.
- `-DIREE_TF_TOOLS_ROOT`: path to a directory containing separately-built tools for the enabled integrations.

The IREE Python bindings are only buildable with CMake. Continuing from above, to build with TensorFlow support, add `-DIREE_BUILD_TENSORFLOW_COMPILER=ON` to your invocation. If you obtained the integration binaries by a method other than building them with Bazel, you will also need to pass the path to the directory in which they are located: `-DIREE_TF_TOOLS_ROOT=path/to/dir` (it defaults to the location where the Bazel build creates them). From the IREE root directory:
```shell
$ cmake -B ../iree-build-tf -G Ninja \
    -DIREE_BUILD_PYTHON_BINDINGS=ON \
    -DIREE_BUILD_TENSORFLOW_COMPILER=ON .
$ cmake --build ../iree-build-tf
```
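If you enabled the other integrations as well, or obtained the import tools by some means other than the Bazel build, the configure step might instead look like the following sketch (the tools directory path is a placeholder):

```shell
# Illustrative variant: enable all three integrations and point CMake at
# separately-obtained import tools rather than the default Bazel output location.
$ cmake -B ../iree-build-tf -G Ninja \
    -DIREE_BUILD_PYTHON_BINDINGS=ON \
    -DIREE_BUILD_TENSORFLOW_COMPILER=ON \
    -DIREE_BUILD_TFLITE_COMPILER=ON \
    -DIREE_BUILD_XLA_COMPILER=ON \
    -DIREE_TF_TOOLS_ROOT=/path/to/tools/dir .
$ cmake --build ../iree-build-tf
```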
To run tests for the TensorFlow integration, which include end-to-end backend comparison tests:
```shell
$ cd ../iree-build-tf
$ ctest -R 'tensorflow_e2e|bindings/python|integrations/tensorflow/' \
    --output-on-failure

# Or run individually as:
$ export PYTHONPATH=$(pwd)/bindings/python
# This is a Python 3 program. On some systems, such as Debian derivatives,
# use 'python3' instead of 'python'.
$ python ../iree/integrations/tensorflow/e2e/simple_arithmetic_test.py \
    --target_backends=iree_vmla --artifacts_dir=/tmp/artifacts
```
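To see which tests the regular expression matches before running anything, `ctest` can list them without executing:

```shell
# List (without running) the tests matched by the same regular expression.
$ ctest -N -R 'tensorflow_e2e|bindings/python|integrations/tensorflow/'
```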