Update docs for building TF and reproducing test failures. (#13295)
The workflow is the same, but we're no longer so dependent on using
Bazel or installing importers from the nightly releases.
Tested on Linux and Windows. Windows is strangely broken in TF/Keras and
I can't figure out why:
`ImportError: cannot import name 'deserialize_keras_object' from
partially initialized module 'keras.saving.legacy.serialization' (most
likely due to a circular import)` ([more logs
here](https://gist.github.com/ScottTodd/050ec7f1d8f489e618ef6158e129e853))
diff --git a/docs/developers/debugging/tf_integrations_test_repro.md b/docs/developers/debugging/tf_integrations_test_repro.md
index 0c2e989..c6c2c55 100644
--- a/docs/developers/debugging/tf_integrations_test_repro.md
+++ b/docs/developers/debugging/tf_integrations_test_repro.md
@@ -1,47 +1,76 @@
# Debugging failures in TF/TFLite integration tests.
-These are steps to reproduce/address failures in TF/TFLite integration tests. All steps here
-assume starting from the IREE root directory.
+These are steps to reproduce/address failures in TF/TFLite integration tests.
+These instructions are most stable on Linux, though they may work with a few
+tweaks on Windows and macOS.
-1. First setup the python environment as described [here](https://openxla.github.io/iree/building-from-source/python-bindings-and-importers/#environment-setup).
+All steps here assume starting from the IREE root directory.
-```
-python -m venv iree.venv
-source iree.venv/bin/activate
-```
+1. First create a Python virtual environment to install packages into:
-2. Install latest IREE release binaries. The importers are not expected to change much, so using the release binaries should work for most cases
+ ```bash
+ python -m venv iree-tf.venv
+ source iree-tf.venv/bin/activate
-```
-python -m pip install iree-compiler iree-runtime iree-tools-tf iree-tools-tflite --find-links https://openxla.github.io/iree/pip-release-links.html
-```
+ # Install test requirements
+ python -m pip install -r ./integrations/tensorflow/test/requirements.txt
+ ```
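+
+   To confirm the environment is active, a quick generic Python check (not
+   IREE-specific) is that `sys.prefix` diverges from the base interpreter's
+   prefix:
+
+   ```python
+   import sys
+
+   def in_virtualenv() -> bool:
+       """True when running inside a virtual environment (venv/virtualenv)."""
+       return sys.prefix != sys.base_prefix
+
+   print(in_virtualenv())
+   ```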
-3. Install TF nightly
+2. Install IREE's tools and Python bindings or build them from source
-```
-python -m pip install tf-nightly Pillow
-```
+   Install the distributed packages
-4. Run the python test command line. The command can be obtained from the run file. For example, if iree_tfl_tests/llvmcpu_posenet_i8.run failed,
+ ```bash
+ # Install packages from nightly releases
+ # This should work for most cases, as the importers change infrequently
+ python -m pip install \
+ iree-compiler iree-runtime iree-tools-tf iree-tools-tflite \
+ --find-links https://openxla.github.io/iree/pip-release-links.html
+ ```
-```
-cd integrations/tensorflow/test/
-cat iree_tfl_tests/llvmcpu_posenet_i8.run
+ _OR_ build from source
-# REQUIRES: llvmcpu
-# RUN: %PYTHON -m iree_tfl_tests.posenet_i8_test --target_backend=llvmcpu --artifacts_dir=%t
+ ```bash
+ # Build Python bindings from source
+ cmake -G Ninja -B ../iree-build/ -DIREE_BUILD_PYTHON_BINDINGS=ON .
+ cmake --build ../iree-build/
-cd python/
-python -m iree_tfl_tests.posenet_i8_test --target_backend=llvmcpu --artifacts_dir=/tmp/posenet_i8_failure
-```
+ # Add IREE built-from-source Python packages to PYTHONPATH
+ source .env
-Note that the command can only be run under `integrations/tensorflow/test/python` directory.
+ # Install IREE TF/TFLite Python packages
+ python -m pip install integrations/tensorflow/python_projects/iree_tf
+ python -m pip install integrations/tensorflow/python_projects/iree_tflite
+ ```
-5. This will create an `iree_input.mlir` in the temp directory specified. Those can then be fed into `iree-compile` (built locally to reproduce the error)
+3. Run the Python test command line
-```
-iree-compile \
- --iree-hal-target-backends=llvm-cpu \
- --iree-input-type=mhlo \
- iree_input.mlir
-```
+ The command can be obtained from the run file. For example, if
+ `iree_tfl_tests/llvmcpu_posenet_i8.run` failed,
+
+ ```bash
+ cd integrations/tensorflow/test/
+ cat iree_tfl_tests/llvmcpu_posenet_i8.run
+
+ # REQUIRES: llvmcpu
+ # RUN: %PYTHON -m iree_tfl_tests.posenet_i8_test --target_backend=llvmcpu --artifacts_dir=%t
+
+ cd python/
+ python -m iree_tfl_tests.posenet_i8_test --target_backend=llvmcpu --artifacts_dir=/tmp/posenet_i8_failure
+ ```
+
+   Note that the command can only be run under the
+   `integrations/tensorflow/test/python` directory.
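+
+   The `# RUN:` line in the run file uses lit-style placeholders: `%PYTHON`
+   stands for the Python interpreter and `%t` for a scratch directory. A rough
+   sketch of that expansion (illustrative only, not the real lit
+   implementation):
+
+   ```python
+   import sys
+   import tempfile
+
+   def expand_run_line(line: str) -> str:
+       """Expand lit-style placeholders in a '# RUN:' line (sketch only)."""
+       cmd = line.removeprefix("# RUN:").strip()
+       return cmd.replace("%PYTHON", sys.executable).replace("%t", tempfile.mkdtemp())
+
+   run_line = "# RUN: %PYTHON -m iree_tfl_tests.posenet_i8_test --target_backend=llvmcpu --artifacts_dir=%t"
+   print(expand_run_line(run_line))
+   ```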
+
+4. Extract intermediate files and use with native tools
+
+   The test will create an `iree_input.mlir` file in the specified temp
+   directory. That file can then be fed into `iree-compile` (built locally)
+   to reproduce the error:
+
+ ```bash
+ iree-compile \
+ --iree-hal-target-backends=llvm-cpu \
+ --iree-input-type=mhlo \
+ iree_input.mlir
+ ```
diff --git a/integrations/tensorflow/iree_tf_compiler/README.md b/integrations/tensorflow/iree_tf_compiler/README.md
index 4e0e081..18f5d8d 100644
--- a/integrations/tensorflow/iree_tf_compiler/README.md
+++ b/integrations/tensorflow/iree_tf_compiler/README.md
@@ -1,24 +1,2 @@
-This directory should be the only one in IREE that pulls a dependency on `tf`
-dialect and related dialects.
-
-# Tools
-
-## Production Tools
-
-### iree-import-tf
-
-`iree-import-tf` provides a single entry-point for compiling TensorFlow saved
-models to "IREE Input Dialects" that can be fed to `iree-compile` or
-`iree-opt` and operated on further.
-
-#### Usage
-
-```shell
-iree-import-tf /path/to/saved_model_v2
-# Optional args: --tf-savedmodel-exported-names=subset,of,exported,names
-
-iree-import-tf /path/to/saved_model_v1 --tf-import-type=savedmodel_v1
-# Optional args:
-# --tf-savedmodel-exported-names=subset,of,exported,names
-# --tf-savedmodel-tags=serving
-```
+This directory should be the only one in IREE that pulls a dependency on
+TensorFlow C++ sources.