Refresh website ML framework / Python overview pages. (#13572)
Doing some general cleanup before working on JAX docs
(https://github.com/openxla/iree/issues/5454)
### Homepage
| Current website | Demo for this PR |
| -- | -- |
| [home](https://openxla.github.io/iree/) | [home](https://scotttodd.github.io/iree/) |
* Reworked "Workflow overview" section
### bindings/python
| Current website | Demo for this PR |
| -- | -- |
| [bindings/python](https://openxla.github.io/iree/bindings/python/) | [bindings/python](https://scotttodd.github.io/iree/bindings/python/) |
* Inlined the nightly release install instructions into a content tab
* Dropped `iree-tools-*` from the base instructions
* Added icons to section headings
### Getting Started (frameworks intro)
| Current website | Demo for this PR |
| -- | -- |
| [getting-started](https://openxla.github.io/iree/getting-started/) | [getting-started](https://scotttodd.github.io/iree/getting-started/) |
* Rewrote "export/import" explanation
* Removed pip install instructions (each subpage lists what it needs)
* Added icons to section headings
* Tweaked sample links
diff --git a/docs/website/docs/bindings/python.md b/docs/website/docs/bindings/python.md
index eecaa3b..3d94500 100644
--- a/docs/website/docs/bindings/python.md
+++ b/docs/website/docs/bindings/python.md
@@ -24,9 +24,9 @@
!!! Caution "Caution - Operating system support"
Packages are currently only available on Linux and macOS. They are not
available on Windows yet (see
- [this issue](https://github.com/openxla/iree/issues/6417)).
+ [this issue](https://github.com/openxla/iree/issues/13484)).
-## Prerequisites
+## :octicons-download-16: Prerequisites
To use IREE's Python bindings, you will first need to install
[Python 3](https://www.python.org/downloads/) and
@@ -63,14 +63,12 @@
## Installing IREE packages
-### Prebuilt packages
+### :octicons-package-16: Prebuilt packages
-Stable release packages are published to
-[PyPI](https://pypi.org/user/google-iree-pypi-deploy/).
+=== "Stable releases"
-=== "Minimal"
-
- To install just the core IREE packages:
+ Stable release packages are
+ [published to PyPI](https://pypi.org/user/google-iree-pypi-deploy/).
``` shell
python -m pip install \
@@ -78,29 +76,20 @@
iree-runtime
```
-=== "All packages"
+=== ":material-alert: Nightly releases"
- To install IREE packages with tools for all frontends:
+ Nightly releases are published on
+ [GitHub releases](https://github.com/openxla/iree/releases).
``` shell
python -m pip install \
+ --find-links https://openxla.github.io/iree/pip-release-links.html \
+ --upgrade \
iree-compiler \
- iree-runtime \
- iree-tools-tf \
- iree-tools-tflite
+ iree-runtime
```
-!!! Tip "Tip - Nightly releases"
-
- Unstable packages are also published nightly on
- [GitHub releases](https://github.com/openxla/iree/releases). To use these,
- run `pip install` with this option:
-
- ```
- --find-links https://openxla.github.io/iree/pip-release-links.html
- ```
-
-### Building from source
+### :material-hammer-wrench: Building from source
See [Building Python bindings](../../building-from-source/getting-started/#python-bindings)
page for instructions for building from source.
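+
+However the packages were installed, a quick smoke test can confirm that they
+import cleanly (the `iree.compiler` and `iree.runtime` module names below are
+the ones these packages are expected to publish):
+
+``` shell
+# Verify that both packages import without errors
+python -c "import iree.compiler; import iree.runtime; print('IREE bindings OK')"
+```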
diff --git a/docs/website/docs/getting-started/index.md b/docs/website/docs/getting-started/index.md
index 3e2c46e..e0a5f82 100644
--- a/docs/website/docs/getting-started/index.md
+++ b/docs/website/docs/getting-started/index.md
@@ -1,63 +1,52 @@
# Getting Started Guide
-## Setup
+IREE supports popular machine learning frameworks using the same underlying
+technology.
-Use the following command for the default installation, or check out the
-comprehensive installation [guide](../bindings/python.md) if your needs are
-more complex.
+## :octicons-list-unordered-16: Supported frameworks
-``` bash
-python -m pip install \
- iree-compiler \
- iree-runtime \
- iree-tools-tf \
- iree-tools-tflite
-```
+See end-to-end examples of how to use each framework with IREE:
-## Supported frameworks
-
-See end-to-end examples of how to execute a variety models on IREE. This covers
-the import, compilation, and execution of the provided model.
-
-* [TensorFlow](./tensorflow.md)
-* [TensorFlow Lite](./tflite.md)
+* [TensorFlow](./tensorflow.md) and [TensorFlow Lite](./tflite.md)
* [JAX](./jax.md)
* [PyTorch](./pytorch.md)
Importing from other frameworks is planned - stay tuned!
-## Samples
+## :octicons-code-16: Samples
Check out the samples in IREE's
-[samples/colab/ directory](https://github.com/openxla/iree/tree/main/colab),
-as well as the [iree-samples repository](https://github.com/iree-org/iree-samples),
-which contains workflow comparisons across frameworks.
+[`samples/` directory](https://github.com/openxla/iree/tree/main/samples),
+as well as the
+[iree-samples repository](https://github.com/iree-org/iree-samples).
-## Import
+## :octicons-package-dependents-16: Export/Import
-Importing models takes known file types and imports into a form that the core
-IREE compiler is able to ingest. This import process is specific to each
-frontend and typically involves a number of stages:
+Each machine learning framework has some "export" mechanism that snapshots the
+structure and data in your program. These exported programs can then be
+"imported" into IREE's compiler by using either a stable import format or one of
+IREE's importer tools. This export/import process is specific to each frontend
+and typically involves a number of stages:
-* Load the source format
-* Legalize operations specific each specific frontend to legal IR
-* Validate only IREE compatible operations remain
-* Write the remaining IR to a file
+1. Capture/trace/freeze the ML model into a graph
+2. Write that graph to an interchange format (e.g. SavedModel, TorchScript)
+3. Load the saved program into an import tool and convert to MLIR
+4. Legalize the graph's operations so only IREE-compatible operations remain
+5. Write the imported MLIR to a file
-This fully legalized form can then be compiled without dependencies on the
-source model language.
+This fully imported form can then be compiled independently of the source
+language and framework.
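+
+For example, a TensorFlow SavedModel might be imported like this (the
+`iree-import-tf` tool ships with the `iree-tools-tf` package; the paths and
+flags here are illustrative and may vary by release):
+
+``` shell
+# Steps 1-2 happen in TensorFlow, e.g. tf.saved_model.save(model, "/tmp/model")
+# Steps 3-5: convert the SavedModel to IREE-compatible MLIR and write it out
+iree-import-tf /tmp/model -o /tmp/model.mlir
+```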
-## Compilation
+## :octicons-gear-16: Compilation
During compilation we load an MLIR file and compile for the specified set of
backends (CPU, GPU, etc). Each of these backends creates custom native code to
-execute on the target device. Once compiled, the resulting bytecode is
-exported to an IREE bytecode file that can be executed on the specified devices.
+execute on the target device. Once compiled, the resulting artifact can be
+executed on the specified devices using IREE's runtime.
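+
+For example, compiling the imported MLIR for the local CPU might look like
+this (flag spellings follow current `iree-compile` usage and may change):
+
+``` shell
+# Compile MLIR to an IREE module targeting the CPU backend
+iree-compile \
+    --iree-hal-target-backends=llvm-cpu \
+    /tmp/model.mlir \
+    -o /tmp/model.vmfb
+```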
-## Execution
+## :octicons-rocket-16: Execution
The final stage is executing the now compiled module. This involves selecting
what compute devices should be used, loading the module, and executing the
-module with the intended inputs. For testing, IREE includes a Python API.
-However, on mobile and embedded devices you will want to use the
-[C API](../deployment-configurations/index.md).
+module with the intended inputs. IREE provides several
+[language bindings](../bindings/index.md) for its runtime API.
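+
+For example, a compiled module can be run from the command line with
+`iree-run-module` (the function name and input here are placeholders, and
+exact flags may vary by release):
+
+``` shell
+# Execute one function from the compiled module on the local CPU
+iree-run-module \
+    --device=local-task \
+    --module=/tmp/model.vmfb \
+    --function=predict \
+    --input="1x4xf32=[1.0 2.0 3.0 4.0]"
+```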
diff --git a/docs/website/docs/index.md b/docs/website/docs/index.md
index 25d4fbc..22d45f2 100644
--- a/docs/website/docs/index.md
+++ b/docs/website/docs/index.md
@@ -65,36 +65,34 @@
## Workflow overview
-Specific examples outlining IREE's workflow can be found in the
-[User Getting Started Guide](./getting-started/index.md). Using IREE involves
-the following general steps:
+Using IREE involves the following general steps:
1. **Import your model**
Develop your program using one of the
- [supported frameworks](./getting-started/#supported-frameworks), then run
- your model using one of IREE's import tools.
+ [supported frameworks](./getting-started/#supported-frameworks), then
+  import it into IREE
2. **Select your [deployment configuration](./deployment-configurations/)**
- Identify your target platform, accelerator(s), and other constraints.
+ Identify your target platform, accelerator(s), and other constraints
3. **Compile your model**
- Compile through IREE, picking compilation targets based on your
- deployment configuration.
+ Compile through IREE, picking settings based on your deployment
+ configuration
4. **Run your model**
- Use IREE's runtime components to execute your compiled model.
+ Use IREE's runtime components to execute your compiled model
### Importing models from ML frameworks
IREE supports importing models from a growing list of ML frameworks and model
formats:
-* [TensorFlow](getting-started/tensorflow.md)
-* [TensorFlow Lite](getting-started/tflite.md)
+* [TensorFlow](getting-started/tensorflow.md) and
+ [TensorFlow Lite](getting-started/tflite.md)
* [JAX](getting-started/jax.md)
* [PyTorch](getting-started/pytorch.md)