Added Getting Started guide for TFLite (#8136)

Some initial doc changes for a Getting Started guide. The goal is to prioritize "Getting Started" for users and include
end-to-end examples, with explanations of what each stage does.
diff --git a/docs/website/docs/blog/2021-07-19-tflite-tosa.md b/docs/website/docs/blog/2021-07-19-tflite-tosa.md
index 90929b6..d1c26a8 100644
--- a/docs/website/docs/blog/2021-07-19-tflite-tosa.md
+++ b/docs/website/docs/blog/2021-07-19-tflite-tosa.md
@@ -41,4 +41,4 @@
 [Android Java app](https://github.com/not-jenni/iree-android-tflite-demo) that
 was forked from an existing TFLite demo app, swapping out the TFLite library
 for our own AAR.  More information on IREE’s TFLite frontend is available
-[here](../ml-frameworks/tensorflow-lite.md).
+[here](../getting-started/tflite.md).
diff --git a/docs/website/docs/deployment-configurations/cpu-dylib.md b/docs/website/docs/deployment-configurations/cpu-dylib.md
index e6bc143..615de7c 100644
--- a/docs/website/docs/deployment-configurations/cpu-dylib.md
+++ b/docs/website/docs/deployment-configurations/cpu-dylib.md
@@ -135,5 +135,5 @@
 [pypi]: https://pypi.org/user/google-iree-pypi-deploy/
 [python-bindings]: ../bindings/python.md
 [tf-hub-mobilenetv2]: https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification
-[tf-import]: ../ml-frameworks/tensorflow.md
-[tflite-import]: ../ml-frameworks/tensorflow-lite.md
+[tf-import]: ../getting-started/tensorflow.md
+[tflite-import]: ../getting-started/tflite.md
diff --git a/docs/website/docs/deployment-configurations/gpu-cuda-rocm.md b/docs/website/docs/deployment-configurations/gpu-cuda-rocm.md
index b630b5e..9a34091 100644
--- a/docs/website/docs/deployment-configurations/gpu-cuda-rocm.md
+++ b/docs/website/docs/deployment-configurations/gpu-cuda-rocm.md
@@ -196,7 +196,7 @@
 [pypi]: https://pypi.org/user/google-iree-pypi-deploy/
 [python-bindings]: ../bindings/python.md
 [tf-hub-mobilenetv2]: https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification
-[tf-import]: ../ml-frameworks/tensorflow.md
-[tflite-import]: ../ml-frameworks/tensorflow-lite.md
+[tf-import]: ../getting-started/tensorflow.md
+[tflite-import]: ../getting-started/tflite.md
 [cuda-toolkit]: https://developer.nvidia.com/cuda-downloads
 [rocm-toolkit]: https://rocmdocs.amd.com/en/latest/Installation_Guide/Installation_new.html
diff --git a/docs/website/docs/deployment-configurations/gpu-vulkan.md b/docs/website/docs/deployment-configurations/gpu-vulkan.md
index 07cfba5..041d0ea 100644
--- a/docs/website/docs/deployment-configurations/gpu-vulkan.md
+++ b/docs/website/docs/deployment-configurations/gpu-vulkan.md
@@ -203,7 +203,7 @@
 [python-bindings]: ../bindings/python.md
 [spirv]: https://www.khronos.org/registry/spir-v/
 [tf-hub-mobilenetv2]: https://tfhub.dev/google/tf2-preview/mobilenet_v2/classification
-[tf-import]: ../ml-frameworks/tensorflow.md
-[tflite-import]: ../ml-frameworks/tensorflow-lite.md
+[tf-import]: ../getting-started/tensorflow.md
+[tflite-import]: ../getting-started/tflite.md
 [vulkan]: https://www.khronos.org/vulkan/
 [vulkan-sdk]: https://vulkan.lunarg.com/sdk/home/
diff --git a/docs/website/docs/getting-started/index.md b/docs/website/docs/getting-started/index.md
new file mode 100644
index 0000000..2429f1e
--- /dev/null
+++ b/docs/website/docs/getting-started/index.md
@@ -0,0 +1,60 @@
+# Getting Started Guide
+
+## Setup
+
+Use the following command for the default installation, or check out the
+comprehensive installation [guide](../bindings/python.md) if your needs are more complex.
+
+```shell
+python -m pip install \
+  iree-compiler \
+  iree-runtime \
+  iree-tools-tf \
+  iree-tools-tflite \
+  iree-tools-xla
+```
+
+## Supported frameworks
+
+See end-to-end examples of how to execute a variety of models with IREE. These
+examples cover the import, compilation, and execution of the provided model.
+
+* [TensorFlow](./tensorflow.md)
+* [TensorFlow Lite](./tflite.md)
+* [JAX](./jax.md)
+
+Importing from PyTorch and other frameworks is planned - stay tuned!
+
+## Samples
+
+Check out the samples in IREE's [colab/ directory](https://github.com/google/iree/tree/main/colab),
+as well as the [iree-samples repository](https://github.com/google/iree-samples),
+which contains workflow comparisons across frameworks.
+
+## Import
+
+Importing a model converts it from its source format into a form that the core
+IREE compiler is able to ingest. This import process is specific to each
+frontend and typically involves a number of stages:
+
+* Load the source format
+* Legalize operations specific to each frontend into IREE-compatible IR
+* Validate that only IREE-compatible operations remain
+* Write the remaining IR to a file
+
+This fully legalized form can then be compiled without any dependency on the
+source model format.
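+
+For example, a TFLite flatbuffer can be imported to an IREE-compatible TOSA
+MLIR file with the `iree-import-tflite` tool (the file names here are just
+placeholders):
+
+```shell
+# Import the TFLite flatbuffer, producing MLIR the IREE compiler can ingest
+iree-import-tflite model.tflite -o tosa.mlir
+```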
+
+## Compilation
+
+During compilation we load the imported MLIR file and compile it for the
+specified set of target backends (CPU, GPU, etc.). Each backend generates
+native code tailored to its target device. The result is serialized to an IREE
+bytecode module file that can be executed on the specified devices.
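+
+For instance, the TOSA MLIR produced above can be compiled for the CPU backend
+with `iree-translate` (other `--iree-hal-target-backends` values can be
+substituted to target different devices):
+
+```shell
+# Compile the imported MLIR into an IREE bytecode module for the CPU backend
+iree-translate \
+    --iree-mlir-to-vm-bytecode-module \
+    --iree-input-type=tosa \
+    --iree-hal-target-backends=dylib-llvm-aot \
+    tosa.mlir \
+    -o module.vmfb
+```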
+
+## Execution
+
+The final stage is executing the compiled module. This involves selecting
+which compute devices to use, loading the module, and invoking it with the
+intended inputs. On mobile and embedded devices you will want to use the
+[C API](../deployment-configurations/index.md); for testing, IREE also
+includes a Python API.
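+
+A minimal sketch using the Python API (this assumes the `module.vmfb` produced
+above was compiled for the `dylib-llvm-aot` backend and exports a function
+named `main`; the placeholder input shape depends on the model):
+
+```python
+import numpy
+import iree.runtime as iree_rt
+
+# Configure the runtime for the dylib (CPU) driver and load the compiled module
+config = iree_rt.Config("dylib")
+context = iree_rt.SystemContext(config=config)
+with open("module.vmfb", "rb") as f:
+  vm_module = iree_rt.VmModule.from_flatbuffer(f.read())
+  context.add_vm_module(vm_module)
+
+# Invoke the exported "main" function with the intended inputs
+inputs = numpy.zeros((1, 192, 192, 3), dtype=numpy.uint8)  # placeholder input
+results = context.modules.module["main"](inputs)
+print(results)
+```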
diff --git a/docs/website/docs/ml-frameworks/jax.md b/docs/website/docs/getting-started/jax.md
similarity index 100%
rename from docs/website/docs/ml-frameworks/jax.md
rename to docs/website/docs/getting-started/jax.md
diff --git a/docs/website/docs/ml-frameworks/tensorflow.md b/docs/website/docs/getting-started/tensorflow.md
similarity index 100%
rename from docs/website/docs/ml-frameworks/tensorflow.md
rename to docs/website/docs/getting-started/tensorflow.md
diff --git a/docs/website/docs/getting-started/tflite-cmd.md b/docs/website/docs/getting-started/tflite-cmd.md
new file mode 100644
index 0000000..a9bdb89
--- /dev/null
+++ b/docs/website/docs/getting-started/tflite-cmd.md
@@ -0,0 +1,33 @@
+# TFLite via Command Line
+
+IREE's tooling is divided into two components: import and compilation.
+
+1. The import tool converts the TFLite flatbuffer to an IREE-compatible form
+containing a combination of TOSA and IREE operations, validating that only
+IREE-compatible operations remain.
+2. The compilation stage generates a bytecode module for a list of target
+backends, which IREE can then execute.
+
+These two stages can be completed entirely via the command line.
+
+```shell
+WORKDIR="/tmp/workdir"
+TFLITE_URL="https://storage.googleapis.com/iree-model-artifacts/tflite-integration-tests/posenet_i8.tflite"
+TFLITE_PATH=${WORKDIR}/model.tflite
+IMPORT_PATH=${WORKDIR}/tosa.mlir
+MODULE_PATH=${WORKDIR}/module.vmfb
+
+# Fetch the sample model
+wget ${TFLITE_URL} -O ${TFLITE_PATH}
+
+# Import the sample model to an IREE compatible form
+iree-import-tflite ${TFLITE_PATH} -o ${IMPORT_PATH}
+
+# Compile for the CPU backend
+iree-translate \
+    --iree-mlir-to-vm-bytecode-module \
+    --iree-input-type=tosa \
+    --iree-hal-target-backends=dylib-llvm-aot \
+    ${IMPORT_PATH} \
+    -o ${MODULE_PATH}
+```
diff --git a/docs/website/docs/getting-started/tflite-python.md b/docs/website/docs/getting-started/tflite-python.md
new file mode 100644
index 0000000..a46ef10
--- /dev/null
+++ b/docs/website/docs/getting-started/tflite-python.md
@@ -0,0 +1,79 @@
+# TFLite via Python
+
+The example below demonstrates downloading, compiling, and executing a TFLite
+model using the Python API. This includes some initial setup to declare global
+variables, download the sample model, and download the sample inputs.
+
+First, declare the paths for the sample artifacts and import all required
+libraries. The default setup uses the CPU backend as the only target; this can
+be reconfigured to select alternative targets.
+
+```python
+import iree.compiler.tflite as iree_tflite_compile
+import iree.runtime as iree_rt
+import numpy
+import os
+import urllib.request
+
+from PIL import Image
+
+workdir = "/tmp/workdir"
+os.makedirs(workdir, exist_ok=True)
+
+tfliteFile = "/".join([workdir, "model.tflite"])
+jpgFile = "/".join([workdir, "input.jpg"])
+tfliteIR = "/".join([workdir, "tflite.mlir"])
+tosaIR = "/".join([workdir, "tosa.mlir"])
+bytecodeModule = "/".join([workdir, "iree.vmfb"])
+
+backends = ["dylib-llvm-aot"]
+config = "dylib"
+```
+
+The TFLite sample model and input are downloaded locally.
+
+```python
+tfliteUrl = "https://storage.googleapis.com/iree-model-artifacts/tflite-integration-tests/posenet_i8.tflite"
+jpgUrl = "https://storage.googleapis.com/iree-model-artifacts/tflite-integration-tests/posenet_i8_input.jpg"
+
+urllib.request.urlretrieve(tfliteUrl, tfliteFile)
+urllib.request.urlretrieve(jpgUrl, jpgFile)
+```
+
+Once downloaded, we can compile the model for the selected backends. Both the
+TFLite and TOSA representations of the model are saved for debugging purposes;
+this is optional and can be omitted.
+
+```python
+iree_tflite_compile.compile_file(
+  tfliteFile,
+  input_type="tosa",
+  output_file=bytecodeModule,
+  save_temp_tfl_input=tfliteIR,
+  save_temp_iree_input=tosaIR,
+  target_backends=backends,
+  import_only=False)
+```
+
+After compilation completes, we construct a VmModule from the compiled
+bytecode and add it to a runtime context configured for the dylib driver.
+
+```python
+config = iree_rt.Config("dylib")
+context = iree_rt.SystemContext(config=config)
+with open(bytecodeModule, 'rb') as f:
+  vm_module = iree_rt.VmModule.from_flatbuffer(f.read())
+  context.add_vm_module(vm_module)
+```
+
+Finally, the IREE module is loaded and ready for execution. Here we load the
+sample image, reshape it to the expected input size, and execute the module.
+By default, TFLite models include a single function named `main`. The final
+results are printed.
+
+```python
+im = numpy.array(Image.open(jpgFile).resize((192, 192))).reshape((1, 192, 192, 3))
+args = [im]
+
+invoke = context.modules.module["main"]
+iree_results = invoke(*args)
+print(iree_results)
+```
diff --git a/docs/website/docs/getting-started/tflite.md b/docs/website/docs/getting-started/tflite.md
new file mode 100644
index 0000000..c276350
--- /dev/null
+++ b/docs/website/docs/getting-started/tflite.md
@@ -0,0 +1,54 @@
+# TFLite Integration
+
+IREE supports compiling and running TensorFlow Lite programs stored as [TFLite
+flatbuffers](https://www.tensorflow.org/lite/guide). These files can be
+imported into an IREE-compatible format and then compiled for a variety of
+target backends.
+
+## Prerequisites
+
+Install TensorFlow-Lite specific dependencies using pip:
+
+```shell
+python -m pip install \
+  iree-compiler \
+  iree-runtime \
+  iree-tools-tflite
+```
+
+- [Command Line](./tflite-cmd.md)
+- [Python API](./tflite-python.md)
+
+## Troubleshooting
+
+Failures during the import step usually indicate a failure to lower from
+TensorFlow Lite's operations to TOSA, the intermediate representation used by
+IREE. Many TensorFlow Lite operations are not fully supported, particularly
+those that use dynamic shapes. If you encounter such a failure, please file an
+issue in IREE's TFLite model support
+[project](https://github.com/google/iree/projects/42).
+
+
+## Additional Samples
+
+* The
+[tflitehub folder](https://github.com/google/iree-samples/tree/main/tflitehub)
+in the [iree-samples repository](https://github.com/google/iree-samples)
+contains test scripts to compile, run, and compare various TensorFlow Lite
+models sourced from [TensorFlow Hub](https://tfhub.dev/).
+
+* An example smoke test of the
+[TensorFlow Lite C API](https://github.com/google/iree/tree/main/bindings/tflite)
+is available
+[here](https://github.com/google/iree/blob/main/bindings/tflite/smoke_test.cc).
+
+| Colab notebooks |  |
+| -- | -- |
+Text classification with TFLite and IREE | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/iree/blob/main/colab/tflite_text_classification.ipynb)
+
+!!! todo
+
+    [Issue#3954](https://github.com/google/iree/issues/3954): Add documentation
+    for an Android demo using the
+    [Java TFLite bindings](https://github.com/google/iree/tree/main/bindings/tflite/java),
+    once it is complete at
+    [not-jenni/iree-android-tflite-demo](https://github.com/not-jenni/iree-android-tflite-demo).
+
diff --git a/docs/website/docs/index.md b/docs/website/docs/index.md
index 60e9497..5523826 100644
--- a/docs/website/docs/index.md
+++ b/docs/website/docs/index.md
@@ -64,12 +64,14 @@
 
 ## Workflow overview
 
-Using IREE involves these general steps:
+Specific examples outlining IREE's workflow can be found in the
+[Getting Started Guide](./getting-started/index.md). Using IREE involves the
+following general steps:
 
 1. **Import your model**
 
-    Work in your [framework of choice](./ml-frameworks), then run your model
-    through one of IREE's import tools.
+    Develop your program using one of the
+    [supported frameworks](./getting-started/#supported-frameworks), then run
+    your model through one of IREE's import tools.
 
 2. **Select your [deployment configuration](./deployment-configurations)**
 
@@ -89,9 +91,9 @@
 IREE supports importing models from a growing list of ML frameworks and model
 formats:
 
-* [TensorFlow](ml-frameworks/tensorflow.md)
-* [TensorFlow Lite](ml-frameworks/tensorflow-lite.md)
-* [JAX](ml-frameworks/jax.md)
+* [TensorFlow](getting-started/tensorflow.md)
+* [TensorFlow Lite](getting-started/tflite.md)
+* [JAX](getting-started/jax.md)
 
 ### Selecting deployment configurations
 
diff --git a/docs/website/docs/ml-frameworks/index.md b/docs/website/docs/ml-frameworks/index.md
deleted file mode 100644
index ebdae58..0000000
--- a/docs/website/docs/ml-frameworks/index.md
+++ /dev/null
@@ -1,18 +0,0 @@
-# ML frameworks
-
-## Supported frameworks
-
-IREE supports importing models from
-
-* [TensorFlow](./tensorflow.md)
-* [TensorFlow Lite](./tensorflow-lite.md)
-* [JAX](./jax.md)
-
-Importing from PyTorch and other frameworks is planned - stay tuned!
-
-## Samples
-
-Check out the samples in IREE's
-[colab/ directory](https://github.com/google/iree/tree/main/colab) and the
-[iree-samples repository](https://github.com/google/iree-samples) for examples
-and workflow comparisons across frameworks.
diff --git a/docs/website/docs/ml-frameworks/tensorflow-lite.md b/docs/website/docs/ml-frameworks/tensorflow-lite.md
deleted file mode 100644
index 21c1c3b..0000000
--- a/docs/website/docs/ml-frameworks/tensorflow-lite.md
+++ /dev/null
@@ -1,96 +0,0 @@
-# TensorFlow Lite Integration
-
-IREE supports compiling and running pre-trained TensorFlow Lite (TFLite)
-models.  It converts a model to
-[TOSA MLIR](https://mlir.llvm.org/docs/Dialects/TOSA/), then compiles it into a
-VM module.
-
-## Prerequisites
-
-Download a pre-trained TFLite model from the list of
-[hosted models](https://www.tensorflow.org/lite/guide/hosted_models), or use the
-[TensorFlow Lite converter](https://www.tensorflow.org/lite/convert) to convert
-a TensorFlow model to a .tflite flatbuffer.
-
-Install IREE pip packages, either from pip or by
-[building from source](../building-from-source/python-bindings-and-importers.md):
-
-```shell
-python -m pip install \
-  iree-compiler \
-  iree-runtime \
-  iree-tools-tflite
-```
-
-!!! warning
-    The TensorFlow Lite package is currently only available on Linux and macOS.
-    It is not available on Windows yet (see
-    [this issue](https://github.com/google/iree/issues/6417)).
-
-## Importing models
-
-Fist, import the TFLite model to TOSA MLIR:
-
-```shell
-iree-import-tflite \
-  sample.tflite \
-  -o sample.mlir
-```
-
-Next, compile the TOSA MLIR to a VM flatbuffer, using either the command line
-tools or the [Python API](https://google.github.io/iree/bindings/python/):
-
-#### Using the command-line tool
-
-``` shell
-iree-translate \
-  --iree-mlir-to-vm-bytecode-module \
-  --iree-input-type=tosa \
-  --iree-hal-target-backends=vmvx \
-  sample.mlir \
-  -o sample.vmfb
-```
-
-#### Using the python API
-
-``` python
-from iree.compiler import compile_str
-with open('sample.mlir') as sample_tosa_mlir:
-  compiled_flatbuffer = compile_str(sample_tosa_mlir.read(),
-    input_type="tosa",
-    target_backends=["vmvx"],
-    extra_args=["--iree-native-bindings-support=false",
-      "--iree-tflite-bindings-support"])
-```
-
-!!! todo
-
-    [Issue#5462](https://github.com/google/iree/issues/5462): Link to
-    TensorFlow Lite bindings documentation once it has been written.
-
-The flatbuffer can then be loaded to a VM module and run through IREE's runtime.
-
-## Samples
-
-* The
-[tflitehub folder](https://github.com/google/iree-samples/tree/main/tflitehub)
-in the [iree-samples repository](https://github.com/google/iree-samples)
-contains test scripts to compile, run, and compare various TensorFlow Lite
-models sourced from [TensorFlow Hub](https://tfhub.dev/).
-
-* An example smoke test of the
-[TensorFlow Lite C API](https://github.com/google/iree/tree/main/bindings/tflite)
-is available
-[here](https://github.com/google/iree/blob/main/bindings/tflite/smoke_test.cc).
-
-| Colab notebooks |  |
-| -- | -- |
-Text classification with TFLite and IREE | [![Open in Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/iree/blob/main/colab/tflite_text_classification.ipynb)
-
-!!! todo
-
-    [Issue#3954](https://github.com/google/iree/issues/3954): Add documentation
-    for an Android demo using the
-    [Java TFLite bindings](https://github.com/google/iree/tree/main/bindings/tflite/java),
-    once it is complete at
-    [not-jenni/iree-android-tflite-demo](https://github.com/not-jenni/iree-android-tflite-demo).
diff --git a/docs/website/mkdocs.yml b/docs/website/mkdocs.yml
index 3c9e1fc..c7da8e5 100644
--- a/docs/website/mkdocs.yml
+++ b/docs/website/mkdocs.yml
@@ -94,11 +94,11 @@
 # Note: may include external links and titles are optional for internal links
 nav:
   - Home: 'index.md'
-  - 'ML frameworks':
-      - 'ml-frameworks/index.md'
-      - TensorFlow: 'ml-frameworks/tensorflow.md'
-      - TensorFlow Lite: 'ml-frameworks/tensorflow-lite.md'
-      - JAX: 'ml-frameworks/jax.md'
+  - 'Getting Started':
+      - 'getting-started/index.md'
+      - TensorFlow: 'getting-started/tensorflow.md'
+      - TensorFlow Lite: 'getting-started/tflite.md'
+      - JAX: 'getting-started/jax.md'
   - 'Deployment configurations':
       - 'deployment-configurations/index.md'
       - CPU - Dylib: 'deployment-configurations/cpu-dylib.md'