Adding `--iree-hal-preprocess-executables-with=` option. (#12313)

The new `--iree-hal-preprocess-executables-with=` flag lets an external
tool or pass pipeline do whatever it wants to a hal.executable before
translation. When using an external
tool the whole executable is passed on stdin and is expected to be
returned on stdout. Each executable is processed independently, which
allows for parallelism, though it also makes it possible to fork-bomb
the machine; we may want to limit the maximum number of concurrent
invocations in the future (a semaphore around the process launch). The
tool approach will still be
useful for simple cases of microkernels and bring-your-own-compiler even
once we do get a real plugin mechanism with shared libraries that can
register hooks at all stages of lowering.

There are two variants of the flag:

`--iree-hal-preprocess-executables-with="tool --args"`:
shell-executes the given command line with the hal.executable passed on
stdin and the result read from stdout. This allows users to implement
their preprocessing in whatever language they want (Python, etc.), use
their own pre-built tools instead of building all of iree-compile, and
build out-of-tree binaries.
The process boundary also provides a layer of insulation against bad
behavior.
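As a sketch of what such a tool can look like (hypothetical; the file name `my_preprocessor.py` and the pass-through behavior are illustrative only), a Python tool just needs to read the module text from stdin and write the result to stdout:

```python
#!/usr/bin/env python
# my_preprocessor.py (hypothetical): iree-compile pipes each
# hal.executable's MLIR in on stdin and reads the (possibly modified)
# module back from stdout.
import sys

def preprocess(module_text):
    # A real tool would parse and rewrite the IR here; this placeholder
    # just passes the module through unchanged.
    return module_text

if __name__ == "__main__":
    sys.stdout.write(preprocess(sys.stdin.read()))
```

Such a tool would then be hooked in with `--iree-hal-preprocess-executables-with="python my_preprocessor.py"`.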

`--iree-hal-preprocess-executables-with=builtin.module(...)`:
standard MLIR pass pipeline syntax executed within iree-opt/iree-compile
as a dynamic pass pipeline. This should parallelize well, and when the
passes are built into iree-opt/iree-compile (or registered with it via a
future plugin mechanism) they'll be picked up automatically.

A simple test demonstrates using iree-opt as a tool with a custom
pass/pipeline. The intent is that users can build their own opt tools
out of tree, including their own dialects, passes, patterns, etc., and
link in just the IREE dialects needed to parse the ops. The tools aren't
intended to be version-shifted, so no effort is spent on IR
compatibility; a real plugin mechanism can solve that in the future if
needed.

From here a user can build their own iree-opt with their own additional
passes added, build their own whatever-opt with anything they want, etc.
The passes can check the hal.executable.variants for target
configuration and selectively process them to change their workgroup
count calculation function, add executable constants, add objects for
linking, or change the body IR. It's possible to go as far as completely
lowering the executables to their final dialect (LLVM/SPIR-V) such that
the normal translation just skips them. With a bring-your-own-compiler
approach it's possible to fully replace the executable implementation
with an external object (a la the custom_dispatch sample using an
external PTX blob). There are some interactions with the executable
merging we do later on that this may harm, but only CUDA/SPIR-V have
that issue today and it can be fixed in a way compatible with this
technique.

Progress on #12292 (need an example out of tree).
26 files changed