| commit | 9ca7563d903d8e6f04e9f456f0cf1cf70546d506 |
|---|---|
| author | Scott Todd <scotttodd@google.com>, Wed Apr 13 10:47:11 2022 -0700 |
| committer | GitHub <noreply@github.com>, Wed Apr 13 10:47:11 2022 -0700 |
| tree | 96ce003e60c7e83ccced9d4425d2ea9aae2e43ff |
| parent | 519815e4bcd51cf690ede07338d73d99241465b7 |
Make benchmark artifact generation a bit more flexible. (#8846)

Here's my use case:

* I'm working on Windows, where we don't support building the TensorFlow integration tools (notably `iree-import-tflite`).
* I want to use the .mlir files that we feed into our benchmarking for local development. More specifically, I want _some_ set of real programs, and the benchmark suite offers a set that we continuously build/test and already have metrics for.

---

Now, a sensible solution would be to extend https://github.com/google/iree/blob/1ddd9170bd6cc2701e2d669754a8fff248739233/build_tools/buildkite/cmake/android/arm64-v8a/benchmark2.yml#L9-L14 to also zip up the imported .mlir files so anyone could download them from Buildkite.

---

I, however, had the more indirect idea to write a Colab notebook ([work in progress here](https://colab.research.google.com/gist/ScottTodd/4b27c6b44f90d239d4676dacddb87895/iree-benchmarks-import-demo.ipynb)) that installs the latest IREE release along with the project source and then builds `iree-benchmark-suites` using the installed tools. That _almost_ worked, except for the few issues that this PR patches over:

* `$<TARGET_FILE:iree::tools::iree-compile>` only works when building with `IREE_BUILD_COMPILER`, but it's also useful to point at already-built tools via `IREE_HOST_BINARY_ROOT`. The `iree_get_executable_path` function handles switching between those modes easily (see the sketch after this message).
* Compiling imported files for the full list of benchmark configurations can be slow. If a developer just wants to import .mlir files, skipping the subsequent steps is useful. I added a new `iree-benchmark-import-models` target that does exactly that.

(I also considered the _even more indirect_ idea of parsing the artifact paths from [benchmarks/TFLite/CMakeLists.txt](https://github.com/google/iree/blob/main/benchmarks/TFLite/CMakeLists.txt) and running `iree-import-tflite` manually, but the `iree-benchmark-suites` target already does that natively :P)

---

I'm open to any workflow that solves my original use case, but these changes seem generally good to have anyway.
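As an aside, here is a minimal CMake sketch of the tool-path-resolution pattern the commit message describes: switch between an in-tree compiler target and pre-built host tools. The helper name `my_get_executable_path`, its signature, and the `bin/` layout under `IREE_HOST_BINARY_ROOT` are assumptions for illustration; IREE's actual `iree_get_executable_path` helper may differ.

```cmake
# Illustrative only: a helper in the spirit of iree_get_executable_path.
# Assumptions (not taken from the commit): the output-variable/tool-name
# signature and that host binaries live under ${IREE_HOST_BINARY_ROOT}/bin/.
function(my_get_executable_path OUTPUT_VAR EXECUTABLE)
  if(IREE_HOST_BINARY_ROOT)
    # Point at an already-built tool, e.g. from an installed release package.
    set("${OUTPUT_VAR}"
        "${IREE_HOST_BINARY_ROOT}/bin/${EXECUTABLE}${CMAKE_EXECUTABLE_SUFFIX}"
        PARENT_SCOPE)
  else()
    # Fall back to the in-tree target; this generator expression is only
    # valid when the tool is part of the current build (IREE_BUILD_COMPILER).
    set("${OUTPUT_VAR}" "$<TARGET_FILE:iree::tools::${EXECUTABLE}>" PARENT_SCOPE)
  endif()
endfunction()

# Example: resolve iree-compile and use it in a custom command.
my_get_executable_path(_COMPILE_TOOL "iree-compile")
add_custom_command(
  OUTPUT "${CMAKE_CURRENT_BINARY_DIR}/module.vmfb"
  COMMAND "${_COMPILE_TOOL}" "${CMAKE_CURRENT_SOURCE_DIR}/input.mlir"
          -o "${CMAKE_CURRENT_BINARY_DIR}/module.vmfb"
  DEPENDS "${CMAKE_CURRENT_SOURCE_DIR}/input.mlir"
)
```

Separately, with the new `iree-benchmark-import-models` target, a developer who only needs the imported .mlir files can build just that target (for example, `cmake --build . --target iree-benchmark-import-models`) rather than compiling every benchmark configuration via `iree-benchmark-suites`.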
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome feedback through any of our communication channels!
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.