| commit | 4395a123bfc709ff2c5203b5c9407a43f7ee5c26 | |
|---|---|---|
| author | Geoffrey Martin-Noble &lt;gcmn@google.com&gt; | Thu Nov 04 07:36:47 2021 -0700 |
| committer | GitHub &lt;noreply@github.com&gt; | Thu Nov 04 10:36:47 2021 -0400 |
| tree | eba33268b467e8f465b5ec80af26a4be05a74904 | |
| parent | fe23053138277ed5b3fda6f7474d2fd665e4c65c | |
Enable benchmarks from TFLite flatbuffers on benchmark pipeline (#7494)

This changes the three benchmarks derived from TFLite flatbuffers to use the original flatbuffer source rather than a pre-compiled MLIR file. The benchmarks are updated in place since the original source is exactly the same. Using flatbuffers avoids a few issues:

1. The MLIR IR format does not offer any backward compatibility guarantees. When MLIR syntax changes in a backward-incompatible way, the benchmark pipeline breaks.
2. Updates to the frontend compilation are not captured in real time. They are only picked up when the MLIR files are regenerated to fix syntax changes (point 1 above).

This adds a dependency on the TFLite frontend to benchmark generation. In the automated pipeline, the additional Bazel compilation uses the remote cache and should therefore usually be fast (it shares the cache with the normal CI run). For local generation, users will have to either build `iree-import-tflite` from source or fetch it as part of a release.

Given that we would like to move all benchmarks to originate from stable formats, I've made trying to run a TFLite benchmark without `IREE_BUILD_TFLITE_COMPILER` set an error rather than a warning that skips the benchmark. This explicit check just fails slightly earlier and more clearly than the subsequent `TARGET_FILE` lookup for `iree-import-tflite` would.
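As a rough illustration of the guard described above, here is a minimal CMake sketch. Only the option name `IREE_BUILD_TFLITE_COMPILER` and the tool name `iree-import-tflite` come from the commit message; the surrounding function, its arguments, and the output naming are hypothetical and not the actual benchmark-suite rules.

```cmake
# Hypothetical benchmark-suite helper; only the option and tool names are
# taken from the commit message, the rest is an illustrative sketch.
function(iree_tflite_benchmark_import)
  cmake_parse_arguments(_RULE "" "SOURCE" "" ${ARGN})

  # Fail early and clearly if the TFLite importer is not being built, rather
  # than letting the later $<TARGET_FILE:...> lookup fail less legibly.
  if(NOT IREE_BUILD_TFLITE_COMPILER)
    message(FATAL_ERROR
      "Benchmarks from TFLite flatbuffers require IREE_BUILD_TFLITE_COMPILER=ON "
      "(or an iree-import-tflite binary obtained from a release).")
  endif()

  # Import the .tflite flatbuffer to MLIR as part of benchmark generation.
  add_custom_command(
    OUTPUT "${_RULE_SOURCE}.mlir"
    COMMAND "$<TARGET_FILE:iree-import-tflite>"
            "${_RULE_SOURCE}" -o "${_RULE_SOURCE}.mlir"
    DEPENDS iree-import-tflite "${_RULE_SOURCE}"
  )
endfunction()
```

For local generation without `IREE_BUILD_TFLITE_COMPILER`, the release-provided `iree-import-tflite` binary can be invoked directly on the `.tflite` file to produce the MLIR input instead.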
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome any kind of feedback through any of our communication channels!
See our website for more information.
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.