| commit | 42ee29988a228dad8db9ac46a5a5eb496c019c7a | |
|---|---|---|
| author | Hanhan Wang <hanchung@google.com> | Wed May 19 18:18:09 2021 +0000 |
| committer | Copybara Fixup Action <iree-github-actions-bot@google.com> | Wed May 19 18:18:09 2021 +0000 |
| tree | 544229df65a86715487d35c5cfd7cc78c46f48c5 | |
| parent | eac55b6464b328d9264f6bb6a15f5b0fb74a4130 | |
| parent | 367468b8b63ef724cd74dcb145c39b52aaf56af5 | |
Merge main -> google

* 367468b8b Fix tests for shift operations (#5952)
* 577d9d9cc Integrate MLIR-EmitC at iml130/mlir-emitc@0b9d50b (#5936)
* b5f3dda01 Add attribute to capture lowering configuration. (#5842)
* 4b31bd95f Initial Adding ROCM HAL Backend to Experimental (#5943)
* 3bdcea8df Merge google -> main (#5947)
* 90368f18f Remove file left during renaming from LinagToNVVM to LinalgToLLVMGPU (#5940)

PiperOrigin-RevId: 374687876
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler that lowers Machine Learning (ML) models to a unified IR optimized for real-time inference on mobile/edge devices, targeting heterogeneous hardware accelerators. IREE also provides flexible deployment solutions for the compiled ML models.
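As a rough illustration of that flow, the sketch below compiles a tiny MLIR function to an IREE VM FlatBuffer using the Python bindings. It is a minimal sketch rather than an excerpt from this repository: the `iree.compiler` package, its `compile_str` helper, and the `vmvx` target backend are assumptions about the `iree-compiler` pip distribution, and exact dialect spellings and backend names vary across IREE releases.

```python
# Minimal sketch (not from this README): assumes the iree-compiler pip
# package, its iree.compiler.compile_str helper, and the "vmvx" reference
# target backend; names may differ across IREE releases.
import iree.compiler as ireec

# A tiny MLIR module standing in for an ML model (dialect spellings track
# recent MLIR; older releases used `func`/`std` rather than `func.func`/`arith`).
MLIR_MODULE = """
func.func @scale(%value: f32, %factor: f32) -> f32 {
  %0 = arith.mulf %value, %factor : f32
  return %0 : f32
}
"""

# Lower the module to IREE's unified IR and serialize it as a VM FlatBuffer
# (.vmfb) that the IREE runtime can later load on the chosen backend.
flatbuffer = ireec.compile_str(MLIR_MODULE, target_backends=["vmvx"])
with open("scale.vmfb", "wb") as f:
    f.write(flatbuffer)
print(f"Wrote {len(flatbuffer)} bytes of compiled module")
```

The resulting `.vmfb` artifact is what IREE's deployment tooling (for example, the `iree-run-module` tool or the `iree.runtime` Python bindings) loads and executes on the selected device.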
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome any kind of feedback through any of our communication channels!
IREE is licensed under the terms of the Apache 2.0 License. See LICENSE for more information.