| commit | 9b946d7ffaf08f7f6130e6aacb3692e53456eb45 | |
|---|---|---|
| author | MaheshRavishankar <1663364+MaheshRavishankar@users.noreply.github.com> | Fri Jun 04 11:23:22 2021 -0700 |
| committer | GitHub <noreply@github.com> | Fri Jun 04 11:23:22 2021 -0700 |
| tree | 594ced673770d29879a8970cc672e2264e539428 | |
| parent | dd61184cc87e242c38ca0141551e6d2a56d568c6 | |
Drop usage of LLVMCodegenOptions. (#6084)

The use of LLVM-style command-line flags makes it harder to print reproducers or produce a proper string representation of the pass pipeline. It is better to use MLIR wrappers around these options; with dynamic pass pipelines, there is now a natural place to anchor them. Some of the options have been moved into specific passes; for example, the unfusedFMAOps option is now part of the ConvertToLLVM pass. Some unnecessary tests are deleted.

Also enable use of the vectorization pipeline on the VMVX backend. Since the VMVX backend cannot handle vectors or vector operations, add a flag to avoid vectorization while still performing the tiling for the different cache levels.

Fixes #5925
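To illustrate the direction described in the commit message, below is a minimal sketch of how an option such as unfusedFMAOps can be attached to the pass itself via MLIR's `Pass::Option` mechanism instead of a global `llvm::cl::opt`. The class name, option spelling, and pipeline string here are illustrative assumptions, not IREE's actual code.

```cpp
// Minimal sketch (not IREE's actual implementation): declaring a per-pass
// option with MLIR's Pass::Option instead of a global LLVM command-line flag.
// Because the option belongs to the pass, it shows up in the textual pipeline
// representation (e.g. "convert-to-llvm{unfuse-fma-ops=true}") and in
// reproducers, and can differ between pipeline instances.
#include "mlir/IR/BuiltinOps.h"
#include "mlir/Pass/Pass.h"

namespace {
struct ConvertToLLVMPass
    : public mlir::PassWrapper<ConvertToLLVMPass,
                               mlir::OperationPass<mlir::ModuleOp>> {
  // Option state lives on the pass instance rather than in process-wide
  // globals, so printing the pass pipeline captures its value.
  Option<bool> unfuseFMAOps{
      *this, "unfuse-fma-ops",
      llvm::cl::desc("Rewrite fused multiply-add ops into separate mul/add"),
      llvm::cl::init(false)};

  void runOnOperation() override {
    if (unfuseFMAOps) {
      // ... apply the FMA-unfusing rewrite patterns here ...
    }
    // ... rest of the conversion to the LLVM dialect ...
  }
};
} // namespace
```

With this pattern, the same option can be set when building a pipeline programmatically or through a `--pass-pipeline=` string, which is what makes reproducers and string representations of the pipeline straightforward compared to LLVM-style global flags.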
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome any kind of feedback through any of our communication channels!
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.