commit | 9055c9d1f6342a061a6747ef9b385816b96a0a8f | |
---|---|---|
author | Andrew Woloszyn <andrew.woloszyn@gmail.com> | Thu Jan 09 11:29:59 2025 -0500 |
committer | GitHub <noreply@github.com> | Thu Jan 09 11:29:59 2025 -0500 |
tree | 24989e860da4a7c7e0e68bd23b01874b4bea8476 | |
parent | 82e37d66886275dc638beb995d3066c54705706b | |
[hip] Fix race in the cleanup of queue read operations. (#19645)

Fix a race when looping over data chunks to read: if we dispatch to the secondary thread and the cleanup executes before we reach the next loop iteration (the next line of code), we will read deallocated memory. This change also retains the file for the duration of the read operation; that was not causing any issues, but seems prudent either way.

Signed-off-by: Andrew Woloszyn <andrew.woloszyn@gmail.com>
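The actual fix lives in IREE's HIP HAL driver, which is C code; the Python sketch below only illustrates the general pattern the commit describes, under hypothetical names (`ReadOperation`, `dispatch_all`, etc. are not IREE APIs): snapshot everything the dispatch loop needs before handing work to a secondary thread, and keep the file retained until the final chunk completes so cleanup can never race with the loop or an in-flight read.

```python
# Illustrative sketch only (not IREE's actual code): avoid touching
# per-operation state after a worker thread may have cleaned it up.
import threading
from concurrent.futures import ThreadPoolExecutor


class ReadOperation:
    def __init__(self, file, chunks):
        self.file = file              # retained for the duration of the read
        self.chunks = list(chunks)    # [(offset, length), ...]
        self._lock = threading.Lock()
        self._remaining = len(self.chunks)

    def dispatch_all(self, pool):
        # Copy the chunk list into a local before the loop: once the first
        # chunk is submitted, completion (and cleanup) may run on the worker
        # thread before the next loop iteration, so the loop must not read
        # state that cleanup is allowed to tear down.
        chunks = list(self.chunks)
        for offset, length in chunks:
            pool.submit(self._read_chunk, offset, length)

    def _read_chunk(self, offset, length):
        # Runs on the worker thread; the file is still retained here.
        with self._lock:
            self.file.seek(offset)
            data = self.file.read(length)
        self._finish_chunk()
        return data

    def _finish_chunk(self):
        # Cleanup runs only after the final chunk completes, which by
        # construction is after every chunk has been dispatched and read.
        with self._lock:
            self._remaining -= 1
            if self._remaining == 0:
                self.file.close()


# Usage: read a file in two chunks on a small worker pool.
if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=2) as pool:
        op = ReadOperation(open(__file__, "rb"), [(0, 64), (64, 64)])
        op.dispatch_all(pool)
```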
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
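As a quick illustration of that end-to-end flow, here is a minimal sketch using the Python bindings. It assumes the `iree-base-compiler` and `iree-base-runtime` packages, the `llvm-cpu` target backend, and the `local-task` runtime driver; the calls follow the documented Python API, but exact names and options may differ between releases.

```python
import numpy as np
import iree.compiler as ireec
import iree.runtime as ireert

# A tiny MLIR program: element-wise multiply of two 4-element tensors.
MLIR_SOURCE = """
func.func @simple_mul(%lhs: tensor<4xf32>, %rhs: tensor<4xf32>) -> tensor<4xf32> {
  %0 = arith.mulf %lhs, %rhs : tensor<4xf32>
  return %0 : tensor<4xf32>
}
"""

# Compile to an IREE VM FlatBuffer targeting the CPU backend.
vmfb = ireec.compile_str(MLIR_SOURCE, target_backends=["llvm-cpu"])

# Load the module on the local CPU ("local-task") driver and invoke it.
config = ireert.Config("local-task")
ctx = ireert.SystemContext(config=config)
ctx.add_vm_module(ireert.VmModule.copy_buffer(ctx.instance, vmfb))

simple_mul = ctx.modules.module["simple_mul"]
lhs = np.array([1.0, 2.0, 3.0, 4.0], dtype=np.float32)
rhs = np.array([5.0, 6.0, 7.0, 8.0], dtype=np.float32)
print(simple_mul(lhs, rhs).to_host())  # -> [ 5. 12. 21. 32.]
```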
See our website for project details, user guides, and instructions on building from source.
Release notes are published on GitHub releases.
Package | Release status |
---|---|
GitHub release (stable) | |
GitHub release (nightly) | |
Python iree-base-compiler | |
Python iree-base-runtime | |
Operating system | Build status |
---|---|
Linux | |
macOS | |
Windows | |
For the full list of workflows see https://iree.dev/developers/general/github-actions/.
See our website for more information.
Community meeting recordings: IREE YouTube channel
Date | Title | Recording | Slides |
---|---|---|---|
2021-06-09 | IREE Runtime Design Tech Talk | recording | slides |
2020-08-20 | IREE CodeGen (MLIR Open Design Meeting) | recording | slides |
2020-03-18 | Interactive HAL IR Walkthrough | recording | |
2020-01-31 | End-to-end MLIR Workflow in IREE (MLIR Open Design Meeting) | recording | slides |
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.