| commit | 8f9e962e60edb19786d41362968bec33797bab2a | |
|---|---|---|
| author | Ben Vanik <ben.vanik@gmail.com> | Thu Jun 08 08:44:18 2023 -0700 |
| committer | GitHub <noreply@github.com> | Thu Jun 08 08:44:18 2023 -0700 |
| tree | e084967a37980268ae7d4d4d832c0482033b3a5c | |
| parent | aced620654d52092ed383d56cb1c6e153cd72ba9 | |
Adding support for async memory pool allocations in the CUDA HAL. (#13440)

These aren't actually async, as the CUDA HAL is synchronous, but they make use of CUDA's memory pooling features to reduce alloc/free cost in a way that is friendlier to CUDA's memory management than the normal allocator. With this, our queue-ordered allocations in CUDA now average a few microseconds (for me), and the caching allocator (or any other) is only needed for caching non-queue-ordered allocations. A few compiler tweaks to switch all allocations to queue-ordered will mean that only explicitly allocated buffers (constants/variables, buffers the user allocates manually, etc.) will not route to the pools.

It would also be possible to explore using the same pools the queue-ordered allocations use for explicit synchronous allocations (at least the device-local pool), but it would be nicer to get those out of the critical path and keep the pools separate so that the transient pool isn't filled with persistent allocations.

Due to #13984 this relies on the `--iree-stream-emulate-memset` flag added in #13994 being set when graphs are enabled. Since this is not the default path today and there are just two test suites using it, we just flip the flag for them.
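As a rough illustration of the pooling approach described above, here is a minimal sketch using the CUDA runtime API's stream-ordered memory pool calls. This is not the IREE implementation (the HAL goes through the driver API and has its own pool configuration and error handling); the pool sizing and release threshold below are hypothetical values chosen for the example.

```c
// Minimal sketch of stream-ordered ("queue-ordered") pool allocation with the
// CUDA runtime API. Illustrative only; IREE's CUDA HAL uses the equivalent
// driver API calls and its own configuration and error handling.
#include <cuda_runtime.h>
#include <stdint.h>

int main(void) {
  int device = 0;
  cudaSetDevice(device);

  cudaStream_t stream;
  cudaStreamCreate(&stream);

  // Create an explicit memory pool for device-local transient allocations.
  cudaMemPoolProps props = {0};
  props.allocType = cudaMemAllocationTypePinned;
  props.location.type = cudaMemLocationTypeDevice;
  props.location.id = device;
  cudaMemPool_t pool;
  cudaMemPoolCreate(&pool, &props);

  // Keep freed memory cached in the pool (up to the threshold) so subsequent
  // allocations are serviced cheaply instead of hitting the device allocator.
  uint64_t release_threshold = 64ull * 1024 * 1024;  // hypothetical value
  cudaMemPoolSetAttribute(pool, cudaMemPoolAttrReleaseThreshold,
                          &release_threshold);

  // Alloc/free requests are ordered against the stream and serviced from the
  // pool, amortizing the underlying allocation cost across uses.
  void* buffer = NULL;
  cudaMallocFromPoolAsync(&buffer, 1024 * 1024, pool, stream);
  // ... enqueue work using |buffer| on |stream| ...
  cudaFreeAsync(buffer, stream);

  cudaStreamSynchronize(stream);
  cudaMemPoolDestroy(pool);
  cudaStreamDestroy(stream);
  return 0;
}
```

Keeping a dedicated pool for transient, queue-ordered allocations (as opposed to routing persistent allocations through it) matches the intent stated above: the transient pool stays hot and small rather than accumulating long-lived buffers.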
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome any kind of feedback through any of our communication channels!
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.