| commit | 2d70891a998d509ffdbcc5842fc595461d853c5c | |
|---|---|---|
| author | Scott Todd <scott.todd0@gmail.com> | Thu Aug 01 16:14:57 2024 -0700 |
| committer | GitHub <noreply@github.com> | Thu Aug 01 16:14:57 2024 -0700 |
| tree | 42a465df27252e06b3123e4d7856e1a8669f1416 | |
| parent | cddcd5b2eac99a0f6407bab3347e4c61fc6f3cb7 | |
Use ccache in runtime builds. (#18089)

This uses https://github.com/hendrikmuhs/ccache-action to set up https://ccache.dev/ together with https://docs.github.com/en/actions/writing-workflows/choosing-what-your-workflow-does/caching-dependencies-to-speed-up-workflows to create cache entries at https://github.com/iree-org/iree/actions/caches and then pull them down at the start of each build job.

* Right now the cache keys are just `${{ github.job }}-${{ matrix.name }}`, which will create entries named like `ccache-build_test_runtime-ubuntu-20.04-2024-08-01T22:19:46.787Z`.
* We can turn off the timestamps with `append-timestamp: false` if we find that our 10GB cache limit gets overrun enough for workflows to hit cache misses (PkgCI already produces 800MB cache entries, so we are always floating well over that limit and old entries are continually evicted).
* We could do something more sophisticated, like putting the LLVM commit hash in the cache entry name, but that shouldn't matter for runtime builds 🤞
* The Windows build in particular has large binaries (mainly HAL CTS tests). Reducing the size there will speed up link time during building and hopefully also trim the cache size.
* I'm matching the `write-caches` logic we have in other workflows for now, which should maintain cache integrity (only reviewed and merged code writes to the cache) while also limiting the number of cache entries we generate.

## Sample job run

https://github.com/ScottTodd/iree/actions/runs/10206588680/job/28239799198

* Load from cache: (screenshot)
* Save to cache: (screenshot)
* View cache files (https://github.com/ScottTodd/iree/actions/caches?query=key%3Accache): (screenshot)

---

Also:

* dropped support for using Docker containers in the build/test runtime jobs; just install and configure clang etc. as needed
* enabled LLD for linking

ci-exactly: build_test_runtime
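For readers skimming the commit, here is a minimal sketch of how such a runtime job could be wired up. The job and step names, the `main`-branch save condition (standing in for the `write-caches` logic), and the CMake flags are illustrative assumptions, not the exact workflow contents:

```yaml
jobs:
  build_test_runtime:
    strategy:
      matrix:
        include:
          - name: ubuntu-20.04
            runner: ubuntu-20.04
    runs-on: ${{ matrix.runner }}
    steps:
      - uses: actions/checkout@v4
      - name: Set up ccache
        uses: hendrikmuhs/ccache-action@v1.2
        with:
          # The action prefixes entries with "ccache-" and appends a timestamp,
          # yielding names like ccache-build_test_runtime-ubuntu-20.04-<timestamp>.
          key: ${{ github.job }}-${{ matrix.name }}
          # append-timestamp: false  # uncomment if cache evictions become a problem
          # Assumption: only post-merge runs write the cache, approximating the
          # `write-caches` convention mentioned above.
          save: ${{ github.ref == 'refs/heads/main' }}
      - name: Build runtime
        # Flags are illustrative: route compiles through ccache, use clang and
        # lld, and build only the runtime.
        run: |
          cmake -G Ninja -B build \
            -DCMAKE_C_COMPILER=clang \
            -DCMAKE_CXX_COMPILER=clang++ \
            -DCMAKE_C_COMPILER_LAUNCHER=ccache \
            -DCMAKE_CXX_COMPILER_LAUNCHER=ccache \
            -DIREE_ENABLE_LLD=ON \
            -DIREE_BUILD_COMPILER=OFF
          cmake --build build
```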
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. That said, we welcome feedback through any of our communication channels!
Community meeting recordings: IREE YouTube channel
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.