commit    55e4c2ce7d6c144d9fd54466896f39e702c29dfe
author    Scott Todd <scotttodd@google.com>  Mon May 16 09:48:52 2022 -0700
committer GitHub <noreply@github.com>  Mon May 16 09:48:52 2022 -0700
tree      9a8d78dc57df924f2292f9efb33f0bc0343b0d36
parent    31fd07157fef2c3cf5dee727685e69fdfde3f36c
Add initial benchmarking features to WebAssembly experiments. (#9100)

### Background

I'm working towards identifying and quantifying key metrics for our web platform support. Metrics of interest thus far include:

* runtime binary size
* program binary size
* runtime startup time
* program load time
* total function call time (as seen from a JavaScript application)
* no-overhead function call time (`iree_runtime_call_invoke()`)

Once we have baselines for those, we can start deeper analysis and optimization work. Some optimization work may be purely in the IREE compiler (e.g. codegen, Flow/Stream/HAL dialects, etc.), while other work may be in the web port itself (e.g. Emscripten flags, use of web APIs, runtime JS bindings).

### Current Status

This is still all experimental, but I'd like to checkpoint what I've built so far and let other people try it out. Expect rough edges (e.g. some Windows/Linux paths specific to my setup that can be overwritten).

Summary of changes:

* `sample_dynamic/index.html` now supports interactive benchmarking
  * a "benchmark iterations" form input drives a loop around `iree_runtime_call_invoke()` down in C/Wasm
  * timing information is rendered onto the page itself, instead of being logged to stdout / the console
  * sample screenshot: https://user-images.githubusercontent.com/4010439/167958736-228a1541-8ed6-4b2c-9af9-55ef0b10bf74.png
* `generate_web_metrics.sh` imports and compiles programs from our existing benchmark suite, preserving all sorts of artifacts for further manual inspection or automated use
* `run_native_benchmarks.sh` executes the compiled native programs from `generate_web_metrics.sh`
* `sample_dynamic/benchmarks.html` loads and runs each compiled Wasm program from `generate_web_metrics.sh`
  * Sample output: https://gist.github.com/ScottTodd/f2bb1f274c5895c8f979400abd6d2b67

Also of note: I haven't really looked at the browser profiling tools yet. I expect those will help with detailed analysis, while the general scaffolding and `performance.now()` measurements will help with comparisons between frameworks and across browsers/devices.
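The benchmarking flow described above (a "benchmark iterations" input driving a timed loop, with `performance.now()` measurements rendered back to the page) can be sketched roughly as follows. This is an illustrative sketch only: `invokeFn` is a hypothetical stand-in for the Wasm-exported call (which in the real sample goes through `iree_runtime_call_invoke()` down in C/Wasm), and none of these names come from the PR itself:

```javascript
// Hedged sketch of the JS-side benchmark loop; names are illustrative.
// `invokeFn` stands in for the Wasm function call being measured.
function benchmark(invokeFn, iterations) {
  const start = performance.now();
  for (let i = 0; i < iterations; ++i) {
    invokeFn(); // in the real sample this reaches iree_runtime_call_invoke()
  }
  const totalMs = performance.now() - start;
  return { iterations, totalMs, meanMs: totalMs / iterations };
}

// Example with a trivial stand-in workload:
const result = benchmark(() => Math.sqrt(12345), 1000);
console.log(`${result.iterations} calls in ${result.totalMs.toFixed(2)} ms ` +
            `(mean ${result.meanMs.toFixed(4)} ms/call)`);
```

Measuring the whole loop once and dividing, rather than timing each call, keeps the `performance.now()` overhead from dominating very cheap calls.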
IREE (Intermediate Representation Execution Environment, pronounced as “eerie”) is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
IREE is still in its early phase. We have settled on the overarching infrastructure and are actively improving various software components as well as project logistics. It is still quite far from ready for everyday use and is made available without any support at the moment. With that said, we welcome any kind of feedback through any of our communication channels!
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.