[Tokenizer] Fix rstrip flag: remove match rejection, add whitespace consumption (#23805)

## Summary

`flags_allow_match()` treated `rstrip=true` as a match condition, rejecting tokens not followed by whitespace or end of input. Per [HuggingFace's implementation](https://github.com/huggingface/tokenizers/blob/main/tokenizers/src/tokenizer/added_vocabulary.rs), `rstrip` is a post-processing directive that consumes trailing whitespace after the match, not a match gate. This caused special tokens like `<|user|>` in Phi-4-mini to fall through to BPE subword tokenization when followed by text.

**Example** (Phi-4-mini-instruct, token `<|user|>` has `rstrip=true`):

- Input: `"<|user|>Hello"`
- Before fix: `[27, 91, 1428, 91, 29, 13225]` ← `<|user|>` split into subwords
- After fix: `[200021, 13225]` ← matches HuggingFace output

## Changes

- Remove the rstrip rejection branch from `flags_allow_match()` in `special_tokens.c`
- Add rstrip whitespace consumption in `tokenizer.c` after a successful match
- Add a `matched_flags` field to the encode state to propagate flags to the caller

## Test results

### HuggingFace smoketest (`huggingface_smoketest.py`)

**1667/1667** tokenization comparisons pass across ~80 HuggingFace models (0 mismatches).

76 additional tests fail to **load** (not tokenization mismatches): these are tiktoken models listed in the HF smoketest that aren't valid HuggingFace model identifiers. They're covered by the dedicated tiktoken smoketest instead.

### Tiktoken smoketest (`tiktoken_smoketest.py`)

**72/76** tokenization comparisons pass across 4 tiktoken encodings (cl100k_base, o200k_base, r50k_base, p50k_base).
4 failures, **identical between upstream and this PR** (pre-existing):

| Encoding | Failing test | Cause |
|----------|-------------|-------|
| cl100k_base | `special_token_endoftext` | IREE matches `<\|endoftext\|>` as a special token; tiktoken's `encode_ordinary` treats it as literal text |
| o200k_base | `special_token_endoftext` | Same |
| r50k_base | `special_token_endoftext` | Same |
| p50k_base | `special_token_endoftext` | Same |

Root cause: the IREE tokenizer has no "encode ordinary" mode (equivalent to tiktoken's `disallowed_special=()`).

Signed-off-by: Jorn <jorn.tuyls@gmail.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
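The fix moves `rstrip` handling out of the match predicate and into a post-match step. A minimal C sketch of that step under stated assumptions (the function name `consume_rstrip` and the `TOKEN_FLAG_RSTRIP` bit are hypothetical illustrations, not the actual IREE symbols):

```c
#include <assert.h>
#include <ctype.h>
#include <stddef.h>

// Hypothetical flag bit standing in for the rstrip attribute carried by
// an added/special token; the real flag layout lives in special_tokens.c.
#define TOKEN_FLAG_RSTRIP (1u << 0)

// After a special token matched ending at byte offset `pos`, advance past
// any trailing whitespace when the matched token carries rstrip. This is
// post-processing of a successful match, never a reason to reject one.
static size_t consume_rstrip(const char *text, size_t len, size_t pos,
                             unsigned matched_flags) {
  if (matched_flags & TOKEN_FLAG_RSTRIP) {
    while (pos < len && isspace((unsigned char)text[pos])) {
      ++pos;
    }
  }
  return pos;
}
```

With this shape, `<|user|>` in `"<|user|>  Hello"` still matches as a special token; the two spaces after it are then swallowed by `consume_rstrip` before ordinary BPE encoding resumes at `Hello`, which is why the `matched_flags` field must be threaded back to the caller.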
IREE (Intermediate Representation Execution Environment, pronounced "eerie") is an MLIR-based end-to-end compiler and runtime that lowers Machine Learning (ML) models to a unified IR that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments.
See our website for project details, user guides, and instructions on building from source.
Release notes are published on GitHub releases.
| Package | Release status |
|---|---|
| GitHub release (stable) | |
| GitHub release (nightly) | |
| iree-base-compiler | |
| iree-base-runtime | |
For more details on the release process, see https://iree.dev/developers/general/release-management/.
| Operating system | Build status |
|---|---|
| Linux | |
| macOS | |
For the full list of workflows see https://iree.dev/developers/general/github-actions/.
See our website for more information.
Community meeting recordings: IREE YouTube channel
| Date | Title | Recording | Slides |
|---|---|---|---|
| 2025-06-10 | Data-Tiling in IREE: Achieving High Performance Through Compiler Design (AsiaLLVM) | recording | slides |
| 2025-05-17 | Introduction to GPU architecture and IREE's GPU CodeGen Pipeline | recording | slides |
| 2025-02-12 | The Long Tail of AI: SPIR-V in IREE and MLIR (Vulkanised) | recording | slides |
| 2024-10-01 | Unveiling the Inner Workings of IREE: An MLIR-Based Compiler for Diverse Hardware | recording | |
| 2021-06-09 | IREE Runtime Design Tech Talk | recording | slides |
| 2020-08-20 | IREE CodeGen (MLIR Open Design Meeting) | recording | slides |
| 2020-03-18 | Interactive HAL IR Walkthrough | recording | |
| 2020-01-31 | End-to-end MLIR Workflow in IREE (MLIR Open Design Meeting) | recording | slides |
IREE is licensed under the terms of the Apache 2.0 License with LLVM Exceptions. See LICENSE for more information.