[Mlir-commits] [mlir] Fix MLIR Transform Tutorial Doc (PR #155285)
llvmlistbot at llvm.org
Mon Aug 25 13:11:26 PDT 2025
llvmbot wrote:
@llvm/pr-subscribers-mlir
Author: Brandon Kirincich (BrandonKi)
Changes:
Fixes a small issue I noticed while reading through the tutorial.
---
Full diff: https://github.com/llvm/llvm-project/pull/155285.diff
1 file affected:
- (modified) mlir/docs/Tutorials/transform/Ch0.md (+1-1)
``````````diff
diff --git a/mlir/docs/Tutorials/transform/Ch0.md b/mlir/docs/Tutorials/transform/Ch0.md
index dc4b753f98caa..0d7a70364742d 100644
--- a/mlir/docs/Tutorials/transform/Ch0.md
+++ b/mlir/docs/Tutorials/transform/Ch0.md
@@ -134,7 +134,7 @@ Furthermore, the operation now contains a region that explicitly specifies the m
## “Loop” Fusion
-Since the region of the `linalg.generic` operation can contain arbitrarily many operations, we can use it to express “fusion” of the implicit loops by simply having more operations chained in the region. For example, the common machine learning rectified linear unit layer (ReLU), which can be defined as `relu(x) = max(0, x)`, can be defined be expressed using the “compare-and-select” idiom in one `linalg.generic` operation, without the temporary buffer for the comparison result and without repeating the outer operation:
+Since the region of the `linalg.generic` operation can contain arbitrarily many operations, we can use it to express “fusion” of the implicit loops by simply having more operations chained in the region. For example, the common machine learning rectified linear unit layer (ReLU), which can be defined as `relu(x) = max(0, x)`, can be expressed using the “compare-and-select” idiom in one `linalg.generic` operation, without the temporary buffer for the comparison result and without repeating the outer operation:
```mlir
linalg.generic {
``````````
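For readers following along without the tutorial open, here is a minimal sketch of the fused compare-and-select ReLU that the corrected sentence describes. The 1-D shape `tensor<8xf32>` and the value names are illustrative assumptions, not quoted from Ch0.md:

```mlir
// Illustrative sketch only: relu(x) = max(0, x) fused into a single
// linalg.generic via compare-and-select. Shapes and names are assumptions.
%relu = linalg.generic {
  indexing_maps = [affine_map<(i) -> (i)>, affine_map<(i) -> (i)>],
  iterator_types = ["parallel"]
} ins(%x : tensor<8xf32>) outs(%init : tensor<8xf32>) {
^bb0(%in: f32, %out: f32):
  // Compare against zero, then select, all inside one region:
  // no temporary buffer is needed for the comparison result.
  %zero = arith.constant 0.0 : f32
  %cmp = arith.cmpf ogt, %in, %zero : f32
  %sel = arith.select %cmp, %in, %zero : f32
  linalg.yield %sel : f32
} -> tensor<8xf32>
```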
https://github.com/llvm/llvm-project/pull/155285
More information about the Mlir-commits mailing list