[Mlir-commits] [mlir] [mlir][tensor] Document `dest` operand (PR #71726)
Matthias Springer
llvmlistbot at llvm.org
Thu Nov 9 17:53:10 PST 2023
================
@@ -18,31 +18,32 @@ def Tensor_Dialect : Dialect {
   let description = [{
     The `tensor` dialect is intended to hold core tensor creation and
     manipulation ops, which are not strongly associated with any particular
-    other dialect or domain abstraction. The primary smoke test of this is ops
-    that make sense for any tensor element type.
-
-    We leave it to other dialects to hold the vast swath of possible
-    computations one might want to do on a tensor.
-
-    The `tensor` type is (for better or for worse) used to represent all kinds
-    of things, and supports an open-ended set of element types. Examples:
+    other dialect or domain abstraction. The primary inclusion criteria for ops
+    in this dialect is that they make sense for any tensor element type. When
+    this is not the case, the op is left to live in other dialects. Examples of
+    element types that could be supported by the `tensor` dialect include:
     - representing large, dense aggregations of primitive types, suitable for
       high-performance numerical computing.
-    - representing shapes in the `shape` dialect, which consist of small
-      1D tensors of `index` data type.
+    - representing shapes in the `shape` dialect, which consist of small 1D
+      tensors of `index` data type.
     - representing aggregations of strings or “variant” types.
-    - representing large, sparse aggregations of primitive types, suitable
-      for high-performance numerical computing.
+    - representing large, sparse aggregations of primitive types, suitable for
+      high-performance numerical computing.
-    Thus, for the `tensor` dialect, we prefer for now to constrain the
-    scope as much as possible. The expectation is that at some point
+    Because of this broad element type support, we prefer for now to keep the
----------------
matthias-springer wrote:
I would put less emphasis on the "broad element type support" here (in line 34). Imo, the main reason to keep the `tensor` dialect small is that we have dedicated dialects for various MLIR concepts and functionality, such as sparsity (`sparse_tensor`) or structured computations (`linalg`). The `tensor` dialect contains only the most basic operations around tensors, i.e., those that do not conceptually belong in any of the other dialects.
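To illustrate the kind of "basic" ops I mean, here is a minimal sketch (not part of this PR; the function name and shapes are made up) of element-type-agnostic ops that belong in the `tensor` dialect rather than in `linalg` or `sparse_tensor`:

```mlir
// Hypothetical example: single-element access makes sense for any tensor
// element type, so these ops live in the `tensor` dialect.
func.func @basic_tensor_ops(%t: tensor<4x8xf32>, %v: f32,
                            %i: index, %j: index) -> f32 {
  // `tensor.insert` writes one element into its `dest` operand and returns
  // the updated tensor value (tensors are SSA values, not mutated in place).
  %updated = tensor.insert %v into %t[%i, %j] : tensor<4x8xf32>
  // `tensor.extract` reads one element.
  %e = tensor.extract %updated[%i, %j] : tensor<4x8xf32>
  return %e : f32
}
```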
https://github.com/llvm/llvm-project/pull/71726
More information about the Mlir-commits mailing list