[PATCH] D73944: [mlir][wip] Start Shape dialect
Jacques Pienaar via Phabricator via llvm-commits
llvm-commits at lists.llvm.org
Tue Feb 4 16:14:04 PST 2020
jpienaar marked 12 inline comments as done.
jpienaar added a comment.
Updated, thanks.
================
Comment at: mlir/include/mlir/Dialect/Shape/IR/Shape.h:68
+
+class ElementType : public Type::TypeBase<ElementType, Type> {
+public:
----------------
rriddle wrote:
> Can we get comments on each of these types?
This made me think about having ODS generate these from the description :)
================
Comment at: mlir/include/mlir/Dialect/Shape/IR/ShapeOps.td:88
+ `shape.value_shape` represents the value produced by an operation (this corresponds to
+ `Value*` in the compiler) and a shape. Conceptually this is a tuple of a
+ value (potentially unknown) and `shape.type`. The value and shape can either
----------------
stellaraccident wrote:
> Just "Value" now, right?
Ha, good point, I had written this before that change :)
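(For readers following along, here is a rough, hypothetical C++ illustration of that "tuple of a value and a shape" idea; it is not code from the patch, and the names `ValueShape`, `value`, and `shape` are invented for this sketch. Unknowns are modeled with empty optionals.)

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Hypothetical sketch, not from the patch: a !shape.value_shape conceptually
// pairs a (possibly unknown) value with its shape (a !shape.type).
struct ValueShape {
  std::optional<std::vector<float>> value;   // unknown until runtime if absent
  std::optional<std::vector<int64_t>> shape; // unranked/unknown if absent
};
```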
================
Comment at: mlir/include/mlir/Dialect/Shape/IR/ShapeOps.td:181
+ 1. If any inputs are unranked, output is unranked;
+ 2. Then for all 0 <= i < max rank of input shapes
+
----------------
stellaraccident wrote:
> I might just be misunderstanding, but to my reading, I'm not sure this formulation is right. I believe this would yield "left aligned" broadcasting, whereas the traditional rules for a ranked broadcast are "right aligned":
>
> ```
> OUTPUT DIM:  0  1  2
> LHS:         Z  Y  X
> RHS:               X
> ```
>
> I think there is a step before where you want to align the lower-ranked input dimensions to fill towards the right, with respect to the output dims. For full generality, you may also need a dimension map (similar to the xla_hlo broadcast_dimensions attribute).
Good catch, I forgot to add that step. In the code I had used zip_longest and a reverse iterator, but that doesn't lend itself to words that well :). Here I'm pursuing broadcasting as defined in numpy and used by ML frameworks (https://docs.scipy.org/doc/numpy/reference/ufuncs.html#broadcasting), but I could be wrong so please check.
Re: the broadcast_dimensions attribute, while that is more general, it does not have the same semantics as what folks think of when broadcasting is mentioned. That is a combination of reshape & broadcast IMHO.
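(For concreteness, here is a small self-contained C++ sketch of the right-aligned, numpy-style rule being discussed; it is not the patch's implementation. It only handles ranked shapes, so rule 1 about unranked inputs is elided, uses -1 as a stand-in for an unknown extent, and the helper name `broadcastShapes` is made up. Left-padding the shorter shape with 1s plays the role of zip_longest over the reversed shapes.)

```cpp
#include <algorithm>
#include <cstdint>
#include <iostream>
#include <optional>
#include <vector>

constexpr int64_t kDynamic = -1; // stand-in for an unknown extent

// Right-aligned (numpy-style) broadcast of two ranked shapes; an empty
// optional means the shapes are not broadcast-compatible.
std::optional<std::vector<int64_t>>
broadcastShapes(const std::vector<int64_t> &lhs,
                const std::vector<int64_t> &rhs) {
  const size_t outRank = std::max(lhs.size(), rhs.size());
  std::vector<int64_t> result(outRank);
  // Walk both shapes from the trailing (rightmost) dimension, padding the
  // shorter shape with 1s on the left: the "align to the right" step.
  for (size_t i = 0; i < outRank; ++i) {
    int64_t l = i < lhs.size() ? lhs[lhs.size() - 1 - i] : 1;
    int64_t r = i < rhs.size() ? rhs[rhs.size() - 1 - i] : 1;
    int64_t out;
    if (l == 1)
      out = r;
    else if (r == 1)
      out = l;
    else if (l == kDynamic || r == kDynamic)
      out = kDynamic; // conservatively unknown; checked at runtime
    else if (l == r)
      out = l;
    else
      return std::nullopt; // mismatched static extents: error
    result[outRank - 1 - i] = out;
  }
  return result;
}

int main() {
  // LHS = [Z, Y, X] = [5, 4, 3], RHS = [X] = [3]  ->  result = [5, 4, 3]
  auto shape = broadcastShapes({5, 4, 3}, {3});
  for (int64_t d : *shape)
    std::cout << d << ' ';
  std::cout << '\n'; // prints: 5 4 3
}
```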
================
Comment at: mlir/include/mlir/IR/DialectSymbolRegistry.def:27
DEFINE_SYM_KIND_RANGE(XLA_HLO) // XLA HLO dialect
+DEFINE_SYM_KIND_RANGE(SHAPE) // Shape dialect
----------------
rriddle wrote:
> Not this revision, but we should sort these.
Yeah, I was very tempted ... I'll do that in an NFC follow-up.
Repository:
rG LLVM Github Monorepo
CHANGES SINCE LAST ACTION
https://reviews.llvm.org/D73944/new/
https://reviews.llvm.org/D73944