[LLVMdev] Convert the result of a vector comparison into a scalar bit mask?
Jeff Bush
jeffbush001 at gmail.com
Sun Jun 30 23:14:08 PDT 2013
When LLVM does a comparison of two vectors, in this case with 16
elements, the result type of the setcc is v16i1. The architecture I'm
targeting can store the result of a vector comparison as a bit mask in
a scalar register, but I'm having trouble converting the result of the
setcc into a value that is usable there. For example, if I try to AND
together two masks that are the results of comparisons, instruction
selection fails because the operands are v16i1 and none of my
instructions accept that type. I don't want to have to modify every
instruction to be aware of v16i1 as a data type (which doesn't seem
right anyway). Ideally, I could just tell the backend to treat the
result of a vector setcc as an i32. I've tried a number of things,
including:
- Using setOperationAction to mark SETCC on v16i1 as Promote, with the
promoted type set to i32 (the first part of the sketch below). This
asserts internally because the legalizer tries to do a sext operation
on the result, which is incompatible.
- Using a custom lowering action to wrap the setcc in a combination of
BITCAST/ZERO_EXTEND nodes, which I could then match and eliminate in
the instruction pattern (the second part of the sketch below). However,
those DAG nodes get removed during one of the passes, and the result
type is still v16i1.
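For concreteness, here is roughly what the two attempts look like.
This is a simplified sketch: the class and function names
(MyTargetLowering, LowerSETCC) are reduced to placeholders, and the
surrounding code is omitted.

// Attempt 1, in the MyTargetLowering constructor: promote the v16i1
// SETCC result to i32. This is the variant that asserts because the
// legalizer tries to sext the promoted result.
setOperationAction(ISD::SETCC, MVT::v16i1, Promote);
AddPromotedToType(ISD::SETCC, MVT::v16i1, MVT::i32);

// Attempt 2: register the setcc as Custom instead...
//   setOperationAction(ISD::SETCC, MVT::v16i1, Custom);
// ...and wrap it during lowering so its users see a scalar value
// (dispatched from LowerOperation for ISD::SETCC).
SDValue MyTargetLowering::LowerSETCC(SDValue Op, SelectionDAG &DAG) const {
  SDLoc DL(Op);
  // Re-emit the comparison with its original v16i1 type...
  SDValue Cmp = DAG.getNode(ISD::SETCC, DL, MVT::v16i1, Op.getOperand(0),
                            Op.getOperand(1), Op.getOperand(2));
  // ...then reinterpret the 16 result bits as an i16 and widen to i32,
  // intending to match and fold these wrappers in the instruction
  // patterns.
  SDValue Bits = DAG.getNode(ISD::BITCAST, DL, MVT::i16, Cmp);
  return DAG.getNode(ISD::ZERO_EXTEND, DL, MVT::i32, Bits);
}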
So, my question is: what is the proper way to convert the result of a
vector comparison into a scalar bitmask?
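For reference, here is a small self-contained reproducer for the case
I'm describing. It builds the IR with IRBuilder just so everything
fits in one C++ snippet (the real input comes from my front end, and
the function/module names are arbitrary); feeding the printed module
to llc for my target should be enough to hit the selection failure on
the and.

#include "llvm/ADT/SmallVector.h"
#include "llvm/IR/DerivedTypes.h"
#include "llvm/IR/IRBuilder.h"
#include "llvm/IR/LLVMContext.h"
#include "llvm/IR/Module.h"
#include "llvm/Support/raw_ostream.h"

using namespace llvm;

int main() {
  LLVMContext Ctx;
  Module M("repro", Ctx);
  IRBuilder<> Builder(Ctx);

  VectorType *V16I32 = VectorType::get(Builder.getInt32Ty(), 16);
  VectorType *V16I1 = VectorType::get(Builder.getInt1Ty(), 16);

  // <16 x i1> @masks(<16 x i32> %a, <16 x i32> %b,
  //                  <16 x i32> %c, <16 x i32> %d)
  SmallVector<Type *, 4> Params(4, V16I32);
  FunctionType *FTy = FunctionType::get(V16I1, Params, false);
  Function *F = Function::Create(FTy, Function::ExternalLinkage, "masks", &M);
  Builder.SetInsertPoint(BasicBlock::Create(Ctx, "entry", F));

  Function::arg_iterator AI = F->arg_begin();
  Value *A = &*AI++, *B = &*AI++, *C = &*AI++, *D = &*AI++;

  // Each icmp is selected through a SETCC whose result type is v16i1.
  Value *Mask0 = Builder.CreateICmpSGT(A, B, "mask0");
  Value *Mask1 = Builder.CreateICmpSGT(C, D, "mask1");

  // The AND of the two v16i1 masks is what fails to select.
  Builder.CreateRet(Builder.CreateAnd(Mask0, Mask1, "both"));

  M.print(outs(), nullptr);
  return 0;
}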