[PATCH] D17267: [DemandedBits] Introduce a getMinimumBitWidth() member function

Hal Finkel via llvm-commits llvm-commits at lists.llvm.org
Thu May 5 07:00:29 PDT 2016


hfinkel added inline comments.

================
Comment at: include/llvm/Analysis/DemandedBits.h:72
@@ -68,2 +71,3 @@
   DenseMap<Instruction *, APInt> AliveBits;
+  DenseMap<Instruction *, APInt> SignBits;
 };
----------------
It is not clear to me that we need to keep a bitset here. We could just keep an unsigned (a running count of the number of known sign bits).
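
Something like this is what I have in mind (just a sketch; the member name is hypothetical and untested):

  // Instead of a per-instruction APInt mask, keep a running count of the
  // number of known sign bits for each instruction:
  DenseMap<Instruction *, unsigned> NumSignBits;

An unsigned is cheaper to store and update, and presumably getMinimumBitWidth() only needs the count in the end anyway.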

================
Comment at: lib/Analysis/DemandedBits.cpp:242
@@ +241,3 @@
+      auto *CI = cast<ConstantInt>(UserI->getOperand(1));
+      SB = SOut & ~CI->getValue();
+      SB = APInt::getHighBitsSet(BitWidth, SB.countLeadingOnes());
----------------
This logic seems non-obvious to me; please add a comment. It looks like you're only keeping the sign bits that correspond to unset bits in the second xor operand. That does not handle the general case (xoring with -1, for example, does not change the number of sign bits at all).
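
For reference, the general transfer function can be written directly in terms of sign-bit counts. A sketch, reusing the names from the patch and assuming SOut is the usual high-bits sign mask (untested):

  // In the region where both inputs repeat their sign bit, the xor also
  // repeats its (xored) sign bit, so:
  //   NumSignBits(X ^ C) >= min(NumSignBits(X), NumSignBits(C))
  // In particular, C == -1 has BitWidth sign bits, so xoring with -1
  // preserves all of the operand's sign bits.
  unsigned OpSignBits = SOut.countLeadingOnes();
  unsigned CSignBits = CI->getValue().getNumSignBits();
  SB = APInt::getHighBitsSet(BitWidth, std::min(OpSignBits, CSignBits));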

================
Comment at: lib/Analysis/DemandedBits.cpp:277
@@ +276,3 @@
+  case Instruction::ICmp:
+    // Count the number of leading zeroes in each operand.
+    ComputeKnownBits(BitWidth, UserI->getOperand(0), UserI->getOperand(1));
----------------
I don't understand this logic at all. An icmp returns the result of the comparison, i.e. an i1, so there is nothing non-trivial to track for the result.
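
To make that concrete (a hypothetical illustration; the assert covers the scalar case only):

  // The result of an integer compare is always i1, so a sign-bits or
  // demanded-bits query on the icmp itself is at most one bit wide:
  auto *Cmp = cast<ICmpInst>(UserI);
  assert(Cmp->getType()->isIntegerTy(1) && "scalar icmp result is i1");

Whatever is being computed about the operands here cannot make the i1 result any less trivial.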

================
Comment at: lib/Analysis/VectorUtils.cpp:370
@@ -368,4 +369,3 @@
 
-    uint64_t V = DB.getDemandedBits(I).getZExtValue();
-    DBits[Leader] |= V;
-    DBits[I] = V;
+    APInt V1 = APInt::getLowBitsSet(BW, BW - DB.getNumSignBits(I));
+    APInt V2 = DB.getDemandedBits(I);
----------------
Why do you use getLowBitsSet here and then use ~V1 below? Should we just set some number of high bits directly?
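
I.e., something like (a sketch, untested):

  // Build the sign-bit mask directly as high bits instead of building a
  // low-bits mask and complementing it later:
  APInt SignMask = APInt::getHighBitsSet(BW, DB.getNumSignBits(I));

and then use SignMask wherever ~V1 is used below.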


Repository:
  rL LLVM

http://reviews.llvm.org/D17267
