[llvm] [AArch64][SVE] Tweak how SVE CFI expressions are emitted (PR #151677)

Sander de Smalen via llvm-commits llvm-commits at lists.llvm.org
Tue Aug 5 06:59:08 PDT 2025


================
@@ -221,6 +221,23 @@ inline uint64_t decodeULEB128AndIncUnsafe(const uint8_t *&p) {
   return decodeULEB128AndInc(p, nullptr);
 }
 
+enum class LEB128Sign { Unsigned, Signed };
+
+template <LEB128Sign Sign, typename T, typename U = char,
+          unsigned MaxLEB128SizeBytes = 16>
+inline void appendLEB128(SmallVectorImpl<U> &Buffer, T Value) {
+  static_assert(sizeof(U) == 1, "Expected buffer of bytes");
+  unsigned LEB128ValueSize;
+  U TmpBuffer[MaxLEB128SizeBytes];
+  if constexpr (Sign == LEB128Sign::Signed)
----------------
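For context, a hypothetical call site for the appendLEB128 template quoted above (SmallString and the literal values are illustrative, not taken from the patch):

    SmallString<16> Expr;
    appendLEB128<LEB128Sign::Unsigned>(Expr, 42u); // ULEB128(42) == 0x2A
    appendLEB128<LEB128Sign::Signed>(Expr, -2);    // SLEB128(-2) == 0x7E
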
sdesmalen-arm wrote:

Fair enough, I didn't realise that the buffer might have a different signedness.

FWIW, I personally find SLEB/ULEB more convenient because (1) it's shorter, (2) there's already precedent for ULEB128/SLEB128 throughout the code-base, and (3) creating an enum class for each use of a `bool` in an interface is a bit unnecessary. I also never really have issues with the one-character difference myself; the code-base is already littered with cases like this.
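
A minimal sketch of the spelling this argues for, assuming thin wrappers over the patch's template (the names appendULEB128/appendSLEB128 are hypothetical, not part of the patch):

    template <typename U, typename T>
    inline void appendULEB128(SmallVectorImpl<U> &Buffer, T Value) {
      appendLEB128<LEB128Sign::Unsigned>(Buffer, Value);
    }

    template <typename U, typename T>
    inline void appendSLEB128(SmallVectorImpl<U> &Buffer, T Value) {
      appendLEB128<LEB128Sign::Signed>(Buffer, Value);
    }

Call sites would then differ by a whole identifier rather than a template argument, matching the existing encodeULEB128/encodeSLEB128 precedent in llvm/Support/LEB128.h.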

https://github.com/llvm/llvm-project/pull/151677

