[PATCH] D79330: [Analyzer][VLASizeChecker] Check for VLA size overflow.

Whisperity via Phabricator via cfe-commits cfe-commits at lists.llvm.org
Fri May 15 03:02:41 PDT 2020


whisperity added a comment.

In D79330#2035850 <https://reviews.llvm.org/D79330#2035850>, @balazske wrote:

> I was looking at CERT ARR32-C <https://wiki.sei.cmu.edu/confluence/display/c/ARR32-C.+Ensure+size+arguments+for+variable+length+arrays+are+in+a+valid+range> "Ensure size arguments for variable length arrays are in a valid range". The VLA should not have a size that is larger than `std::numeric_limits<size_t>::max()`, in other words it should "fit into a size_t value". Is that right, or not?
>
> Yes, creating the too-large VLA is not in itself a bug; it only becomes one when `sizeof` is called on it, because `sizeof` cannot return the correct size. A non-fatal error is a better option, or we could delay the check until the `sizeof` call. But creating such a big array is probably a code smell in itself. The array does not even need to be created for the problem to happen; a `sizeof` call on a typedef-ed, too-large VLA type is enough. (What does "the result of sizeof is implementation-defined" mean? Could it return something other than the size in bytes or "chars"? In that case the checker would not be correct.)




In D79330#2036340 <https://reviews.llvm.org/D79330#2036340>, @balazske wrote:

> Is it worth improving the checker by emitting a warning only when `sizeof` is used on a `typedef`-ed VLA type whose size is too large (and at a VLA declaration with size overflow)? Or can this be done in a later change?


The result of `sizeof` is implementation-defined because the sizes of types are implementation-defined in the first place (subject to a few other rules): the standard does not say that the result value will be 5 or 7 or 92. The **type** which `sizeof()` returns is `size_t`, which is itself an implementation-defined type, depending on the target architecture. This is why the `result of sizeof is implementation-defined`.
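
A minimal illustration (the exact values printed are what is implementation-defined; the result type is always `size_t`):

```c
#include <stdio.h>
#include <stddef.h>

int main(void) {
    /* The result of sizeof has type size_t; both the printed values and
       the width of size_t itself depend on the target (e.g. 4 bytes on a
       typical ILP32 target, 8 bytes on LP64). */
    printf("sizeof(long)   = %zu\n", sizeof(long));
    printf("sizeof(size_t) = %zu\n", sizeof(size_t));
    return 0;
}
```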

However, the following is said about `size_t`:

> std::size_t can store the maximum size of a theoretically possible object of any type (including array). A type whose size cannot be represented by std::size_t is ill-formed (since C++14)

(The first sentence applies for C, too.)
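
To make the concern concrete, here is a rough sketch of the case @balazske described; the function and type names are made up, and it assumes a target where `n * sizeof(long long)` can exceed `SIZE_MAX`:

```c
#include <stddef.h>

void f(size_t n) {
    /* A VLA type hidden behind a typedef; no array object is created. */
    typedef long long row_t[n];
    /* sizeof(row_t) should be n * sizeof(long long), but if n is large
       enough, that product exceeds SIZE_MAX and the size_t result cannot
       represent the real size. */
    size_t bytes = sizeof(row_t);
    (void)bytes;
}
```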

I would reason from this that if we know there is a type which cannot be reasoned about with `sizeof`, then that type is bad. This conforms to what //ARR32-C// says.
The problem is that if you have an array whose size has overflown, the generated code will use this overflown size at the point of allocation, which opens the door to a plethora of vulnerabilities.
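
A hypothetical sketch of the kind of code this motivates flagging (the names and the 4096-byte row size are made up; assume a 32-bit target where `n * 4096` can exceed `SIZE_MAX`):

```c
#include <stddef.h>
#include <string.h>

/* n is externally controlled. */
void copy_rows(char **rows, size_t n) {
    /* The stack allocation below is performed with the wrapped-around,
       much too small, total size... */
    char buf[n][4096];
    for (size_t i = 0; i < n; ++i) {
        /* ...while the loop still writes n * 4096 bytes, running far past
           the end of the undersized allocation. */
        memcpy(buf[i], rows[i], sizeof buf[i]);
    }
}
```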

I think we should aim at making the project safe and sound by eliminating bad code, instead of chasing //purely// the letter of the law. If the intent visible in the code can lead to the system blowing up, let's keep the warning where it is easiest to fix.


Repository:
  rG LLVM Github Monorepo

CHANGES SINCE LAST ACTION
  https://reviews.llvm.org/D79330/new/

https://reviews.llvm.org/D79330




