Hi Matt,

You seem to favour the usage of std::size_t because it does not "limit the range of values". My original post shows that the usage of std::size_t does not even allow std::vector to have a size() >= PTRDIFF_MAX in any of the 3 major standard library implementations.

- Just run the following code, compiled on a 32-bit system (or with -m32), against libc++:

    auto v = std::vector<char>();
    std::cout << PTRDIFF_MAX << " " << SIZE_MAX << " " << v.max_size() << std::endl;

  and you'll find out that v.max_size() is equal to PTRDIFF_MAX and not SIZE_MAX.
- libstdc++ and the Windows standard library implementation return SIZE_MAX, and if you look at my original post you'll understand that this is a bug (size() has undefined behaviour if the size is > PTRDIFF_MAX). This bug has been in their standard library implementations for 20 years!

François

On 18 Feb 2015, at 04:09, Matt Calabrese <rivorus@gmail.com> wrote:

>> 2) You get another bit and therefore you can address more memory.
>>  - Maybe, for the people who needs to work with arrays of chars of more than 2 GB on 32 bit systems...
>
> This genuinely can be a problem and I think it's reasonable on its own as a rationale for unsigned, though really it's the artificial change in the range of values if you were to adopt a signed type that is much more upsetting to me, personally, even though it's much more ideological.
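
P.S. For anyone who wants to reproduce the numbers above, here is a self-contained version of the snippet from my reply. This is just a minimal sketch: it assumes a C++11 compiler, and for the libc++ case a command line along the lines of "clang++ -m32 -stdlib=libc++".

    #include <cstdint>   // PTRDIFF_MAX, SIZE_MAX
    #include <iostream>
    #include <vector>

    int main()
    {
        // An empty vector is enough: max_size() does not depend on the current size.
        auto v = std::vector<char>();

        // On a 32-bit target, libc++ reports max_size() == PTRDIFF_MAX here,
        // whereas libstdc++ and the Windows implementation report SIZE_MAX.
        std::cout << PTRDIFF_MAX << " " << SIZE_MAX << " " << v.max_size() << std::endl;
    }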