[cfe-dev] libc++ std::map self-assignment bug

Kal b17c0de at gmail.com
Mon Jan 13 22:56:50 PST 2014


Here is a patch to resolve this issue. Please apply :-)

-K

Am 29.11.13 09:56, schrieb Kal:
> Am 29.11.13 04:17, schrieb Howard Hinnant:
>> On Nov 28, 2013, at 9:58 PM, Karen Shaeffer <shaeffer at neuralscape.com> wrote:
>>
>>> On Fri, Nov 29, 2013 at 02:19:44AM +0100, Kal wrote:
>>>> Hi Howard, etc,
>>>>
>>>> libc++ 3.4 (rc1/trunk) std::map doesn't support self-assignment when
>>>> compiled with a pre-C++11 standard (std < c++11). The relevant code is:
>>>>
>>>>    _LIBCPP_INLINE_VISIBILITY
>>>>    map& operator=(const map& __m)
>>>>        {
>>>> #if __cplusplus >= 201103L
>>>>            __tree_ = __m.__tree_;
>>>> #else
>>>>            __tree_.clear();
>>>>            __tree_.value_comp() = __m.__tree_.value_comp();
>>>>            __tree_.__copy_assign_alloc(__m.__tree_);
>>>>            insert(__m.begin(), __m.end());
>>>> #endif
>>>>            return *this;
>>>>        }
>>>>
>>>> Maybe should be like:
>>>>
>>>>    _LIBCPP_INLINE_VISIBILITY
>>>>    map& operator=(const map& __m)
>>>>        {
>>>> #if __cplusplus >= 201103L
>>>>            __tree_ = __m.__tree_;
>>>> #else
>>>>            if (this != &__m) {
>>>>                __tree_.clear();
>>>>                __tree_.value_comp() = __m.__tree_.value_comp();
>>>>                __tree_.__copy_assign_alloc(__m.__tree_);
>>>>                insert(__m.begin(), __m.end());
>>>>            }
>>>> #endif
>>>>            return *this;
>>>>        }
>>>>
>>>> I see the same issue with unordered_map& operator=(const unordered_map&
>>>> __u).
>>>>
>>>> Thanks!
>>>> Kal
>>> --- end quoted text ---
>>>
>>> Hi Kal,
>>> I don't speak for Howard or anyone else, but my understanding is that the standard library
>>> generally doesn't perform checks like that, in order to get the best possible performance.
>>> I see a lot of code in libstdc++ just like that.
>> Actually this does look like a bug to me (for __cplusplus < 201103L). I think Kal has the correct fix. A post-condition of copy assignment is that both lhs and rhs should be equivalent to the previous value of rhs. For move assignment this is not the case. But this is copy assignment.
>>
>> Howard
>>
> Note: This issue is present in at least: map, multimap, unordered_map,
> unordered_multimap.
>
> Kal

-------------- next part --------------
Index: include/map
===================================================================
--- include/map	(revision 199190)
+++ include/map	(working copy)
@@ -884,10 +884,12 @@
 #if __cplusplus >= 201103L
             __tree_ = __m.__tree_;
 #else
-            __tree_.clear();
-            __tree_.value_comp() = __m.__tree_.value_comp();
-            __tree_.__copy_assign_alloc(__m.__tree_);
-            insert(__m.begin(), __m.end());
+            if (this != &__m) {
+                __tree_.clear();
+                __tree_.value_comp() = __m.__tree_.value_comp();
+                __tree_.__copy_assign_alloc(__m.__tree_);
+                insert(__m.begin(), __m.end());
+            }
 #endif
             return *this;
         }
@@ -1616,10 +1618,12 @@
 #if __cplusplus >= 201103L
             __tree_ = __m.__tree_;
 #else
-            __tree_.clear();
-            __tree_.value_comp() = __m.__tree_.value_comp();
-            __tree_.__copy_assign_alloc(__m.__tree_);
-            insert(__m.begin(), __m.end());
+            if (this != &__m) {
+                __tree_.clear();
+                __tree_.value_comp() = __m.__tree_.value_comp();
+                __tree_.__copy_assign_alloc(__m.__tree_);
+                insert(__m.begin(), __m.end());
+            }
 #endif
             return *this;
         }
Index: include/unordered_map
===================================================================
--- include/unordered_map	(revision 199190)
+++ include/unordered_map	(working copy)
@@ -831,12 +831,14 @@
 #if __cplusplus >= 201103L
         __table_ = __u.__table_;
 #else
-        __table_.clear();
-        __table_.hash_function() = __u.__table_.hash_function();
-        __table_.key_eq() = __u.__table_.key_eq();
-        __table_.max_load_factor() = __u.__table_.max_load_factor();
-        __table_.__copy_assign_alloc(__u.__table_);
-        insert(__u.begin(), __u.end());
+        if (this != &__u) {
+            __table_.clear();
+            __table_.hash_function() = __u.__table_.hash_function();
+            __table_.key_eq() = __u.__table_.key_eq();
+            __table_.max_load_factor() = __u.__table_.max_load_factor();
+            __table_.__copy_assign_alloc(__u.__table_);
+            insert(__u.begin(), __u.end());
+        }
 #endif
         return *this;
     }
@@ -1567,12 +1569,14 @@
 #if __cplusplus >= 201103L
         __table_ = __u.__table_;
 #else
-        __table_.clear();
-        __table_.hash_function() = __u.__table_.hash_function();
-        __table_.key_eq() = __u.__table_.key_eq();
-        __table_.max_load_factor() = __u.__table_.max_load_factor();
-        __table_.__copy_assign_alloc(__u.__table_);
-        insert(__u.begin(), __u.end());
+        if (this != &__u) {
+            __table_.clear();
+            __table_.hash_function() = __u.__table_.hash_function();
+            __table_.key_eq() = __u.__table_.key_eq();
+            __table_.max_load_factor() = __u.__table_.max_load_factor();
+            __table_.__copy_assign_alloc(__u.__table_);
+            insert(__u.begin(), __u.end());
+        }
 #endif
         return *this;
     }

