<table border="1" cellspacing="0" cellpadding="8">
<tr>
<th>Issue</th>
<td>
<a href=https://github.com/llvm/llvm-project/issues/127273>127273</a>
</td>
</tr>
<tr>
<th>Summary</th>
<td>
[CI] Metrics upload crashing occasionally
</td>
</tr>
<tr>
<th>Labels</th>
<td>
new issue
</td>
</tr>
<tr>
<th>Assignees</th>
<td>
boomanaiden154
</td>
</tr>
<tr>
<th>Reporter</th>
<td>
boomanaiden154
</td>
</tr>
</table>
<pre>
The metrics job seems to be crashing occasionally (about five times since it was deployed roughly a week ago). The traceback from the latest crash:
```
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/urllib3/response.py", line 748, in _error_catcher
yield
File "/usr/local/lib/python3.12/site-packages/urllib3/response.py", line 1209, in read_chunked
chunk = self._handle_chunk(amt)
^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/urllib3/response.py", line 1146, in _handle_chunk
value = self._fp._safe_read(amt) # type: ignore[union-attr]
^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/http/client.py", line 642, in _safe_read
raise IncompleteRead(data, amt-len(data))
http.client.IncompleteRead: IncompleteRead(3634 bytes read, 6606 more expected)
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/requests/models.py", line 820, in generate
yield from self.raw.stream(chunk_size, decode_content=True)
File "/usr/local/lib/python3.12/site-packages/urllib3/response.py", line 1057, in stream
yield from self.read_chunked(amt, decode_content=decode_content)
File "/usr/local/lib/python3.12/site-packages/urllib3/response.py", line 1189, in read_chunked
with self._error_catcher():
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
self.gen.throw(value)
File "/usr/local/lib/python3.12/site-packages/urllib3/response.py", line 775, in _error_catcher
raise ProtocolError(f"Connection broken: {e!r}", e) from e
urllib3.exceptions.ProtocolError: ('Connection broken: IncompleteRead(3634 bytes read, 6606 more expected)', IncompleteRead(3634 bytes read, 6606 more expected))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "//metrics.py", line 184, in <module>
main()
File "//metrics.py", line 169, in main
current_metrics = get_metrics(github_repo, workflows_to_track)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "//metrics.py", line 53, in get_metrics
workflow_run = next(workflow_runs)
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/PaginatedList.py", line 84, in __iter__
newElements = self._grow()
^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/PaginatedList.py", line 95, in _grow
newElements = self._fetchNextPage()
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/PaginatedList.py", line 320, in _fetchNextPage
headers, data = self.__requester.requestJsonAndCheck(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/Requester.py", line 586, in requestJsonAndCheck
return self.__check(*self.requestJson(verb, url, parameters, headers, input, self.__customConnection(url)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/Requester.py", line 856, in requestJson
return self.__requestEncode(cnx, verb, url, parameters, headers, input, encode)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/Requester.py", line 977, in __requestEncode
status, responseHeaders, output = self.__requestRaw(cnx, verb, url, requestHeaders, encoded_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/Requester.py", line 1011, in __requestRaw
response = cnx.getresponse()
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/github/Requester.py", line 202, in getresponse
r = verb(
^^^^^
File "/usr/local/lib/python3.12/site-packages/requests/sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/requests/sessions.py", line 746, in send
r.content
File "/usr/local/lib/python3.12/site-packages/requests/models.py", line 902, in content
self._content = b"".join(self.iter_content(CONTENT_CHUNK_SIZE)) or b""
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/requests/models.py", line 822, in generate
raise ChunkedEncodingError(e)
requests.exceptions.ChunkedEncodingError: ('Connection broken: IncompleteRead(3634 bytes read, 6606 more expected)', IncompleteRead(3634 bytes read, 6606 more expected))
```
Marking this as low priority since k8s automatically restarts the pod when it crashes, so the impact is barely noticeable.
XRef the internal bug: b/381693531
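The failure is a transient `requests.exceptions.ChunkedEncodingError` surfacing from inside PyGithub's pagination, so one possible mitigation (a hypothetical sketch, not the current metrics.py code) is to wrap the fetch in a small retry helper; the `with_retries` name and its parameters are my own invention for illustration:
```python
import time


def with_retries(fn, retryable, max_attempts=3, delay=5):
    """Call fn(), retrying when it raises one of the given
    transient exception types; re-raise on the final attempt."""
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise
            time.sleep(delay)


# Hypothetical use in main(): retry the whole paginated fetch, since
# resuming a half-consumed PaginatedList after a broken connection
# may not be safe.
#
# current_metrics = with_retries(
#     lambda: get_metrics(github_repo, workflows_to_track),
#     (requests.exceptions.ChunkedEncodingError,),
# )
```
Retrying the entire `get_metrics` call rather than a single `next(workflow_runs)` avoids depending on the iterator's state after a failed page fetch; alternatively, PyGithub and urllib3 both offer built-in retry configuration that could handle this at the transport layer.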
</pre>
<img width="1" height="1" alt="" src="http://email.email.llvm.org/o/eJzcWV1v47rR_jXMDRFBpixLvvCF10nes2_b7WKbAkVvBIoaS1xTpEpScdxfX5CiZMfHuz3dxDmLYxiwLXOGfObjmdGIGsNrCbBC6QdESKlUSyXlFchZOkeEoPTuhva2UXr18r-bUlWH1WMDuAWrOTP4qyqxAWgNtgqXgJmmpuGyxooxariSVIgDRiSnpeot3vInwJa3YLDhkgHmFu-pwRV0Qh2gwsMyivcAO6w0NgrTWiGyjLDb1mrKoKRsh7datdg2gAW1YOywMUrWKF6jRRze8fpxEkAkb5WxWAMDaTGjQmBBjUVkOYhh_MAFYIefPPRGI_IgFKPCffISkYfuYBslk2jmFhhu4bajbEdrME5AC8HLBJEHDaZT0kDUHbyuDRZcAs7mufvOJS5Aa6ULRi1rQPudMT5wENX1TzEj8TIcQwOtCtb0cgdVOIT_hVFyhw2IbVQ0VFYChkXOh623VrzGKL1_g_f10c7mi9HoL7AMcJ-o6OEE7raLCkO3UDjTTHgxRiTB9tABStaY11JpQOmHXnIlb6m1GqV3QeH4ehvz_ICFGms7RB6Y4CDtmTEWczLa4ggyXmNNuQH8UTLVdgIsfBnAV9RSt5629laAnK4shxBwO0VhnzPZZP1rbckimePyYMHgwbgbvFjEC9wqDRieO2AWqkGzy9oGHBM8ub8YdJYr6WnC5XvFNTCXv70BrLb-2lYJofaOdqb1gQrekQE0_KsHY93XVlUgzJn9cxIH-9cgQVMLp6k_EJqPQ033kbEaaItI7uO1MPzf4IQrYKqCgilpQVqU3D3qHkJOXjeT4jQLpw9H-8bZT0klZNCFY59dGABcmwvy7zDfntsmsMBLdib5MTzeO5u9eZ6t4OU5lnSqJQU8c1sUAYZHUIOMbKPVHpHcU9y72DfL0u_Vt4FlPmtlFVPi3q1AJN8iQjZKSmA-x0utduBSF6PsAyAy0yi7C5s4FEOgubwJx4mmfDfRS91Oh_NddlH9j_MT8WnwCvlAcXe9dnzlq5LvlwYmO2M9T8BS2Qb0CRUqxnqtoXo1xzmqGhq58wDL58GZKNm0quoFoOQ-uLKlXIa8-M36FmPueeHQbTgQ0hZjL-kKcQ3Tb0TymtumLwsNnXLie6V3W6H2prCqcK3g7k3bkbfN7m_aIk2mMnDEGlgoACx0L705JDxbRPLT6-bVmF9LBYNXEHn4TGsuqYXqz9ycdxtTABUFt6AnhpKwvxfQgrTmpPOqB7YaQ-r89VthXR3VcuI4f-LvQdqCZc0neLafaQ1HbO9dQ34IZjK1KmcwBsAN0Aq08ZWdWnoCughNEOgofPt_o-RaVpsGXLbmv0e2vpnJvkzgzjI6X0ydxa9Bh_oHttdyNBMbzUHWoW2a5FzRBl06hb0W7qOjmrZgg8VPjM9l1_v-atTaG6vaY8FDJPcqllPh-TmY8nd8vxVJfCsU8vRCKFwMgfD_vXSdsGv05bOT_J99D0HBH9u91_bbMsumevXSM6GxttT23vBj-_vL0Reqt11vL9DgF7r_pmfDkhM1gyerIrj2j-HQq5PvLJ7Nzl3n7D7m3OAt7xwmn6Ma7HjtdUX56sBITI6N4nTmAMvjGSIqP2mZ3vqQJzMNA8b4-6yzqdLpKS8QXdDgbU3-7_4xyIYkQGSNyHq3p7p-fV_7ru7775ZJT-YNgw2OIXnkCgPS3T92GrqjQdzF4r2scn1LZNMU1qMNZojG6c-1B2_LKURPdwzDknEG5T1SejESfVX-Dtcv8Hcv06Aq3_z10-P9p8di88vfP_2p-NvHf94PzRVWehT_-Xj7-qNNcnm0OUx9NsOozRdULutx-BOallH56SznosTPPdI5eeiE4vXn3loua2wbbjA1WKg97jRXmtuD-73LDd5zITDtrWqp5cw_KNNgLNXWj4I6VWG-xTw82wLfIxg1qOQG
l1SDOGCpLGdASwHRsPU_vsAwS-LSgpZU4LKvnYWcm5N8tlgmaTK7qVZJtUyW9AZWsyxZplm6iJc3zYqmKc1mbJFWcZpUwBZZDNuUzeY5SVnG0hu-IjFJYzKbk4Qs5stoW8bbMpuzZQwsSWcpmsfQUi4iIZ7aSOn6hhvTw2pGMpIlN4KWIEx49ihhj_2_4bGjXjmh27KvDZrHgru4mNRYboV_aLn5iNI7_JcwN-o7oWh1-dHjTa_FqrG2MyhZD1OZofBGTLUu-MXT-HHbafUVmEXkwZ_IRXs48tOK_CcAAP__dfWSVA">