<table border="1" cellspacing="0" cellpadding="8">
    <tr>
        <th>Issue</th>
        <td>
            <a href="https://github.com/llvm/llvm-project/issues/142450">142450</a>
        </td>
    </tr>

    <tr>
        <th>Summary</th>
        <td>
            Building LLVM offloading needs (and can't find) ROCm
        </td>
    </tr>

    <tr>
      <th>Labels</th>
      <td>
            new issue
      </td>
    </tr>

    <tr>
      <th>Assignees</th>
      <td>
      </td>
    </tr>

    <tr>
      <th>Reporter</th>
      <td>
          FlorianB-DE
      </td>
    </tr>
</table>

<pre>
    Hello everyone,
I'm trying to get LLVM OpenMP offloading to work for NVIDIA GPUs in a container.

This is the Dockerfile:
```Dockerfile
FROM nvidia/cuda:12.6.3-devel-ubuntu24.04

WORKDIR /root

# for ROCm (didn't work either)
# RUN apt update && apt install -y wget
# RUN wget https://repo.radeon.com/amdgpu-install/6.4.1/ubuntu/noble/amdgpu-install_6.4.60401-1_all.deb
# RUN apt install -y ./amdgpu-install_6.4.60401-1_all.deb
# RUN apt update && apt install -y python3-setuptools python3-wheel && \
#     rm -rf /var/lib/apt/lists/*
# RUN usermod -a -G render,video
# RUN apt update && apt install -y rocm libdrm-dev && \
#     rm -rf /var/lib/apt/lists/*

# Install llvm build dependencies.
RUN apt update && \
     apt install -y --no-install-recommends ca-certificates cmake 2to3 python-is-python3 \
 subversion ninja-build python3-yaml git && \
     rm -rf /var/lib/apt/lists/*

ADD https://github.com/llvm/llvm-project.git /llvm-project

RUN mkdir /llvm-project/build

WORKDIR /llvm-project/build

RUN cmake ../llvm -G Ninja \
    -C ../offload/cmake/caches/Offload.cmake \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_INSTALL_PREFIX=/usr/local/llvm \
    -DLLVM_TARGETS_TO_BUILD="X86;NVPTX"

RUN ninja install
```
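
Once a build like this succeeds, a minimal smoke test can confirm that offloading actually works. This is only a sketch: `sm_86` is an assumed GPU architecture, so substitute the value reported by `nvptx-arch` (shipped with the LLVM offload tools) or matching your card.

```shell
# Sketch of a smoke test: compile a tiny OpenMP target region and run it.
# sm_86 below is an assumption; replace it with your GPU's compute capability.
cat > t.c <<'EOF'
#include <omp.h>
int main(void) {
  int on_device = 0;
  #pragma omp target map(tofrom: on_device)
  on_device = !omp_is_initial_device();
  return on_device ? 0 : 1;  /* exit 0 only if the region ran on the GPU */
}
EOF
/usr/local/llvm/bin/clang -fopenmp --offload-arch=sm_86 t.c -o t && ./t
```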
For those following [gpu-offloading-docker-image](https://discourse.llvm.org/t/gpu-offloading-docker-image/86656), this seems to be unique to Docker.
The provided Docker image in llvm/utils/docker doesn't work either (it's probably based on an old LLVM version), as also discussed in that thread.
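
One workaround I could imagine (untested, so treat the flags as an assumption to verify against the LLVM CMake docs): skip the `Offload.cmake` cache file and enable the runtimes explicitly, so that only the NVPTX side is configured and CMake never probes for ROCm:

```shell
# Sketch only: configure clang plus the openmp/offload runtimes directly,
# without the Offload.cmake cache, restricting targets to X86 and NVPTX.
cmake ../llvm -G Ninja \
    -DCMAKE_BUILD_TYPE=Release \
    -DCMAKE_INSTALL_PREFIX=/usr/local/llvm \
    -DLLVM_ENABLE_PROJECTS="clang" \
    -DLLVM_ENABLE_RUNTIMES="openmp;offload" \
    -DLLVM_TARGETS_TO_BUILD="X86;NVPTX"
```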
</pre>