[cfe-dev] gpucc breaks cuda 7.0.28/7_CUDALibraries/simpleCUFFT

Jingyue Wu via cfe-dev cfe-dev at lists.llvm.org
Tue Apr 5 14:12:19 PDT 2016


Would you mind uploading your simpleCUFFT.cu code? This looks related to
device code generation, since the build itself was successful.
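
In the meantime, one quick thing you could check (just a sketch, assuming
cuobjdump from the CUDA toolkit is on your PATH and that simpleCUFFT.o is the
object you built): list the device code embedded in the object and verify
that it contains sm_35 SASS, or at least PTX the driver can JIT, since
error 8 "invalid device function" usually means the runtime found no device
code matching the GPU:

  cuobjdump --list-elf simpleCUFFT.o   # cubins embedded in the fat binary
  cuobjdump --list-ptx simpleCUFFT.o   # PTX embedded in the fat binary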

On Tue, Apr 5, 2016 at 1:56 PM, Peter Steinbach <steinbac at mpi-cbg.de> wrote:

> Hi guys,
>
> first of all, please accept my apologies for contacting you by mail. I was
> a bit lost as to which mailing list to choose among those pointed to by
> http://llvm.org/docs/CompileCudaWithLLVM.html
> and the subsequent
> http://llvm.org/docs/#mailing-lists
> Feel free to deflect this request to the relevant mailing list or bug
> tracker.
>
> In any case, I am very interested in using GPUCC instead of NVCC for a
> multitude of reasons (C++1X, compilation speed, ...). I have started to
> "port" my favorite samples from the NVIDIA SDK.
> With clang 3.8, samples-7.0.28/7_CUDALibraries/simpleCUFFT builds fine but
> produces an error at runtime! Here is what I see with a K20c:
>
> $ clang++ --cuda-path=/sw/apps/cuda/7.0.28   -I../../common/inc  -m64
> --cuda-gpu-arch=sm_35 --cuda-gpu-arch=sm_35 -o simpleCUFFT.o -c
> simpleCUFFT.cu
> $ clang++ --cuda-path=/sw/apps/cuda/7.0.28    -L/sw/apps/cuda/7.0.28/lib64
> -lcudart -ldl -lrt -pthread  -m64      -o simpleCUFFT.llvm simpleCUFFT.o
> -lcufft
> $ ./simpleCUFFT.llvm
> [simpleCUFFT] is starting...
> GPU Device 0: "Tesla K20c" with compute capability 3.5
>
> Transforming signal cufftExecC2C
> Launching ComplexPointwiseMulAndScale<<< >>>
> simpleCUFFT.cu(132) : getLastCudaError() CUDA error : Kernel execution
> failed [ ComplexPointwiseMulAndScale ] : (8) invalid device function.
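>
> For reference, the kernel being launched is essentially the following (a
> rough sketch from memory, not the exact SDK source; in the sample, Complex
> is just a typedef for float2):
>
>   typedef float2 Complex;
>
>   static __device__ inline Complex ComplexMul(Complex a, Complex b)
>   {
>       Complex c;
>       c.x = a.x * b.x - a.y * b.y;
>       c.y = a.x * b.y + a.y * b.x;
>       return c;
>   }
>
>   static __device__ inline Complex ComplexScale(Complex a, float s)
>   {
>       Complex c;
>       c.x = s * a.x;
>       c.y = s * a.y;
>       return c;
>   }
>
>   // Grid-stride loop: each thread scales the pointwise product of a and b.
>   static __global__ void ComplexPointwiseMulAndScale(Complex *a,
>                                                      const Complex *b,
>                                                      int size, float scale)
>   {
>       const int numThreads = blockDim.x * gridDim.x;
>       const int threadID = blockIdx.x * blockDim.x + threadIdx.x;
>       for (int i = threadID; i < size; i += numThreads)
>           a[i] = ComplexScale(ComplexMul(a[i], b[i]), scale);
>   }
>
>   // Launch plus error check, roughly as in the sample (names and launch
>   // configuration here are illustrative, not copied from the SDK):
>   //   ComplexPointwiseMulAndScale<<<32, 256>>>(d_signal, d_kernel,
>   //                                            new_size, 1.0f / new_size);
>   //   getLastCudaError("Kernel execution failed [ ComplexPointwiseMulAndScale ]");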
>
> The same source code works just fine with nvcc 7.0.
> Any help would be appreciated.
>
> Best,
> Peter
>
> PS. From various comments, I had the impression that you are looking at the
> SHOC benchmarks with gpucc. If so, please comment on:
> https://github.com/vetter/shoc/issues/48
> I don't want to do work that is either pointless (texture support) or has
> already been done. ;)
> --
> Peter Steinbach, Dr. rer. nat.
> HPC Developer, Scientific Computing Facility
>
> Max Planck Institute of Molecular Cell Biology and Genetics
> Pfotenhauerstr. 108
> 01307 Dresden
> Germany
>
>
> phone +49 351 210 2882
> fax   +49 351 210 1689
> www.mpi-cbg.de
>