[Mesa-dev] radeonsi: LLVM r187139 broke some piglit tests

Michel Dänzer michel at daenzer.net
Thu Aug 15 02:55:36 PDT 2013


On Fri, 2013-08-02 at 17:58 +0200, Michel Dänzer wrote:
> On Wed, 2013-07-31 at 08:42 -0700, Tom Stellard wrote:
> > On Wed, Jul 31, 2013 at 01:04:01PM +0200, Michel Dänzer wrote:
> > > 
> > > LLVM revision 187139 ('Allocate local registers in order for optimal
> > > coloring.') broke some derivative-related piglit tests with the
> > > radeonsi driver.
> > > 
> > > I'm attaching a diff between the bad and good generated code (as printed
> > > with RADEON_DUMP_SHADERS=1) for the glsl-derivs test. The only
> > > difference I can see is in which registers are used in which order.
> > > 
> > > I wonder if we might be missing an S_WAITCNT after DS_READ/WRITE
> > > instructions in some cases, but I haven't spotted any candidates in
> > > the bad code that aren't also present in the good code. Can anyone
> > > else spot something I've missed?
> > 
> > Shouldn't we be using the S_BARRIER instruction to keep the threads in sync?
> 
> Doesn't seem to help, unfortunately, but thanks for the good suggestion.
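
For anyone who wants to reproduce the comparison of good vs. bad code:
something like the following Python sketch works (the exact piglit
invocation and which stream the dump lands on are assumptions on my
part, only the RADEON_DUMP_SHADERS=1 variable is from above):

    import os, subprocess

    def dump_shaders(cmd, out_path):
        # RADEON_DUMP_SHADERS=1 makes radeonsi print the generated code;
        # capture both streams since the dump may go to either one.
        env = dict(os.environ, RADEON_DUMP_SHADERS="1")
        with open(out_path, "w") as f:
            subprocess.call(cmd, env=env, stdout=f, stderr=subprocess.STDOUT)

    # Run once against the good LLVM build and once against the bad one,
    # then diff the two output files.
    dump_shaders(["bin/glsl-derivs", "-auto"], "good.txt")  # assumed invocation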

I found one thing going wrong: DS_WRITE_B32 ends up encoding a VGPR
number instead of the $gds operand in the GDS field (the asm output
from llc even shows the VGPR name). If that VGPR number happens to be
odd (i.e. its least significant bit is set), the shader ends up
writing to GDS instead of LDS.

But I have no idea why this is happening, or how to fix it. :(


-- 
Earthling Michel Dänzer           |                   http://www.amd.com
Libre software enthusiast         |          Debian, X and DRI developer



