I'm starting to look into binary instruction encodings in TableGen, and I'm a bit confused about how the instruction fields are populated. Perhaps I'm just being dense, but I cannot see how SDAG operands are translated into the encoding fields. Can someone please explain the following snippet from the PPC back-end?
The AND instruction in PPC is defined as:

def AND : XForm_6<31, 28, (outs GPRC:$rA), (ins GPRC:$rS, GPRC:$rB),
                  "and $rA, $rS, $rB", IntSimple,
                  [(set GPRC:$rA, (and GPRC:$rS, GPRC:$rB))]>;

Okay, so rA, rS, and rB are register operands.

The TableGen classes are defined as:
class XForm_base_r3xo_swapped
        <bits<6> opcode, bits<10> xo, dag OOL, dag IOL, string asmstr,
         InstrItinClass itin>
        : I<opcode, OOL, IOL, asmstr, itin> {
  bits<5> A;
  bits<5> RST;
  bits<5> B;

  bit RC = 0;    // set by isDOT

  let Inst{6-10}  = RST;
  let Inst{11-15} = A;
  let Inst{16-20} = B;
  let Inst{21-30} = xo;
  let Inst{31}    = RC;
}

class XForm_6<bits<6> opcode, bits<10> xo, dag OOL, dag IOL, string asmstr,
              InstrItinClass itin, list<dag> pattern>
  : XForm_base_r3xo_swapped<opcode, xo, OOL, IOL, asmstr, itin> {
  let Pattern = pattern;
}

Okay, so A, RST, and B are the operand fields in the instruction encoding (I assume). But where are A, RST, and B given values? When the instruction is encoded (and the physical registers are known), where do these values come from? A grep for RST doesn't turn up anything useful. Is there C++ code somewhere that scans the operands of all instructions and performs the actual encoding?
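To make the question concrete, here is a tiny standalone sketch of my mental model of what the encoder must ultimately do for AND. This is not LLVM code: the helper name and register numbers are made up, and the shifts are just the Inst{} ranges above translated from PPC's MSB-0 bit numbering into a conventional LSB-0 32-bit word. Is something like this what the TableGen-generated code ends up doing, and if so, where does that code live?

#include <cstdint>
#include <cstdio>

// Hypothetical sketch of packing the XForm_6 fields for AND (opcode 31, xo 28).
// A field at Inst{a-b} (MSB-0 numbering) lands at shift (31 - b) in an LSB-0 word.
static uint32_t encodeAND(unsigned rA, unsigned rS, unsigned rB, bool rc) {
  uint32_t Inst = 0;
  Inst |= (31u & 0x3F)  << 26;   // Inst{0-5}   = opcode (from class I)
  Inst |= (rS  & 0x1F)  << 21;   // Inst{6-10}  = RST, presumably from $rS
  Inst |= (rA  & 0x1F)  << 16;   // Inst{11-15} = A,   presumably from $rA
  Inst |= (rB  & 0x1F)  << 11;   // Inst{16-20} = B,   presumably from $rB
  Inst |= (28u & 0x3FF) << 1;    // Inst{21-30} = xo
  Inst |= rc ? 1u : 0u;          // Inst{31}    = RC
  return Inst;
}

int main() {
  // "and r3, r4, r5" -- register numbers chosen arbitrarily for illustration.
  std::printf("0x%08x\n", encodeAND(/*rA=*/3, /*rS=*/4, /*rB=*/5, /*rc=*/false));
  return 0;
}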
--
Thanks,

Justin Holewinski