<table border="1" cellspacing="0" cellpadding="8">
    <tr>
        <th>Issue</th>
        <td>
            <a href="https://github.com/llvm/llvm-project/issues/86392">86392</a>
        </td>
    </tr>

    <tr>
        <th>Summary</th>
        <td>
            MLIR paths for toy2 are wrong?
        </td>
    </tr>

    <tr>
      <th>Labels</th>
      <td>
            mlir
      </td>
    </tr>

    <tr>
      <th>Assignees</th>
      <td>
      </td>
    </tr>

    <tr>
      <th>Reporter</th>
      <td>
          nyck33
      </td>
    </tr>
</table>

<pre>
    I did 
```
root@75f04c342bbd:~/workspace/llvm-project# build/bin/toyc-ch2 mlir/test/Examples/Toy/Ch2/codegen.toy -emit=mlir -mlir-print-debuginfo
module {
  toy.func @multiply_transpose(%arg0: tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":4:1), %arg1: tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":4:1)) -> tensor<*xf64> {
    %0 = toy.transpose(%arg0 : tensor<*xf64>) to tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":5:10)
    %1 = toy.transpose(%arg1 : tensor<*xf64>) to tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":5:25)
    %2 = toy.mul %0, %1 : tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":5:25)
    toy.return %2 : tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":5:3)
  } loc("mlir/test/Examples/Toy/Ch2/codegen.toy":4:1)
  toy.func @main() {
    %0 = toy.constant dense<[[1.000000e+00, 2.000000e+00, 3.000000e+00], [4.000000e+00, 5.000000e+00, 6.000000e+00]]> : tensor<2x3xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":9:17)
    %1 = toy.reshape(%0 : tensor<2x3xf64>) to tensor<2x3xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":9:3)
    %2 = toy.constant dense<[1.000000e+00, 2.000000e+00, 3.000000e+00, 4.000000e+00, 5.000000e+00, 6.000000e+00]> : tensor<6xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":10:17)
    %3 = toy.reshape(%2 : tensor<6xf64>) to tensor<2x3xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":10:3)
    %4 = toy.generic_call @multiply_transpose(%1, %3) : (tensor<2x3xf64>, tensor<2x3xf64>) -> tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":11:11)
    %5 = toy.generic_call @multiply_transpose(%3, %1) : (tensor<2x3xf64>, tensor<2x3xf64>) -> tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":12:11)
    toy.print %5 : tensor<*xf64> loc("mlir/test/Examples/Toy/Ch2/codegen.toy":13:3)
    toy.return loc("mlir/test/Examples/Toy/Ch2/codegen.toy":8:1)
  } loc("mlir/test/Examples/Toy/Ch2/codegen.toy":8:1)
} loc(unknown)
root@75f04c342bbd:~/workspace/llvm-project# 
``` 
but the documentation at https://mlir.llvm.org/docs/Tutorials/Toy/Ch-2/ gives a different path:

```
Complete Toy Example (https://mlir.llvm.org/docs/Tutorials/Toy/Ch-2/#complete-toy-example)

We can now generate our “Toy IR”. You can build toyc-ch2 and try yourself on the above example: toyc-ch2 test/Examples/Toy/Ch2/codegen.toy -emit=mlir -mlir-print-debuginfo. We can also check our RoundTrip: toyc-ch2 test/Examples/Toy/Ch2/codegen.toy -emit=mlir -mlir-print-debuginfo 2> codegen.mlir followed by toyc-ch2 codegen.mlir -emit=mlir. You should also use mlir-tblgen on the final definition file and study the generated C++ code.

At this point, MLIR knows about our Toy dialect and operations. In the [next chapter](https://mlir.llvm.org/docs/Tutorials/Toy/Ch-3/), we will leverage our new dialect to implement some high-level language-specific analyses and transformations for the Toy language.
```
Either the documentation should prepend `mlir/` to the path to `codegen.toy` (as required when running the command from the project root, as above), or it should state which directory the command is expected to be run from.
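For reference, here is a minimal sketch of the two documented invocations adjusted to run from the llvm-project root, assuming the build tree lives in `build/` as in the session above:

```
# Emit Toy IR with source locations, using the path relative to the repo root.
build/bin/toyc-ch2 mlir/test/Examples/Toy/Ch2/codegen.toy -emit=mlir -mlir-print-debuginfo

# Round-trip check from the docs: capture the IR (printed on stderr) and feed it back in.
build/bin/toyc-ch2 mlir/test/Examples/Toy/Ch2/codegen.toy -emit=mlir -mlir-print-debuginfo 2> codegen.mlir
build/bin/toyc-ch2 codegen.mlir -emit=mlir
```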

</pre>
<img width="1px" height="1px" alt="" src="http://email.email.llvm.org/o/eJzMWE1v4zYT_jX0ZWBDIiXHPvjgjzUQ4H0viwWKnhYUNZLYUKRAUnF06W8vSH_Eie3tdhOjDQLBlMmZ55lnZkiaOydrjbgg-YrkmxHvfWPsQg_iibFRYcph8QilLIEkG5IsyTQ5_MehMZ5kyUNeJZlgGS2KkrDln4Rud8Y-uY4LJHSr1HM77qz5A4UnlEHRS1USui2kJnTrzSDGoqHQKmnDGJ0ndLu3_-WFt51CR-j2mxkI3a4bSuhWmBJr1BNvBhhjKz1hm7AcxuE57qzUflxi0ddSV2ZvqjVlrxDIw2o_BvBmmFS9FkCypO2Vl50avnvLteuMQ0JnhObc1glhS_ConbGErQldvlTTjLAvoIyIk-hb5H-PmVBK2DIjbJkSOid0DXtP6d09zWEc7F3zcRYYCHgSIGwTY3QlJHADafDgzT045IFDEkicg0x_ADL9d0DS_D1IegLZ9iqG9iD5TYR3wBHcW_S91UdId_PMzhyTh82nZe-1suWhh8yCorfzVxjtPNceStQOA93Q6lbpJIl_SOgqiZLQizfs7Zt8E5XLV9nFzPzizfRibb6JhXYeefrCPiny8xCmh9sFYtE1vDuUR3ILxfva-Fx87HZpXBPpFxSia_g1bd4LM_0k2mlyVRd2XRd6HcT9VInwLmTJTuhq1Gil-C64Uj_aJ9NDS2OxFNkSCJ1dTa_1ray7uTF9lGG6f7yjmP9jiuzYtf97FOklxcAsnoOOZO_S7lN2kT5nG80Hjc_eNf5P2Evemnw12OsnbXb69I39pYPtuyPyYVz0HnyDUBrRt6g999JocI3ZOXCmRd9IXUMpqwotag_cQ-N95whbhnMw3Qaqk-BrYmxN6LY0IhLtvbGSq3PSY3o6Ol89r69NCJNH-GYGOAQt7GdkTclqGve32QedUyYOTsbeDGPcO3mNeXz-hiC4Bm12EMuPewTTWyBfKJklZL4O-B6_HoebCfxu-rgkXh7gdGngugRvBxhMbx2qCoyOweaFeUY4Og_Jf1zx0xnzEzeLCRyIcOUMiAbFU6Tx1fS6_GZldzfPQEP1HtfEeZVRyuywhGJ49flmxrnZfURdY3pV7uH3DuMdbOwLVaM-RrKSmisosZJaxsytpMIYd-f7cohzjhqWsCZ0Regq-p2cK74MRSAddEZqH3rk___3-BVC1bkgVu9j4ILspeQKhY8uTBfsSqPdBB73eEi-0vjiQTS882g_mrMsLomXsB3CTioFCp_R8nqfkRp3J0TegAzShSqOpQuNrJtxmK9AcV33vMax61DISgrgmqvBoTskKdeuMrbd04HK2EgnMD4unVyt2aATSt-ghYY_Y0DRWexQl0CmyaEPhnYTTggNQsd9c_xMpsl5D5wmICvAFxS9D00nTBGmbQPAypp2v37f0CD0QDD2rEehcjiBQ1sblQtWztmcj3CRPqRpOp-nD3TULETBaCFEmXCazmiRsaQoKCYVzeYFpaIayQVNaJYwylKWTFM6waIq0iLN8nKW5kWSkizBlkt10nAknetxMZuyOR0pXqBy8feK4zZASb4Z2UXsx0VfO5IlSjrvXg146RUuYs6F-BzibwYK3CLsrNE1YdtRb9XibS7V0jd9MRGmPTT8i76_jeBCUkV8fwUAAP__txcs6A">