[llvm-dev] Performance metrics with LLVM

Matthias Braun via llvm-dev llvm-dev at lists.llvm.org
Wed Jul 5 08:48:27 PDT 2017



> On Jul 4, 2017, at 2:02 AM, Tobias Grosser <tobias.grosser at inf.ethz.ch> wrote:
> 
>> On Tue, Jul 4, 2017, at 09:48 AM, Kristof Beyls wrote:
>> Hi Tobias,
>> 
>> The metrics that you can collect in LNT are fixed per "test suite".
>> There are 2 such "test suite"s defined in LNT at the moment: nts and
>> compile.
>> For more details on this, see
>> http://llvm.org/docs/lnt/concepts.html#test-suites.
>> 
>> AFAIK, if you need to collect different metrics, you'll need to define a
>> new "test suite". I'm afraid I don't really know what is needed for that.
>> I'm guessing you may need to write some LNT code to do so, but I'm not
>> sure. Hopefully Matthias or Chris will be able to explain how to do that.
>> 
>> We should probably investigate how to make it easier to define new
>> "test suite"s, or at least how to record different sets of metrics
>> without having to change the LNT code or a running LNT server instance.
>> The question of recording a different set of metrics has come up on this
>> list before, so it seems to be an issue people run into from time to
>> time.
> 
> Hi Kristof,
> 
> thanks for your fast reply. This is a very helpful summary that confirms
> parts of my current understanding. I have never run the "compile" test
> suite, so I am not sure how much of the statistics interface it uses (if
> at all). I somehow had the feeling something else might exist, as the
> cmake test-suite runner dumps some of the statistics to stdout. I would
> be interested to hear if Chris or Matthias have more insights.

I often run the test-suite without LNT: lit's -o flag dumps the results to a JSON file. If your goal is just some A/B testing (rather than continuous tracking with a CI system), then something simple like test-suite/utils/compare.py is enough to view and compare lit result files; a sketch of that workflow is below.
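
A minimal example of what I mean (paths, file names, and the second configuration are just placeholders):

  # build the test-suite against the compiler under test
  % cmake -DCMAKE_C_COMPILER=/path/to/clang /path/to/test-suite
  % make
  # run the benchmarks; -o writes the collected metrics to a json file
  % llvm-lit -o results_a.json .
  # ... rebuild with the second compiler/configuration and run again ...
  % llvm-lit -o results_b.json .
  # view a single result file, or compare two runs side by side
  % test-suite/utils/compare.py results_a.json
  % test-suite/utils/compare.py results_a.json results_b.json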

For future LNT plans:

You also asked at an interesting moment: I am polishing a commit to LNT right now that makes it easier to modify existing schemas or define new custom ones; a rough sketch of the idea is below. That is only part of the solution though, as even with a new schema the runner needs to be adapted to actually collect and transform all the values.
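
To give a rough idea, such a schema description might end up looking something like the following yaml sketch; the format is not final yet, and the suite and metric names here are purely illustrative:

  # hypothetical custom schema; field names may still change
  name: my_stats_suite
  metrics:
    - name: size
      type: Real
    - name: num_instructions
      type: Real
  run_fields:
    - name: llvm_project_revision
      order: true
  machine_fields:
    - name: hardware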

I also don't think we will start collecting all the LLVM stats by default in the current system; with a few thousand runs in the database it is already slightly sluggish, and adding 10x more metrics would not help there. Of course, once it is easier to modify schemas, you could set up special instances with extended schemas that maybe track fewer runs.

- Matthias

> Best,
> Tobias
> 
>> Thanks,
>> 
>> Kristof
>> 
>> 
>> On 4 Jul 2017, at 08:27, Tobias Grosser
>> <tobias.grosser at inf.ethz.ch<mailto:tobias.grosser at inf.ethz.ch>> wrote:
>> 
>> Dear all,
>> 
>> I wanted to gather LLVM statistics with LNT and found a nice flag,
>> 
>> --cmake-define=TEST_SUITE_COLLECT_STATS=ON
>> 
>> which allows me to gather all the LLVM "-stats" output, but I am unsure
>> how such statistics can be made available in the LNT web interface. On
>> top of this I see that the LNT cmake test-suite also dumps code-size
>> statistics when running, which look as follows:
>> 
>> size: 10848
>> size..bss: 48
>> size..comment: 218
>> size..ctors: 16
>> size..data: 4
>> size..dtors: 16
>> size..dynamic: 416
>> size..dynsym: 168
>> size..eh_frame: 172
>> size..eh_frame_hdr: 44
>> 
>> I can find all these statistics in a file called:
>> 
>> /scratch/leone/grosser/base/sandbox/test-2017-07-04_06-14-43/outputTd2xPU.json
>> 
>> but they do not appear in:
>> 
>> /scratch/leone/grosser/base/sandbox/test-2017-07-04_06-14-43/report.json
>> 
>> and in fact do not seem to be submitted to the LNT server.
>> 
>> Matthias added support for TEST_SUITE_COLLECT_STATS a while ago, but I
>> am unsure how it is expected to be used. A Google search did not find
>> any relevant documentation. Is anybody using this feature today?
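>> 
>> (For reference, my invocation is along these lines; the paths are
>> shortened placeholders:
>> 
>>  lnt runtest test-suite --sandbox /path/to/sandbox \
>>      --cc /path/to/clang --test-suite /path/to/test-suite \
>>      --cmake-define=TEST_SUITE_COLLECT_STATS=ON )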
>> 
>> Best,
>> Tobias
>> 

