<div dir="ltr">Hi all,<div><br></div><div style>I have some ideas for GSOC projects using LLVM, where should I post them?</div><div style><br></div><div style>Idea #1: llvm-env</div><div style><br></div><div style>A simple tool, with little or no dependency on LLVM itself, that will investigate a target architecture by probing hardware, software, libraries and compiling and executing code to identify all properties that would be relevant to command-line options, triple settings etc.</div>

The first stage is to build a set of CFLAGS for Clang that would produce code on the current Host for the identified Target.

The second stage would be to produce a configuration file (usable independently of the Host) that Clang can read, so it doesn't need a gazillion command-line options. Such a file should be simple JSON / INI, or anything that can be edited in Vim.
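
To make the second stage concrete, here is a minimal sketch (in Python) of how such a file could be turned into a CFLAGS string. The JSON schema below (triple, cpu, float_abi, fpu, extra_flags) is just an assumption for illustration, not an existing format; the Clang flags it emits (-target, -mcpu, -mfloat-abi, -mfpu) do exist.

  # Hypothetical sketch: turn a probed target description (JSON) into
  # Clang flags. The schema is an assumption, not an existing format.
  import json
  import sys

  def config_to_cflags(path):
      with open(path) as f:
          cfg = json.load(f)
      flags = []
      if "triple" in cfg:
          flags.append("-target " + cfg["triple"])
      if "cpu" in cfg:
          flags.append("-mcpu=" + cfg["cpu"])
      if "float_abi" in cfg:
          flags.append("-mfloat-abi=" + cfg["float_abi"])
      if "fpu" in cfg:
          flags.append("-mfpu=" + cfg["fpu"])
      # Any extra probed options are passed through verbatim.
      flags.extend(cfg.get("extra_flags", []))
      return " ".join(flags)

  if __name__ == "__main__":
      # e.g. {"triple": "armv7a-linux-gnueabihf", "cpu": "cortex-a9",
      #       "float_abi": "hard", "fpu": "neon"}
      print("CFLAGS=" + config_to_cflags(sys.argv[1]))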

Idea #2: LNT Perf Monitor

The LNT perf database has some nice features, like detecting moving averages, standard deviations, variations, etc. But the report page puts too much emphasis on individual variations (where noise can be higher than signal).

The first part of the project would be to create an analysis tool that tracks moving averages and reports (see the sketch after this list):
 * If the current result is higher/lower than the previous moving average by more than S (configurable) standard deviations
 * If the current moving average is more than S standard deviations away from the Base run
 * If the last A moving averages show a constant increase/decrease of more than P percent
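
As a rough illustration of those three checks, a minimal sketch in Python; the plain-list input and the parameter names (window, S, A, P) are assumptions, not tied to the actual LNT schema:

  # Sketch of the three checks above, on a plain list of per-run results
  # (newest last). Assumes enough samples for the window; not tied to LNT.
  import statistics

  def moving_averages(samples, window):
      return [statistics.mean(samples[i - window:i])
              for i in range(window, len(samples) + 1)]

  def analyse(samples, base, window=5, S=2.0, A=3, P=1.0):
      mavgs = moving_averages(samples, window)
      stdev = statistics.stdev(samples)
      reports = []
      # 1. Current result vs. previous moving average.
      if abs(samples[-1] - mavgs[-2]) > S * stdev:
          reports.append("result is > %g stdevs from the moving average" % S)
      # 2. Current moving average vs. the Base run.
      if abs(mavgs[-1] - base) > S * stdev:
          reports.append("moving average is > %g stdevs from the base run" % S)
      # 3. Last A moving averages in constant increase/decrease of > P percent.
      last = mavgs[-A:]
      steps = [(b - a) / a * 100.0 for a, b in zip(last, last[1:])]
      if steps and (all(s > P for s in steps) or all(s < -P for s in steps)):
          reports.append("last %d moving averages drift by > %g%%" % (A, P))
      return reports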

The second part would be to create a web page (possibly configurable, like a dashboard) that shows all related benchmarks with their basic statistics, uses red/yellow/green colour codes to indicate status, and links to a more detailed analysis of each benchmark.

A possible third part would be to automatically cross-reference different builds, so that if you group them by architecture/compiler/number of CPUs, the tool would notice when changes are more common to one particular group.
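
A tiny sketch of what that could look like, assuming each regression has already been tagged with its machine's attributes (the field names and values below are made up):

  # Group regressions by machine attributes to spot a common pattern.
  # The field names are hypothetical; the real tool would query LNT.
  from collections import Counter

  def common_groups(regressions, keys=("arch", "compiler", "num_cpus")):
      counts = Counter(tuple(r[k] for k in keys) for r in regressions)
      # Groups with the most regressions are the likely common factor.
      return counts.most_common()

  regressions = [
      {"arch": "armv7", "compiler": "clang", "num_cpus": 4, "test": "foo"},
      {"arch": "armv7", "compiler": "clang", "num_cpus": 4, "test": "bar"},
      {"arch": "x86_64", "compiler": "clang", "num_cpus": 8, "test": "foo"},
  ]
  print(common_groups(regressions))
  # -> [(('armv7', 'clang', 4), 2), (('x86_64', 'clang', 8), 1)]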

cheers,
--renato