[LLVMdev] [PATCH / PROPOSAL] bitcode encoding that is ~15% smaller for large bitcode files...

Sean Silva silvas at purdue.edu
Wed Sep 26 12:44:14 PDT 2012

If you look at his spreadsheet [1] in the OP, he did test this case.
He even has a column showing the size improvement for uncompressed
vs. compressed files. Even after compression with gzip, there is
still a noticeable difference.

--Sean Silva

[1] https://docs.google.com/spreadsheet/ccc?key=0AjRrJHQc4_bddEtJdjdIek5fMDdIdFFIZldZXzdWa0E

On Wed, Sep 26, 2012 at 4:47 AM, David Chisnall
<David.Chisnall at cl.cam.ac.uk> wrote:
> On 26 Sep 2012, at 01:08, Jan Voung wrote:
>> I've been looking into how to make LLVM bitcode files smaller.  There is one simple change that appears to shrink linked bitcode files by about 15%.
> Whenever anyone proposes a custom compression scheme for a data format, the first question that should always be asked is how it compares to a generic off-the-shelf compression algorithm.
>
> For comparison, if we just use the DEFLATE algorithm (via gzip) on bitcode files, we see a saving of about 35% in a quick-and-dirty test.  If you apply your compression scheme (effectively a form of LZW, but applied only to a subset of the values), does it reduce the efficiency of a general-purpose solution?  Does it make more sense than just applying DEFLATE to the bitcode when it is written to disk?
>
> The other advantage of separating compression from encoding, of course, is that it is easier to parallelise, since a fairly coarse-grained dataflow model can be used when streaming to and from compressed bitcode.
> David
> _______________________________________________
> LLVM Developers mailing list
> LLVMdev at cs.uiuc.edu         http://llvm.cs.uiuc.edu
> http://lists.cs.uiuc.edu/mailman/listinfo/llvmdev
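David's quick-and-dirty baseline can be reproduced with a few lines of Python: compress a bitcode file with gzip (DEFLATE) and compare sizes. This is a sketch of the measurement, not part of the proposed patch; the file name `module.bc` is a hypothetical example.

```python
import gzip


def compression_saving(path):
    """Compress a file with gzip (DEFLATE) and report the fractional saving.

    Returns (raw_size, compressed_size, saving), where saving is the
    fraction of bytes removed by compression (e.g. 0.35 for ~35%).
    """
    with open(path, "rb") as f:
        raw = f.read()
    compressed = gzip.compress(raw)
    saving = 1.0 - len(compressed) / len(raw)
    return len(raw), len(compressed), saving


# Hypothetical usage on a linked bitcode file:
# raw, comp, saving = compression_saving("module.bc")
# print(f"{raw} -> {comp} bytes ({saving:.0%} smaller)")
```

Running this over the same set of linked bitcode files used in the spreadsheet would show directly whether the custom encoding's 15% gain survives (or is subsumed by) a generic compressor applied afterwards.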
