[llvm-dev] COAT: an EDSL making just-in-time code generation easier

Frank Tetzel via llvm-dev llvm-dev at lists.llvm.org
Fri Sep 20 01:03:03 PDT 2019


Hi Stefan,

> Cool project. Thanks for sharing! It reminds me of
> https://github.com/BitFunnel/NativeJIT

Looks interesting and indeed a bit similar. Thanks for the link.


> DSLs appear to be the way to go for now and it totally makes sense to
> me. My concern is the "cardinality vs. barrier of entry" balance: the
> more powerful a DSL, the harder it is to learn vs. the more
> specialized the DSL, the less use-cases it has. Maybe a narrow domain
> is the key? :)

My main concern is readability. If you mix C++ with EDSL code, it is
sometimes hard to tell which variable is which: is it part of the
function I want to generate, or part of some pre-calculation? I
sometimes fell back to naming conventions, which is less than ideal.
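
A contrived sketch of what I mean (made-up names, not COAT's actual API):

    // toy stand-in for an EDSL value type; a real EDSL would emit IR/asm here
    struct JitInt {
        JitInt operator*(int) const { return {}; }
        JitInt operator+(const JitInt &) const { return {}; }
    };

    void generate(JitInt arg) {
        int stride = 4 * 8;            // host C++: evaluated while generating code
        JitInt offset = arg * stride;  // EDSL: becomes an instruction in the output
        JitInt next = offset + arg;    // reads like ordinary C++; only the type
                                       // (or a naming convention) tells you it lives
                                       // in the generated function
        (void) next;
    }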

Otherwise, I fully agree. It feels much nicer to nest expressions with
normal operators than to write just a sequence of API calls.
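
Roughly the difference between the two styles (IRBuilder calls written
from memory, so treat it as a sketch):

    #include <llvm/IR/IRBuilder.h>

    // sequence of API calls with llvm::IRBuilder: every temporary is named
    llvm::Value *emit(llvm::IRBuilder<> &builder,
                      llvm::Value *a, llvm::Value *b, llvm::Value *c) {
        llvm::Value *t0 = builder.CreateMul(a, b);
        llvm::Value *t1 = builder.CreateAdd(t0, c);
        return builder.CreateShl(t1, builder.getInt32(2));
    }

    // the same computation with overloaded operators in an EDSL would read as
    //   auto result = (a * b + c) << 2;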


> The alternative approach is embedded bitcode from static C++ like:
> https://github.com/jmmartinez/easy-just-in-time It appears very
> powerful, but it's also much more complicated. Also, it can't deal
> with arbitrary C++ either, so I wonder how much benefit it has in
> practice.

Right, how could I forget about Easy::jit? It's a nice approach with a
simple interface. ClangJIT is arguably even a bit more elegant: by
deferring template instantiation to runtime, it achieves better
language integration.
https://arxiv.org/abs/1904.08555
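
If I remember the paper correctly, usage looks roughly like this (a
sketch, details may be off):

    #include <cstdio>

    // [[clang::jit]] marks a function template whose instantiation is deferred
    // to runtime, so template arguments can be runtime values
    template <int N>
    [[clang::jit]] int scale(int x) {
        return N * x;
    }

    int main(int argc, char **) {
        // N = argc is only known at runtime; ClangJIT instantiates and
        // compiles scale<argc> on first use
        std::printf("%d\n", scale<argc>(10));
        return 0;
    }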

Both approaches suffer from "large" compilation latency. For example,
modern databases generate LLVM IR themselves to avoid the overhead of
Clang, and they still face latency problems for short-running queries.
COAT just tries to make the IR generation easier. With the second
backend, AsmJit, compilation latency is very low.
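
Very roughly, the idea of selecting the backend (invented names, not
COAT's real API, just to show the pattern):

    #include <cstdio>

    struct LLVMBackend   { static void emitAdd() { std::puts("llvm: add");   } };
    struct AsmJitBackend { static void emitAdd() { std::puts("asmjit: add"); } };

    // the generator is written once against a backend interface
    template <class Backend>
    void generate_sum() {
        Backend::emitAdd();   // a real EDSL would emit IR or machine code here
    }

    int main() {
        generate_sum<LLVMBackend>();    // optimizing, but higher compile latency
        generate_sum<AsmJitBackend>();  // very low compilation latency
        return 0;
    }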

Best regards,
Frank

