[llvm-dev] ORC v2 question
Dibyendu Majumdar via llvm-dev
llvm-dev at lists.llvm.org
Wed Aug 14 15:22:56 PDT 2019
Hi Lang,
On Wed, 14 Aug 2019 at 21:53, Dibyendu Majumdar <mobile at majumdar.org.uk> wrote:
> > > CodeGen optimization seems a more likely culprit: JITTargetMachineBuilder and ExecutionEngineBuilder have different defaults for their CodeGen opt-level. JITTargetMachineBuilder defaults to CodeGenOpt::None, and ExecutionEngineBuilder defaults to CodeGenOpt::Default.
> > >
> > > What happens if you make the following modification to your setup?
> > >
> > > auto JTMB = llvm::orc::JITTargetMachineBuilder::detectHost();
> > > JTMB->setCodeGenOptLevel(CodeGenOpt::Default); // <-- Explicitly set Codegen opt level
> > > auto dataLayout = JTMB->getDefaultDataLayoutForTarget();
> >
> > I am not sure what to make of that. What happens if you print TM->getOptLevel() right before running CodeGen? Once you have explicitly set it, I would expect it to be the same for ORC v1 and ORC v2. If they're not, then it's a plumbing issue.
> >
>
> I explicitly set TM->Options anyway, so maybe I don't need this?
>
I think I finally got ORC v2 working, and I did have to set
JTMB->setCodeGenOptLevel(CodeGenOpt::Default);
explicitly. I don't quite understand how this interacts with everything else, though.
Regards
Dibyendu