[clang] [llvm] [DirectX] Set Shader Flag DisableOptimizations (PR #126813)

Farzon Lotfi via cfe-commits cfe-commits at lists.llvm.org
Wed Feb 12 07:31:08 PST 2025


================
@@ -212,20 +230,24 @@ PreservedAnalyses ShaderFlagsAnalysisPrinter::run(Module &M,
 bool ShaderFlagsAnalysisWrapper::runOnModule(Module &M) {
   DXILResourceTypeMap &DRTM =
       getAnalysis<DXILResourceTypeWrapperPass>().getResourceTypeMap();
+  const ModuleMetadataInfo MMDI =
+      getAnalysis<DXILMetadataAnalysisWrapperPass>().getModuleMetadata();
 
-  MSFI.initialize(M, DRTM);
+  MSFI.initialize(M, DRTM, MMDI);
   return false;
 }
 
 void ShaderFlagsAnalysisWrapper::getAnalysisUsage(AnalysisUsage &AU) const {
   AU.setPreservesAll();
   AU.addRequiredTransitive<DXILResourceTypeWrapperPass>();
+  AU.addRequired<DXILMetadataAnalysisWrapperPass>();
----------------
farzonl wrote:

This change causes the DXIL Module Metadata analysis pass to run before the DXIL Shader Flag Analysis pass, because initializing ModuleShaderFlags now needs `MMDI.EntryPropertyVec[0].Entry->hasFnAttribute(llvm::Attribute::OptimizeNone)` to determine whether optimizations are disabled for the entry function.
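
For context, a minimal sketch of how that check might feed the `DisableOptimizations` flag inside `ModuleShaderFlags::initialize` (the mask field name `CombinedSFMask.DisableOptimizations` and the guard on `EntryPropertyVec` are assumptions based on the existing shader-flags code, not necessarily the exact PR implementation):

```cpp
// Sketch only: assumes MMDI was produced by the DXIL metadata analysis and
// that the first entry in EntryPropertyVec is the entry function of interest.
if (!MMDI.EntryPropertyVec.empty()) {
  const llvm::Function *EntryFn = MMDI.EntryPropertyVec[0].Entry;
  // If the entry point was compiled with optimizations disabled (e.g. optnone
  // from -O0), record that in the module-level shader flags mask.
  if (EntryFn && EntryFn->hasFnAttribute(llvm::Attribute::OptimizeNone))
    CombinedSFMask.DisableOptimizations = true;
}
```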

Does that mean we were previously planning to fetch function attributes a different way, and it just turned out that `hasFnAttribute` via ModuleMetadataInfo was easier?

https://github.com/llvm/llvm-project/pull/126813
