[LLVMdev] Standard output binary mode on Windows

Julien Lerouge jlerouge at apple.com
Wed Jun 4 11:54:01 PDT 2008


Hello,

On Windows, the standard output is not set to binary mode by default, so
every '\n' written to it is translated to '\r\n'. This is a pain for any
command that writes binary data to stdout, like "llvm-as < input.ll > out.bc",
because out.bc is then most likely corrupted.
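
For reference, this is roughly what a ChangeStdoutToBinary-style helper
boils down to on Windows (only a sketch of the usual CRT call, not the
actual lib/System/Win32 code):

  #include <io.h>       // _setmode, _fileno
  #include <fcntl.h>    // _O_BINARY
  #include <cstdio>     // stdout

  // Switch stdout from text mode (which translates '\n' to '\r\n') to
  // binary mode. _setmode returns the previous mode, or -1 on failure.
  static bool SetStdoutToBinaryMode() {
    return _setmode(_fileno(stdout), _O_BINARY) != -1;
  }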

This is an old story, fixed a while ago:
http://llvm.org/bugs/show_bug.cgi?id=787

And here is the thread on LLVMDev:
http://thread.gmane.org/gmane.comp.compilers.llvm.devel/4320/focus=4322

Unfortunately, this bug is back. ChangeStdoutToBinary is never called in
the current LLVM tree. I guess the call was lost during the
Bytecode->Bitcode transition.

The attached patches fix that by calling ChangeStdoutToBinary in
WriteBitcodeToFile if the output stream is llvm::cout. This works for
all the LLVM tools, because they use llvm::cout, but it doesn't work for
llvm-gcc, for example, because it creates its own ostream from stdout,
so I have another patch for the llvm-backend.

Is there a canonical / portable way to check whether an ostream is the
standard output?
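
The only approach I can think of for a plain std::ostream is comparing
stream buffers, something like the sketch below (assuming the stream
really is a std::ostream, and that std::cout has not been redirected
with rdbuf()):

  #include <iostream>

  // Heuristic: treat Out as the standard output if it shares std::cout's
  // stream buffer. Not bulletproof, since rdbuf() can be swapped out.
  static bool writesToStdout(const std::ostream &Out) {
    return Out.rdbuf() == std::cout.rdbuf();
  }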

Hope this helps,
Julien

-- 
Julien Lerouge
PGP Key Id: 0xB1964A62
PGP Fingerprint: 392D 4BAD DB8B CE7F 4E5F FA3C 62DB 4AA7 B196 4A62
PGP Public Key from: keyserver.pgp.com
-------------- next part --------------
Index: lib/Bitcode/Writer/BitcodeWriter.cpp
===================================================================
--- lib/Bitcode/Writer/BitcodeWriter.cpp	(revision 51935)
+++ lib/Bitcode/Writer/BitcodeWriter.cpp	(working copy)
@@ -23,6 +23,7 @@
 #include "llvm/TypeSymbolTable.h"
 #include "llvm/ValueSymbolTable.h"
 #include "llvm/Support/MathExtras.h"
+#include "llvm/System/Program.h"
 using namespace llvm;
 
 /// These are manifest constants used by the bitcode writer. They do not need to
@@ -1292,6 +1293,10 @@
   // Emit the module.
   WriteModule(M, Stream);
   
+  // If writing to stdout, set binary mode.
+  if (llvm::cout == Out)
+      sys::Program::ChangeStdoutToBinary();
+
   // Write the generated bitstream to "Out".
   Out.write((char*)&Buffer.front(), Buffer.size());
   
-------------- next part --------------
Index: gcc/llvm-backend.cpp
===================================================================
--- gcc/llvm-backend.cpp	(revision 51935)
+++ gcc/llvm-backend.cpp	(working copy)
@@ -51,6 +51,7 @@
 #include "llvm/Support/Streams.h"
 #include "llvm/Support/ManagedStatic.h"
 #include "llvm/Support/MemoryBuffer.h"
+#include "llvm/System/Program.h"
 #include <cassert>
 #undef VISIBILITY_HIDDEN
 extern "C" {
@@ -272,6 +273,10 @@
   PerModulePasses = new PassManager();
   PerModulePasses->add(new TargetData(*TheTarget->getTargetData()));
 
+  // If writing to stdout, set binary mode.
+  if (asm_out_file == stdout)
+    sys::Program::ChangeStdoutToBinary();
+
   // Emit an LLVM .bc file to the output.  This is used when passed
   // -emit-llvm -c to the GCC driver.
   PerModulePasses->add(CreateBitcodeWriterPass(*AsmOutStream));
@@ -473,6 +478,10 @@
     // wrong for llvm/.bc emission cases.
     flag_no_ident = 1;
 
+  // If writing to stdout, set binary mode.
+  if (asm_out_file == stdout)
+    sys::Program::ChangeStdoutToBinary();
+
   AttributeUsedGlobals.clear();
   timevar_pop(TV_LLVM_INIT);
 }

