[LLVMdev] UPDATE: Automake Difficulties (Long)

Reid Spencer reid at x10sys.com
Wed Oct 20 10:53:57 PDT 2004


On Wed, 20 Oct 2004 at 12:29:38 -0500, Misha Brukman wrote:

 > On Wed, Oct 20, 2004 at 01:01:33AM -0700, Reid Spencer wrote:
 > > Instead of spending a bunch more time on trying to get automake to work,
 > > I suggest we just fix the current makefile system to do what automake
 > > can do. Specifically we need to:
 > [snip]
 > > I am, of course, soliciting feedback on this whole idea.

 > I would agree that, given the differences, it is better to improve the
 > current system to do what automake can do than to switch to automake and
 > teach automake how to do things that our current build system already
 > does (which in some cases, as you mention, may not be reasonable).

Yes, that was my thinking too. When I started looking at the list of things 
necessary to make automake usage comfortable, it started looking like about the 
same amount of work to give our current system the features that automake 
provides and we currently lack.

 > I am not sufficiently familiar with dependency generation, et al, to
 > comment on it in detail, but I would *love* a "make check" facility with
 > results listed in plain-text files rather than a database that required
 > running qmtest, logging into it via a web browser, and updating the
 > binary DB that way.

Dependency generation is done automatically by GCC with the -MM and related 
options. These options generate file.po when file.c is compiled to file.o; 
file.po contains the dependencies of file.o on all of its sources (headers). 
What automake does is generate this file AND compile the code in the same 
invocation of GCC. What our current makefiles do is first generate the 
dependencies with the -MM option and then invoke GCC again to compile the code. 
I believe this is where the extra time is going. By combining the two operations 
(dependency generation and compilation), we would cut the I/O of building 
roughly in half.
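To make that concrete, here is a minimal sketch of what such a combined rule 
could look like. The -MMD, -MP and -MF flags are standard GCC options that 
write the dependency file as a side effect of compilation; the .po suffix and 
the $(ObjDir) variable are just illustrative names, not necessarily what our 
makefiles will use (and recipe lines must begin with a tab):

# Compile and generate dependencies in a single GCC invocation.
$(ObjDir)/%.o: %.cpp
	$(CXX) $(CXXFLAGS) -MMD -MP -MF $(ObjDir)/$*.po -c $< -o $@

# Pull in whatever dependency files have been generated so far.
-include $(wildcard $(ObjDir)/*.po)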

I also agree that we need a "make check" and that it should not be restricted 
to the "test" directory. There may be several "sanity checks" that you want to 
run early in the build. For example, in lib/System, I would write some very 
simple tests that make sure the functionality provided by lib/System actually 
works. There's no point in waiting 30 minutes into a build to find out that 
something very fundamental is broken. This would, of course, be optional, so 
this kind of unit test would be employed only where it makes sense. When the 
build gets around to the "test" directory, all the feature and regression tests 
would be run automatically, using only text inputs and outputs.
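As a rough sketch of what a per-directory check target might look like: the 
UnitTests variable and the pass/fail reporting below are assumptions for 
illustration only, not something our makefiles define today:

# Run each unit test in this directory and report failures in plain text.
check:: $(UnitTests)
	@for t in $(UnitTests); do \
	  echo "RUN: $$t"; \
	  ./$$t > $$t.out 2>&1 || echo "FAIL: $$t (see $$t.out)"; \
	done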

 > And if there's a "make dist" and/or "make rpm" target(s), so much
 > the better.

Those will require a bit more work, but it can be done incrementally. As I'm 
interested in producing a commercial version of LLVM, these are very important 
to me. I want to ENSURE that when unpacked, a distribution will actually work 
(it's not missing files, the configuration is right, etc.). It's actually the 
"distcheck" target that interests me, as it builds a distribution and then 
automatically tests it by unpacking the tgz, configuring it, building it, and 
running "make check". If all of that passes, then the tgz file is considered 
distributable. This one feature is what I was originally after with automake, 
because getting it right is tricky. However, I'm probably just going to do it 
once with automake and copy what it produces into our makefiles.
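For anyone curious, a hand-rolled version would boil down to something like the 
following sketch; the dist prerequisite and the $(DistName) variable are 
assumptions here, not targets we already have:

# Build the tarball, then prove it configures, builds, and passes "make check"
# from a clean unpacking before calling it distributable.
distcheck: dist
	rm -rf _distcheck && mkdir _distcheck
	tar -C _distcheck -xzf $(DistName).tar.gz
	cd _distcheck/$(DistName) && ./configure && $(MAKE) && $(MAKE) check
	@echo "$(DistName).tar.gz is distributable"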

Reid.



