
BLU Discuss list archive



make -j optimization



David Kramer writes:

> I was trying to compile Rogue Wave.  There are about 20 or 30 .o target
> files that get built from .cpp files.  Then there's a library that gets
> built from the .o files.  Even though the makefile correctly had the .cpp
> files as dependencies for the .o files, and the .o files as dependencies
> for the library, it tried to "ar" the library together before all the .o
> files were created, resulting in about half of them being "not found".
>
> That blows.  How could they get that wrong?  If it's not going to be smart
> enough to avoid building a target before its dependencies are built, then it
> should at least only parallelize the commands that build a single target at a time.
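For what it's worth, GNU make under -j is supposed to respect declared prerequisites: a library target that lists every object file as a prerequisite should not have its ar command started until all of those compilations finish. A minimal parallel-safe sketch of that shape (the object list, library name, and variables here are illustrative guesses, not taken from the actual Rogue Wave makefile):

```make
# Hypothetical object list; the real makefile's file names are unknown.
OBJS := foo.o bar.o baz.o

# The archive lists every .o as a prerequisite, so under "make -j"
# the ar command cannot be scheduled until all compilations complete.
librw.a: $(OBJS)
	ar rcs $@ $^

# Pattern rule: each .o depends on its corresponding .cpp file.
%.o: %.cpp
	$(CXX) $(CXXFLAGS) -c -o $@ $<
```

If the race still happens with dependencies written this way, the culprit is usually something outside the sketch, such as recursive sub-makes that don't share the dependency graph, or archive-member targets.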

I guess gmake is valiantly trying to circumvent Amdahl's Law.  (-:

Just for the record, I've noticed this behavior for a long time now.
I recall that a friend of mine even reported this as a bug.  If there
is a standard workaround, I'd like to know what it is.  It is for this
reason that I tend to never use -j, especially in a production build.

(An even worse problem than what you describe occurs when "ar" is
invoked when all of the .o files technically exist, but are still
being written to by other compilation processes.  The result is a
"successful" build that is actually corrupt...)
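One documented way to get exactly that kind of corruption is the archive-member target syntax, which the GNU make manual warns is not safe with parallel execution: ar rewrites the whole archive file, so two member updates running at once can silently destroy it. A hedged illustration (file names made up):

```make
# UNSAFE under -j: each archive member is its own target, so make may
# run several "ar" updates against libfoo.a concurrently.
libfoo.a: libfoo.a(a.o) libfoo.a(b.o) libfoo.a(c.o)

# Safer: build all objects first, then write the archive in one command.
libfoo.a: a.o b.o c.o
	ar rcs $@ $^
```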

Regards,

--kevin
-- 
ar tv god




BLU is a member of BostonUserGroups
We also thank MIT for the use of their facilities.




Boston Linux & Unix / webmaster@blu.org