Patch for nALFS
adelorenzo at dinossauro.com
Mon Dec 17 12:02:05 PST 2001
On the other hand, a download of the packages would be very interesting for
local-network installations. In that case the FTP site should be modified
to point to the local server holding the files. All we would need then is
a minimal boot disk with nALFS + mk.ext or the mkreiserfs app + init scripts.
Another idea I had was to include an additional handler, called makej6,
that could help speed up compile time. On multiple-CPU machines the command
make 'package' MAKE="make -j x" (where x is at least 2) distributes the
make process across both CPUs according to the amount specified in x. The
command make 'package' MAKE="make -j" parallelizes as much compiling as the
machine supports, but it can kill your machine.
Some packages, though, do not compile with 'make -j', such as bash(!),
ncurses, and groff, so the traditional make has to be kept for those.
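The effect is easy to see with a toy Makefile; the file name and target
names below are invented for illustration, and nALFS itself is not
involved:

```shell
#!/bin/sh
# Two independent targets: with "make -j 2" they may build concurrently,
# which is where the speed-up on SMP boxes comes from.
# /tmp/jdemo.mk and the targets a/b are made up for this sketch.
printf 'all: a b\na:\n\t@echo built-a\nb:\n\t@echo built-b\n' > /tmp/jdemo.mk
make -s -f /tmp/jdemo.mk -j 2 all
```

With -j and no number, make spawns as many jobs as it can at once, which
is exactly what can bring a machine to its knees.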
My compile time on a 1 GHz dual-CPU machine with 256 MB of RAM dropped
from 1:18 to 0:53. If anyone is interested, please let me know. I am
still doing some tests and should submit the patch for approval by the
end of the week.
BTW, MAKE="make -j 2" works on single-CPU machines too, though I haven't
tested it extensively.
>> <archive type="local">
>> <archive type="remote">
> And where, pray tell, will it put the package when it's downloaded?
> What will happen if you have a modem and you lose the connection in the
> middle of a download?
> What would happen if it was too big to download in one go? The
> kernel, for example, is very near my download limit on one
> connection. (I have 2-hour timeouts on my ISP.)
> I don't think the program should do this on the fly; it would be
> slow and inefficient to wait for packages to download when they could
> be fetched in parallel with other builds.
> What would work is for the parser to analyse all the package <archive>
> elements first, compile a wget list, and ask you about starting the
> download.
> I also think this fits better with my META ideas, i.e. stuff that is not
> really used by the parser, but by the user of the profile for
> reference. A perl script to create a wget list from the profile would
> be very easy, and alfs need not be modified and over-complicated.
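Something along those lines can even be sketched with plain shell tools;
the sample profile content and the URL below are assumptions, since the
real profile format may differ:

```shell
#!/bin/sh
# Sketch: scrape remote-archive URLs out of a profile into a wget list.
# The profile snippet and example.org URL are invented for illustration.
cat > /tmp/profile.xml <<'EOF'
<archive type="remote">http://example.org/sources/bash-2.05.tar.gz</archive>
<archive type="local">/usr/src/ncurses-5.2.tar.gz</archive>
EOF
# Keep only the http:// URLs, one per line, ready for "wget -i wget-list".
grep -o 'http://[^"<]*' /tmp/profile.xml > /tmp/wget-list
cat /tmp/wget-list
```

Local archives are skipped, so the list holds exactly what needs to be
fetched; "wget -c -i wget-list" can then download (and resume) the lot
before any build starts.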
> Make any sense?
Unsubscribe: send email to listar at linuxfromscratch.org
and put 'unsubscribe alfs-discuss' in the subject header of the message