DaveC, the $read and /write commands also include the ability to remove a specific line of text after that line has been read and written. Only one line would need to be read at a time: that line is written to another text file, a counter variable is increased, and then the original top line in the large text file is removed. It's also likely FASTER than using the /filter command.
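A rough sketch of that approach, assuming example filenames big.txt and out.txt (the n switch on $read stops mIRC evaluating the line's contents, and -dl1 on /write deletes line 1):

```mirc
alias movetop {
  var %line = $read(big.txt, n, 1)  ; read the top line without evaluating it
  write out.txt %line               ; append it to the destination file
  inc %counter                      ; bump a (global) counter variable
  write -dl1 big.txt                ; delete line 1 of the source file
}
```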
It may well be faster, but it is still hugely disk-intensive.
When you remove the first line of the file, the entire file is rewritten minus that first line. I believe, however, that it is rewritten in large blocks (64KB, 1MB, etc.) rather than a line at a time.
Doing so would be incredibly wasteful, as it's a process that simply isn't needed.
The fastest method of all would be to use the /fopen, $fread, /fwrite and /fclose commands: the source file is read just once, and the result file is written just once. As it happens, I just ran a test of my alias and it took 2 seconds. 2 seconds for 100,006 lines isn't too shabby.
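A minimal sketch of the single-pass /fopen approach, again assuming the example filenames big.txt and out.txt (-no on /fopen creates/overwrites the output file, and -n on /fwrite appends a $crlf to each line written):

```mirc
alias copyfile {
  var %ticks = $ticks
  .fopen src big.txt
  .fopen -no out out.txt
  while (!$feof) {
    .fwrite -n out $fread(src)  ; read the next line from src, write it to out
  }
  .fclose src
  .fclose out
  echo -a Done in $calc(($ticks - %ticks) / 1000) seconds
}
```

Each file is opened once, streamed through, and closed, so the source is never rewritten at all.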