Matt,
That routine took a while to run, and it produced 20 files with nothing but the 5,000 line numbers in each of them; no data whatsoever from the original file was transferred over. I need the transferred data to be intact, with no extra information added, such as line numbers.
alias hugefiltertest {
  var %i = 1, %ticks = $ticks
  ; $isdir needs the directory name as a parameter
  if (!$isdir(test)) { mkdir test }
  ; generate a 1,000,000-line test file, one number per line
  .fopen -no bigfile test\bigfile.txt
  while (%i <= 1000000) {
    .fwrite -n bigfile %i
    inc %i
  }
  .fclose bigfile
  filter -cffr 1-5000 test\bigfile.txt test\file1.txt
  filter -cffr 5001-10000 test\bigfile.txt test\file2.txt
  filter -cffr 10001-15000 test\bigfile.txt test\file3.txt
  filter -cffr 15001-20000 test\bigfile.txt test\file4.txt
  filter -cffr 20001-25000 test\bigfile.txt test\file5.txt
  filter -cffr 25001-30000 test\bigfile.txt test\file6.txt
  filter -cffr 30001-35000 test\bigfile.txt test\file7.txt
  filter -cffr 35001-40000 test\bigfile.txt test\file8.txt
  filter -cffr 40001-45000 test\bigfile.txt test\file9.txt
  filter -cffr 45001-50000 test\bigfile.txt test\file10.txt
  filter -cffr 50001-55000 test\bigfile.txt test\file11.txt
  filter -cffr 55001-60000 test\bigfile.txt test\file12.txt
  filter -cffr 60001-65000 test\bigfile.txt test\file13.txt
  filter -cffr 65001-70000 test\bigfile.txt test\file14.txt
  filter -cffr 70001-75000 test\bigfile.txt test\file15.txt
  filter -cffr 75001-80000 test\bigfile.txt test\file16.txt
  filter -cffr 80001-85000 test\bigfile.txt test\file17.txt
  filter -cffr 85001-90000 test\bigfile.txt test\file18.txt
  filter -cffr 90001-95000 test\bigfile.txt test\file19.txt
  filter -cffr 95001-100000 test\bigfile.txt test\file20.txt
  echo -a Finished in $calc(($ticks - %ticks) / 1000) seconds.
}
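
For what it's worth, the twenty hardcoded filter calls could be collapsed into a loop that computes each 5,000-line range. This is an untested sketch along the same lines as the routine above; the alias name splitfile, the explicit * match-all text, and the use of $lines to size the input are my additions, not part of the original routine:

```
alias splitfile {
  ; split test\bigfile.txt into 5,000-line chunks: test\file1.txt, test\file2.txt, ...
  var %chunk = 5000, %total = $lines(test\bigfile.txt), %n = 1
  while ($calc((%n - 1) * %chunk) < %total) {
    var %first = $calc((%n - 1) * %chunk + 1), %last = $calc(%n * %chunk)
    ; -r takes the line range; * copies every line in that range unchanged
    filter -cffr $+(%first,-,%last) test\bigfile.txt $+(test\file,%n,.txt) *
    inc %n
  }
}
```

Written this way, changing the chunk size or the input file is a one-line edit, and the last chunk simply ends at the end of the file.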
Regards,
MDA