Greetings DaveC,

You wrote, "And you really should use the filter command OR at least switch to the /fopen $fread /fwrite /fclose commands. I say this because, when you read a line in a txt file using $read it reads from the beginning of the file to the line number you requested, so reading a 100,000 line file means that line 1 takes a read of 1 line, line 2 takes a read of 2 lines, line 3 takes a read of 3 lines, etc. By the time you have read to line 100,000 you have read a total of 5,000,000,000 lines, a large number indeed!"

DaveC, the /write command also includes a switch to delete a specific line from a text file, so only one line ever needs to be read at a time: $read pulls the top line of the fat text file, that line is written to another text file, a counter variable is increased, and then the top line is deleted from the original with /write -dl. It's also likely FASTER than using the Filter command.

A Simple Working Example:

alias TextFileSplitter {
  ; %textnum is how many lines to move, %textnumb counts them
  set %textnumb 1
  set %textnum 5000
  :next
  if (%textnumb > %textnum) { goto finish }
  ; always read line 1 - the previous line 1 is deleted on each pass
  set %textee $read($mircdirtest\fatext.txt,1)
  if (%textee == $null) { echo 4 Data Textee is null | goto finish }
  ; append the line to the new file, then delete it from the fat file
  .write $mircdirtest\text1.txt %textee
  .write -dl $+ 1 $mircdirtest\fatext.txt
  unset %textee
  inc %textnumb
  goto next
  :finish
}
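
To try it, paste the alias into a remote script file (alt+r) and type /TextFileSplitter in any window; the first 5000 lines of fatext.txt are copied into text1.txt and removed from the original.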

MDA
