Getting right back to the first point: I would assume you're doing something drastically wrong with the /WRITE operation. Are you just appending lines to the file, or are you inserting them?
Because, as Russel said, just writing 10,000 lines takes hardly any time at all. And if it's not the write, I wonder if the loop you're using to fetch each line for writing is flawed.
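To illustrate the difference (test.txt is just a stand-in filename here):

write test.txt some text
^ that appends the line to the end of the file, which stays cheap no matter how big the file gets

write -il1 test.txt some text
^ that inserts the line at line 1, which forces mIRC to shuffle everything below it down, and that gets expensive on a large file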
I have seen several reasons why a script that has to loop will start falling on its butt; one of the biggest and simplest mistakes someone will make looks like this:

var %i = 1 | while (%i <= $fline(#channel,*matchtext*,0)) { .......code using $fline(#channel,*matchtext*,%i)....... | inc %i }
^ that goes and searches the channel for every matchtext, recalculating the total every single time it loops!

var %i = 1, %m = $fline(#channel,*matchtext*,0) | while (%i <= %m) { .......code using $fline(#channel,*matchtext*,%i)....... | inc %i }
^ that's much better since it only needs to work out the total once

window -h @window | filter -wwc #channel @window *matchtext* | var %i = 1, %m = $line(@window,0) | while (%i <= %m) { .......code using $line(@window,%i)....... | inc %i } | window -c @window
^ that should be faster again (but it uses a @window, which I assume you didn't want to do due to lack of knowledge about them, as you said)

alias filter.alias.name { .......code using $1-....... } | ; note $1- contains the WHOLE LINE! see /tokenize to separate it!
&
filter -wk #channel filter.alias.name *matchtext*
^ that is about as fast a way as I can think of to loop when you have code that must execute for each matched line
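As a rough sketch of what that alias might look like (filter.alias.name is a made-up name, and I'm assuming the matched line arrives in $1-, which you can then re-split with /tokenize on whatever character you need, 32 being the ASCII code for a space):

alias filter.alias.name {
  ; the whole matched line is passed in by /filter -k
  tokenize 32 $1-
  ; after tokenizing, $1 is the first word of the line, $2 the second, and so on
  echo -a first word: $1 second word: $2
}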

filter -wf #channel filename.txt *matchtext*
^ and that's about as fast as you can get selected lines out to a file

---

Without sample code to show us what you're doing, I doubt you will find a solution that will make your problem go away.
The best I can suggest, if it's JUST write speed, is that you switch to using /FWRITE, which requires you to open, write to, and then close the file you're writing to.
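Something along these lines, as a rough sketch (the alias name, the handle name "out", and test.txt are all just placeholders): the file gets opened once, every line goes out through the handle, and the file gets closed once at the end, instead of /write opening and closing the file for every single line.

alias fwrite.test {
  ; open the handle once (-o creates the file, overwriting any existing copy)
  .fopen -o out test.txt
  if (!$fopen(out)) { echo -a could not open test.txt | return }
  var %i = 1
  while (%i <= 10000) {
    ; -n adds the line ending, so each /fwrite becomes its own line in the file
    .fwrite -n out this is line number %i
    inc %i
  }
  .fclose out
}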