
#104499 04/12/04 11:30 AM
Joined: Oct 2004
Posts: 32
burek Offline OP
Ameglian cow
Hi everybody.
I've got a somewhat annoying problem: I need to find all the
files in a given directory and all of its subdirectories, and
write the result to list.txt. Here is my solution, but the
problem is that it is too slow...

Code:
  write -c list.txt
  var %i = 1
  var %filename = $findfile(c:\,*.*,%i)
  while (%filename) {
    write list.txt %filename
    inc %i
    var %filename = $findfile(c:\,*.*,%i)
  }


Is there any faster solution? :(
Thanks in advance..

Joined: Jan 2003
Posts: 2,523
Q
Hoopy frood
Offline
Calling $findfile(c:\,*,%i) repeatedly is extremely slow. You should take advantage of the command parameter in $findfile (type /help $findfile and read the entire section carefully). Even if you do that, the /write command is still going to slow down the script because of the constant open/close file operations. Here's a fast way:
Code:
  ; open list.txt for writing, creating or overwriting it (-no)
  .fopen -no blah list.txt
  if $ferr { return }
  ; one $findfile pass over the whole tree: run .fwrite on every
  ; match instead of re-calling $findfile once per index
  !.echo -q $findfile(c:\,*,0,.fwrite -n blah $1-)
  .fclose blah
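As a middle ground (my own sketch, not from the posts above): the command parameter alone already removes the per-index rescan, so if you can live with /write reopening the file on each match, you don't need an /fopen handle at all:
Code:
  ; clear the file, then do a single $findfile pass,
  ; writing each matched filename as it is found
  write -c list.txt
  !.echo -q $findfile(c:\,*,0,write list.txt $1-)

This is slower than the /fwrite version on large trees, but simpler to read.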

Alternatively, you can do it with the DIR command:
Code:
/run -n cmd /c dir /b /s c:\ > list.txt


/.timerQ 1 0 echo /.timerQ 1 0 $timer(Q).com
Joined: Oct 2004
Posts: 32
burek Offline OP
Ameglian cow
Thanks man, I totally forgot about the command option in $findfile..
anyway, I've noticed that /run with DIR works much, much
faster..

Thanks for such a fast reply ;)

Joined: Nov 2003
Posts: 228
S
Fjord artisan
Offline
Quote:

Alternatively, you can do it with the DIR command:
/run -n cmd /c dir /b /s c:\ > list.txt


Nice snippet, I'm going to have to remember that one.

