Ameglian cow (OP)
Joined: Jul 2005
Posts: 37
Hi, has anyone got a good, fast way of listing files? Right now I'm using $findfile, but it's really slow; sometimes there are over 10,000 files to list and it takes hours. Here's my current code:

alias que {
  if ($hget(Files,0).item) { hfree Files }
  var %dir = $$sdir($mircdir,Select Dir) , %loop = 1 , %time = $ticks
  echo Please Wait Adding Files To Que
  while (%loop <= $findfile(%dir,*.*,0)) {
    if ($right($findfile(%dir,*.*,%loop),4) !== .NEW) { hadd -m Files %loop $findfile(%dir,*.*,%loop) | var %total = $calc(%total + 1) }
    inc %loop
  }
  echo Added %total Files To Que In $round($calc(($ticks - %time) / 1000),0) Seconds
}

Please help. Thanks.
Hoopy frood
Joined: Dec 2002
Posts: 2,962
$findfile() has an internal loop mechanism: you can provide a command to be called for each file it finds. This works out a lot faster than a regular while loop over $findfile(), and much faster than your existing code, which calls $findfile multiple times for each file it finds. The code below should be a great deal faster than what you've got at the moment; however, it still won't be anywhere near instant when matching 100,000 files. That's simply a limitation of hard-drive access speeds and mIRC's single-threaded implementation.

alias que {
  if ($hget(Files,0).item) { hfree Files }
  var %dir = $$sdir($mircdir,Select Dir) , %time = $ticks
  echo Please Wait Adding Files To Que
  var %total = $findfile(%dir, *.*, 0, if (*.NEW !iswmcs $1-) hadd -m Files $findfilen $1-)
  echo Added %total Files To Que In $round($calc(($ticks - %time) / 1000),0) Seconds
}

A quick run of this alias over my entire C:\ drive (105,098 files) took 30 seconds.
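For anyone unfamiliar with this form of $findfile: the final parameter is a command that mIRC performs once for each matching file, with $1- holding the file's full path and $findfilen the running match number. A minimal sketch (hypothetical alias name, not part of the code above):

```mirc
; List every .ini file under the mIRC directory in the active window.
; With N = 0, $findfile returns the total number of matches, so
; /noop is used to discard that return value while the per-file
; command still runs for each file found.
alias listini {
  noop $findfile($mircdir, *.ini, 0, echo -a $findfilen : $1-)
}
```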
Last edited by starbucks_mafia; 11/03/09 05:47 PM.
Spelling mistakes, grammatical errors, and stupid comments are intentional.
Hoopy frood
Joined: Jul 2006
Posts: 4,157
alias que {
  if ($hget(Files)) hfree Files
  var %dir = $$sdir($mircdir,Select Dir) , %time = $ticks
  echo Please Wait Adding Files To Que
  noop $findfile(%dir,*.*,0, if ($right($1-,4) !== .NEW) hadd -m Files $findfilen $1-)
  echo Added $hget(Files,0).item Files To Que In $calc($ticks - %time) ms
}

$findfile already has a built-in way to run a command on each file found, instead of using a loop that slows the process down. This will be faster, but you can't avoid the freeze, and it will still take time with 10,000 files...

Edit: Beaten.
Last edited by Wims; 11/03/09 05:49 PM.
#mircscripting @ irc.swiftirc.net == the best mIRC help channel
Ameglian cow (OP)
Joined: Jul 2005
Posts: 37
Wow, thanks! I'd been waiting over 2 hours to add 23,955 files, and your way did it in 27 seconds. Thanks a lot.
Thanks to both for replying; I tested starbucks_mafia's code and it works nicely.
Last edited by gans; 11/03/09 05:56 PM.
Hoopy frood
Joined: Dec 2002
Posts: 2,962
Hmm. There appears to be a bug with $findfilen being limited to 65535 (I just posted about it here), which will cause problems if you use this on folders with more than that many files. Until the bug is fixed, you can use the following code, which uses its own incrementing variable instead of the built-in $findfilen identifier:
alias que {
  if ($hget(Files,0).item) { hfree Files }
  var %dir = $$sdir($mircdir,Select Dir) , %time = $ticks , %count = 1
  echo Please Wait Adding Files To Que
  var %total = $findfile(%dir, *.*, 0, if (*.NEW !iswmcs $1-) [ $chr(123) ] hadd -m Files %count $1- [ $chr(124) ] inc %count [ $chr(125) ] )
  echo Added %total Files To Que In $round($calc(($ticks - %time) / 1000),0) Seconds
}
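An aside on the [ $chr(123) ] pieces in that line: typed literally, | would end the command and { } would be taken as script-block delimiters when the alias itself is parsed, so they're written as $chr() calls instead, and the [ ] evaluation brackets turn them back into real { | } characters in time for $findfile to execute the grouped two-command body. A commented restatement of that line (same code as above; the comments are mine):

```mirc
; $chr(123) = {   $chr(124) = |   $chr(125) = }
; The callback body that $findfile ends up running per file is:
;   { hadd -m Files %count $1- | inc %count }
var %total = $findfile(%dir, *.*, 0, if (*.NEW !iswmcs $1-) [ $chr(123) ] hadd -m Files %count $1- [ $chr(124) ] inc %count [ $chr(125) ] )
```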
Edit: Fixed it, I left in some debugging stuff that broke the code.
Last edited by starbucks_mafia; 11/03/09 06:16 PM.
Ameglian cow (OP)
Joined: Jul 2005
Posts: 37
Thanks for the info. I think I'll be fine with the 65535-file maximum; the most I've ever done before is under 30,000, so I should be fine at about half the allowed maximum.
Fjord artisan
Joined: Mar 2007
Posts: 218
Interestingly, I get an error with yours:
* /fwrite: invalid parameters

Yeah, I just saw. Nice code! I'll find this very handy myself.
Last edited by vexed2; 11/03/09 06:20 PM.
Hoopy frood
Joined: Dec 2002
Posts: 2,962
Yeah, sorry, I've fixed the code above now. I was debugging with a file instead of a hash table.
Ameglian cow (OP)
Joined: Jul 2005
Posts: 37
There's only one problem I'm having: it's adding the .NEW files as added packs to the queue. Looking at your code it seems like it should skip the .NEW files, but it isn't. I also tried Wims's code and it does the same. I don't know why it's doing it, though. Any help on that bit, please?
Thanks.
Hoopy frood
Joined: Dec 2002
Posts: 2,962
The code I've given above does a case-sensitive check for the .NEW file extension, since that's what your original code was doing. Do the files you're using have lowercase or mixed-case extensions (e.g. .new, .New)? If so, just change !iswmcs in the code to !iswm.
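For reference, a minimal sketch of the difference between the two operators (hypothetical alias name; echoes to the active window):

```mirc
alias matchdemo {
  ; iswmcs is a case-sensitive wildcard match:
  ; *.NEW does not match file.new, so this prints "no match"
  echo -a $iif(*.NEW iswmcs file.new, match, no match)
  ; iswm is case-insensitive:
  ; *.NEW matches file.new, so this prints "match"
  echo -a $iif(*.NEW iswm file.new, match, no match)
}
```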
Ameglian cow (OP)
Joined: Jul 2005
Posts: 37
I've fixed it. I was modifying Wims's code, and he'd removed my .item from $hget.