
#62621 02/12/03 05:20 PM
ern0 (OP)
Self-satisified door
Joined: Dec 2003
Posts: 4
Hi

I have a little script that lets anyone in the channel search my URL log for a term, for example "kikkoman", and lists the first five matches it finds.

Now..
The problem is that every time the script reads the next line from the URL log, it reads it from the hard disk. Normally this would not matter, but in this case it certainly does: my URL log contains over 70 thousand URLs, and mIRC gets through only about one thousand of them in 10 seconds. In Task Manager I can see mIRC reading the log at about 30 MB/s.

The worst part is that mIRC is completely frozen the whole time it searches.

Performance would surely improve enormously if you could load that large text file (~10 MB) into memory and read the lines from there.

I hope you can do something about this. Thanks! :-)

ern0
...and sorry for my bad English


S
Hoopy frood
Joined: Dec 2002
Posts: 2,962
There are several existing possibilities for doing what you want. You could load the file into a...
a) Binary variable.
b) Hash table.
c) Hidden window.
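For instance, a rough sketch of the hash table approach (the alias, table, and file names here are made up for illustration): /hload -n loads the file line by line into items named 1, 2, 3..., and $hfind with the w switch does a wildcard search over the item data.

```
; load the whole url log into memory once
alias loadurls {
  if ($hget(urls)) hfree urls
  hmake urls 100
  hload -n urls url-log.txt
}

; list up to 5 urls whose text matches $1
; note: $hfind returns matches in hash order, not file order
alias findurl {
  var %n = 1
  while (%n <= 5) {
    var %item = $hfind(urls, $+(*,$1,*), %n, w).data
    if (%item == $null) break
    echo -a $hget(urls, %item)
    inc %n
  }
}
```

After /loadurls has run once, each /findurl kikkoman searches entirely in memory instead of touching the disk.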


Spelling mistakes, grammatical errors, and stupid comments are intentional.
N
Ameglian cow
Joined: Dec 2002
Posts: 31
70 thousand lines in a hidden window? I think that's a bit excessive... heh. Also, I have noticed a significant decrease in the speed of the window functions since 6.1; I wouldn't recommend $fline for even one thousand lines.

M
Babel fish
Joined: Sep 2003
Posts: 70
You're using the file-handling functions, not $read, right?
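For reference, a minimal sketch of the difference (the alias and file names are assumptions): $read(file, w, *text*) reopens and rescans the file from disk on every call, while /fopen keeps the file open so $fread can pull lines sequentially.

```
; search the log with the file-handling commands instead of $read
alias searchurls {
  .fopen urls url-log.txt
  if ($ferr) { echo -a Could not open url-log.txt | return }
  var %found = 0
  while (!$feof) && (%found < 5) {
    var %line = $fread(urls)
    if ($+(*,$1,*) iswm %line) {
      echo -a %line
      inc %found
    }
  }
  .fclose urls
}
```

Usage: /searchurls kikkoman lists the first five matching lines.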

ern0 (OP)
Self-satisified door
Joined: Dec 2003
Posts: 4
Now I am ;-)

mIRC now goes through over five thousand URLs per second; the whole URL list takes 13 seconds.

Thanks!

ern0 (OP)
Self-satisified door
Joined: Dec 2003
Posts: 4
There is still a problem.

$fread starts reading lines from the beginning of the file, but all the new URLs are at the end. I use mIRC's own URL catcher, and there would be no point in scripting my own, because I can't place new URLs at the beginning of the file without overwriting the old ones.

Is there any way of doing this? I didn't find any useful information in mIRC's help.


M
Babel fish
Joined: Sep 2003
Posts: 70
/fseek <name> <position>

Sets the read/write pointer to the specified position in the file. The following switches may also be used to move the file pointer:

-l <name> <line number>

So you'd probably want to use that to move the pointer to the line where the *new* URLs start.
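A rough sketch of that, assuming you estimate how many old lines to skip (the alias, file name, and the 5000-line guess are all made up; $lines() returns the file's total line count):

```
; search only the newest ~5000 lines of the log
alias searchnew {
  .fopen urls url-log.txt
  if ($ferr) { echo -a Could not open url-log.txt | return }
  ; assumption: the interesting "new" urls are in the last 5000 lines
  fseek -l urls $calc($lines(url-log.txt) - 5000)
  var %found = 0
  while (!$feof) && (%found < 5) {
    var %line = $fread(urls)
    if ($+(*,$1,*) iswm %line) { echo -a %line | inc %found }
  }
  .fclose urls
}
```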

Raccoon
Hoopy frood
Joined: Feb 2003
Posts: 2,812
As madewokherd suggested, it would be faster if you started closer to the end of your file, then worked backwards in chunks until you have accumulated 5 matches. Of course, this is slightly complicated: it requires nested loops and keeping track of your position.
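A very rough sketch of that idea (all names are made up, and for simplicity the final partial chunk at the top of the file is skipped, and matches within each chunk still come out in forward order):

```
; scan the log backwards in 1000-line chunks until 5 matches are found
alias revsearch {
  .fopen urls url-log.txt
  if ($ferr) { echo -a Could not open url-log.txt | return }
  var %chunk = 1000
  var %start = $calc($lines(url-log.txt) - %chunk), %found = 0
  while (%found < 5) && (%start >= 0) {
    fseek -l urls $calc(%start + 1)
    var %i = 0
    while (%i < %chunk) && (!$feof) && (%found < 5) {
      var %line = $fread(urls)
      if ($+(*,$1,*) iswm %line) { echo -a %line | inc %found }
      inc %i
    }
    dec %start %chunk
  }
  .fclose urls
}
```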

It may be easier to simply break your URL file up into multiple files, by month or something.

- Raccoon


Well. At least I won lunch.
Good philosophy, see good in bad, I like!
ern0 (OP)
Self-satisified door
Joined: Dec 2003
Posts: 4
"write -il1 file.txt" was the command I was looking for... problem solved :-)
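For anyone finding this later, a minimal sketch of a custom catcher built on /write -il1 (the event matching and file name are assumptions, not mIRC's built-in URL catcher): -il1 inserts each caught URL at line 1, so the newest entries are the first ones $fread sees.

```
; hypothetical catcher: prepend every url seen in channel text to the log
on *:text:*://*:#: {
  var %n = 1
  while ($wildtok($1-, *://*, %n, 32) != $null) {
    write -il1 url-log.txt $v1
    inc %n
  }
}
```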

Last edited by ern0; 10/12/03 08:56 PM.
Raccoon
Hoopy frood
Joined: Feb 2003
Posts: 2,812
Watch how long it takes to insert a line at the beginning of a 30 MB file. You know mIRC has to rewrite the entire file from scratch every time, right?


