
Large text-files to Memory #62621 02/12/03 05:20 PM
ern0 (OP) - Self-satisfied door - Joined: Dec 2003 - Posts: 4
Hi

I have a little script that lets anyone in the channel search my url-log for matches, for example "kikkoman", and the script then lists the first five matches it finds.

Now..
The problem is that every time it reads the next line from the url-log, it reads it from the hard disk. Normally this would not be a problem, but in this case it sure is. My url-log contains over 70 thousand urls, and mIRC gets through about one thousand of them in 10 seconds. In Task Manager I can see mIRC reading the url-log at about 30 MB/s.

Worst part is that mIRC is totally jammed the whole time it searches.

It would surely improve performance a lot if you could load that large text file (~10 MB) into memory and read the lines from there.

Hope you can do something about this. Thanks! :-)

ern0
..and sorry for my bad English


Re: Large text-files to Memory #62622 02/12/03 07:43 PM
starbucks_mafia - Hoopy frood - Joined: Dec 2002 - Posts: 2,962
There are several pre-existing possibilities to do what you want. You could load the file into a...
a) Binary variable.
b) Hash table.
c) Hidden window.
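
For example, here is a rough sketch of the hash table approach (b), assuming the log file is called url-log.txt; the table and alias names are made up:

alias loadurls {
  ; free any old copy, then load the whole file into memory, one item per line
  if ($hget(urls)) hfree urls
  hmake urls 1000
  hload -n urls url-log.txt
  echo -a Loaded $hget(urls,0).item lines.
}
alias findurl {
  ; list the first five lines containing the search text in $1
  var %n = 1, %found = 0, %item
  while (%found < 5) {
    %item = $hfind(urls,$+(*,$1,*),%n,w).data
    if (!%item) break
    echo -a Match: $hget(urls,%item)
    inc %found
    inc %n
  }
}

Call /loadurls once (and again whenever the log changes), then something like /findurl kikkoman searches entirely in memory.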


Spelling mistakes, grammatical errors, and stupid comments are intentional.
Re: Large text-files to Memory #62623 04/12/03 12:18 AM
Netchelandorious - Ameglian cow - Joined: Dec 2002 - Posts: 31
70 thousand lines in a hidden window? I think that's a bit excessive... heh. Also, I have noticed a significant decrease in speed in the window functions since 6.1; I wouldn't recommend $fline for even one thousand lines.

Re: Large text-files to Memory #62624 05/12/03 09:22 PM
madewokherd - Babel fish - Joined: Sep 2003 - Posts: 70
You're using the file-handling functions, not $read, right?
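
For reference: $read goes back to the file on disk for every line it fetches, while the /fopen family keeps the file open and remembers its position. A minimal sketch, with made-up handle and alias names:

alias scanurls {
  var %found = 0
  .fopen urls url-log.txt
  if ($ferr) { echo -a Could not open url-log.txt | return }
  ; read sequentially until end of file or five matches
  while (!$feof) {
    var %line = $fread(urls)
    if ($1 isin %line) {
      echo -a Match: %line
      inc %found
      if (%found == 5) break
    }
  }
  .fclose urls
}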

Re: Large text-files to Memory #62625 06/12/03 11:34 AM
ern0 (OP) - Self-satisfied door - Joined: Dec 2003 - Posts: 4
Now I am ;-)

mIRC now goes through over five thousand urls per second; the whole url-list takes 13 seconds.

Thanks!

Re: Large text-files to Memory #62626 06/12/03 01:33 PM
ern0 (OP) - Self-satisfied door - Joined: Dec 2003 - Posts: 4
There is still a problem.

$fread starts reading lines from the beginning of the file, but all the new URLs are at the end. I use mIRC's own url-catcher, and there would be no point scripting a new one, because I can't place new urls at the beginning of the file without overwriting the old ones.

Is there any way of doing this? I didn't find any useful information in mIRC's help.


Re: Large text-files to Memory #62627 06/12/03 04:25 PM
madewokherd - Babel fish - Joined: Sep 2003 - Posts: 70
/fseek <name> <position>

Sets the read/write pointer to the specified position in the file. The following switches may also be used to move the file pointer:

-l <name> <line number>

So you'd probably want to use that to move the pointer to the line where the *new* urls start.
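
For example (made-up alias name; pass the line number to start from as $1):

alias readfrom {
  .fopen urls url-log.txt
  if ($ferr) return
  ; jump straight to line $1, then read from there to the end
  .fseek -l urls $1
  while (!$feof) {
    echo -a $fread(urls)
  }
  .fclose urls
}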

Re: Large text-files to Memory #62628 07/12/03 12:05 PM
Raccoon - Hoopy frood - Joined: Feb 2003 - Posts: 2,651
Like madewokherd suggested, it would be faster if you started closer to the end of your file, then worked backwards in chunks until you have accumulated 5 matches. Of course, this is slightly complicated: it requires nested loops and keeping track of your position.
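
A rough sketch of that backwards scan (same made-up names as earlier in the thread; note that matches inside a chunk still come out oldest-first, and a single chunk can push the total past 5):

alias revsearch {
  var %chunk = 500, %end = $lines(url-log.txt), %found = 0
  .fopen urls url-log.txt
  if ($ferr) return
  while ((%found < 5) && (%end > 0)) {
    ; work out this window of lines, clipped at the start of the file
    var %start = %end - %chunk + 1
    if (%start < 1) var %start = 1
    .fseek -l urls %start
    var %i = %start
    while (%i <= %end) {
      var %line = $fread(urls)
      if ($1 isin %line) { echo -a Match: %line | inc %found }
      inc %i
    }
    ; next pass: the chunk just before this one
    var %end = %start - 1
  }
  .fclose urls
}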

It may be easier to simply break your urls file up into multiple files, by month or something.

- Raccoon


Well. At least I won lunch.
Good philosophy, see good in bad, I like!
Re: Large text-files to Memory #62629 10/12/03 08:39 PM
ern0 (OP) - Self-satisfied door - Joined: Dec 2003 - Posts: 4
"write -il1 file.txt" was the command i was looking for.. problem solved :-)

Last edited by ern0; 10/12/03 08:56 PM.
Re: Large text-files to Memory #62630 11/12/03 06:24 PM
Raccoon - Hoopy frood - Joined: Feb 2003 - Posts: 2,651
Watch how long it takes to insert a line at the beginning of a 30 MB file. You know it has to rewrite the entire file from the ground up, right?


Well. At least I won lunch.
Good philosophy, see good in bad, I like!