Posted By: ern0 Large text-files to Memory - 02/12/03 05:20 PM
Hi

I have a little script that lets anyone in my channels search my URL log for a match, for example "kikkoman", and the script then lists the first five matches it finds.

Now..
The problem is that every time it reads the next line from the URL log, it reads it from the hard disk. Normally this would not be a problem, but in this case it certainly is: my URL log contains over 70 thousand URLs, and mIRC gets through only about one thousand of them in 10 seconds. In Task Manager I can see mIRC reading the URL log at about 30 MB/s.

Worst of all, mIRC is completely jammed the whole time it searches.

It would surely improve performance enormously if you could load that large text file (~10 MB) into memory and read the lines from there.

I hope you can do something about this. Thanks! :-)

ern0
..and sorry for my bad English

Posted By: starbucks_mafia Re: Large text-files to Memory - 02/12/03 07:43 PM
There are several pre-existing possibilities for doing what you want. You could load the file into a...
a) Binary variable.
b) Hash table.
c) Hidden window.
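
The hash table route, for instance, might look roughly like this (a sketch only; the table name, file name and bucket count are just placeholders):

```mirc
; Load the whole log into a hash table once, then search it in memory.
; /hload -n loads the file as plain text: each line becomes the data
; of an item named 1, 2, 3, and so on.
alias loadurls {
  if ($hget(urls)) hfree urls
  hmake urls 1000
  hload -n urls url.log
}

; $hfind with the w switch and the .data property does a wildcard
; search on item data and returns the name of the Nth matching item.
alias findurl {
  var %item = $hfind(urls, $+(*,$1,*), 1, w).data
  if (%item) echo -a First match: $hget(urls, %item)
  else echo -a No match.
}
```

The whole search then runs against memory instead of the disk, at the cost of holding the full 10 MB table while the table exists.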
Posted By: Netchelandorious Re: Large text-files to Memory - 04/12/03 12:18 AM
70 thousand lines in a hidden window? I think that's a bit excessive... heh. Also, I've noticed a significant decrease in speed with the window functions since 6.1; I wouldn't recommend $fline for even one thousand lines.
Posted By: madewokherd Re: Large text-files to Memory - 05/12/03 09:22 PM
You're using the file-handling functions, not $read, right?
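
$read re-opens and re-scans the file on every call; the file-handling commands open it once and keep reading from the same handle. A minimal sketch (the handle and file names are just examples):

```mirc
; Search url.log for the word given in $1, listing up to 5 matches.
alias urlsearch {
  var %found = 0
  .fopen urls url.log
  while (!$feof) && (%found < 5) {
    var %line = $fread(urls)
    if ($+(*,$1,*) iswm %line) {
      echo -a %line
      inc %found
    }
  }
  .fclose urls
}
```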
Posted By: ern0 Re: Large text-files to Memory - 06/12/03 11:34 AM
Now I am ;-)

mIRC now gets through over five thousand URLs per second, and through the whole URL list in 13 seconds.

Thanks!
Posted By: ern0 Re: Large text-files to Memory - 06/12/03 01:33 PM
There is still a problem.

$fread starts reading lines from the beginning of the file, but all the new URLs are at the end. I use mIRC's own URL catcher, and there would be no point in scripting a new one, because I can't place new URLs at the beginning of the file without overwriting the old first line.

Is there any way of doing this? I didn't find anything useful in mIRC's help.

Posted By: madewokherd Re: Large text-files to Memory - 06/12/03 04:25 PM
/fseek <name> <position>

Sets the read/write pointer to the specified position in the file. The following switches may also be used to move the file pointer:

-l <name> <line number>

So you'd probably want to use that to move the pointer to the line where the *new* URLs start.
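
For example, to skip straight to the last 5000 lines before reading (sketch; assumes a handle named urls, and uses $lines to count the file):

```mirc
.fopen urls url.log
; Jump to 5000 lines before the end, then read forward from there.
fseek -l urls $calc($lines(url.log) - 5000)
while (!$feof) {
  var %line = $fread(urls)
  ; ... check %line for a match here ...
}
.fclose urls
```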
Posted By: Raccoon Re: Large text-files to Memory - 07/12/03 12:05 PM
Like madewokherd suggested, it would be faster if you started closer to the end of your file, then worked backwards in chunks until you have accumulated 5 matches. Of course, this is slightly complicated: it requires nested loops and keeping track of your position.

It may be easier to simply break your urls file up into multiple files, by month or something.
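
The backwards-in-chunks idea could be sketched roughly like this (illustrative only; %chunk is an arbitrary step size, and handling of the final partial chunk when %pos drops below line 1 is omitted for brevity):

```mirc
; Read the file backwards in chunks of %chunk lines, scanning each
; chunk forwards, until 5 matches for $1 have been accumulated.
alias revsearch {
  var %total = $lines(url.log), %chunk = 500, %found = 0
  var %pos = $calc(%total - %chunk + 1)
  .fopen urls url.log
  while (%found < 5) && (%pos >= 1) {
    fseek -l urls %pos
    var %i = 0
    while (%i < %chunk) {
      var %line = $fread(urls)
      if ($+(*,$1,*) iswm %line) inc %found
      inc %i
    }
    dec %pos %chunk
  }
  .fclose urls
}
```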

- Raccoon
Posted By: ern0 Re: Large text-files to Memory - 10/12/03 08:39 PM
"write -il1 file.txt" was the command i was looking for.. problem solved :-)
Posted By: Raccoon Re: Large text-files to Memory - 11/12/03 06:24 PM
Watch how long it takes to insert a line at the beginning of a 30 MB file. You know it has to rewrite the entire file from the ground up every time, right?
© mIRC Discussion Forums