#180258 06/07/07 05:45 PM
Joined: Jan 2004
Posts: 509
Fjord artisan
OP Offline
Could there be an identifier that returns the line number written to by /write?

Kind of like how $readn is to $read, but a $writen for the line of file.txt that /write wrote to.

Why?

From the mIRC help file..

Quote:
/write -dstest c:\info.txt

This will scan file info.txt for a line beginning with the word "test" and if found, deletes it.


So if I have:

Nick:Ctime:Channel:text in seen.txt.

And I want to replace the old text with the new text in an on TEXT event, so I would want to..

/write -ds $+ $nick seen.txt $nick etc.

But the scan will never find a line that is just $nick; it will find $nick:$ctime:$chan, etc., so I have to...

/write -ds $+ $gettok(<the /write line>,1,58) seen.txt $nick etc..

Also..

In seen.txt, if I have:

Nick1:ctime:channel:text.
Nick2:ctime:channel:text.

And for an on text event, I have:

/write -s $+ $nick file.txt $nick etc.

Where $nick is Nick1.

That will overwrite Nick1's old text with the new text.

But if I delete the file...

Then I can't use /write -s because it won't find it.

I would have to /write seen.txt $nick etc.

So I would have to check that $nick is the 1st word of some line in seen.txt before I can /write -s it, because if $nick isn't in the file at all, the -s switch won't write anything.

So I'd like an identifier that tells whether or not /write actually wrote something.

I tried:

Code:
on *:text:*:#channel: { 
  write -w seen.txt $nick $+ : $+ [ $ctime ] $+ : $+ [ $target ] $+ : $+ [ $network ] $+ :saying " $+ [ $1- ] $+ "
  else { write -ws $+ $nick $+ : seen.txt $nick $+ : $+ [ $ctime ] $+ : $+ [ $target ] $+ : $+ [ $network ] $+ :saying " $+ [ $1- ] $+ " }
}


But of course, that won't work because else needs a previous if.

So maybe a:

/write -s seen.txt
if (!$writed) /write seen.txt

Any suggestions?

-Neal.

Last edited by LostShadow; 06/07/07 05:49 PM.
Joined: Jan 2004
Posts: 509
Fjord artisan
OP Offline
Greetings, it seems there are several issues regarding the /write command.

It seems that mIRC really slows down when you use the /write command on a large text file. When I had a file that was 15,000 KB (roughly 15 MB), my mIRC was just incredibly slow, and I had to Ctrl Break constantly until I stopped the seen script for a year. Over that year I went about editing the code to make it faster and more efficient. Still no luck.

Then I decided to delete the seen.txt file and start over with a completely new file. After that, everything went back to normal.

Now I'm at about 7,000 KB (half the original size), and my mIRC still often pauses for seconds at a time.

I think a lot of the problem has to do with /write -il1.

/write -il1 inserts text at the 1st line and moves everything else down, whereas /write just adds the text to the end of the file.

It seems that /write is much faster.

My theory is that /write -il1 inserts text at the 1st line, then moves the existing 1st line to the 2nd line, the existing 2nd line to the 3rd line, and so on until all the thousands of lines have moved down one line, just so the new 1st line can be added.

After seeing my mIRC slow down with pauses, I decided to remove the -il1 from /write -il1 to see if it made any difference. It seemed it did.

So if /write -il1 just moves the last line down one, the 2nd-to-last line down one, and so on until the 1st line moves to the 2nd line, and then adds the new text at the 1st line, well, that's way too slow.

A faster and more efficient way would be to just go to the 1st line, press Enter, then add the text to the 1st line for /write -il1. Of course, I don't know if that's possible...

And is my theory correct about /write -il1?

Heh.

I kind of wanted to cut the file in half, but highlighting from the bottom up is very, very slow for the thousands of lines the file has.

The reason I want:

Text3
Text2
Text1

Instead of..

Text1
Text2
Text3

In seen.txt is because $read() searches from the top down, making it faster to find the recent text than the old (Text1 is old, Text100 is new).

So if I wanted to use the latter order, I'd like mIRC to be able to $read() a file from the bottom up, which I don't think is possible unless I use a while loop, I guess.

And I was wondering if mIRC could (like it can with its logging functions) trim a file.txt to a maximum number of lines or bytes so the file doesn't grow forever.

In a way, I'd like to be able to save the last 5 lines of text from everyone in seen.txt, so if someone knows a scripting way to /write -d the 6th entry when there are already 5, that would be great, thanks.

-Neal.

Joined: Aug 2004
Posts: 7,252
Hoopy frood
Offline
/write is a slow command due to the fact that it writes to the drive, which is one of the slowest storage devices used in a modern computer (with floppy drives actually being the slowest).

Large files (especially that big) are going to be slow to work with.

If you need to read/write to a file that big, I suggest you look at /help File Handling.
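
For instance, a lookup with file handles might look roughly like this (untested sketch; the alias name and handle name are made up, and it assumes the Nick:Ctime:Channel:text format from the first post):

Code:
alias fhseen {
  ; open the file and make sure it actually opened
  fopen seenfile seen.txt
  if (!$fopen(seenfile)) { echo -a Could not open seen.txt | return }
  ; jump straight to the line starting with "nick:"
  fseek -w seenfile $+($1,:*)
  if (!$feof) echo -a $fread(seenfile)
  else echo -a No match for $1
  fclose seenfile
}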

Joined: Oct 2004
Posts: 8,330
Hoopy frood
Offline
Originally Posted By: LostShadow
Greetings, it seems there are several issues regarding the /write command.


No, there aren't. Not in the way that you mean. Yes, it's slow, but that's the nature of it and can't really be changed. That's what $fread, /fwrite, etc. are for, or even hash tables. The slow file access time on a huge file isn't a bug, nor is it a reason for a feature request. A $writen (equivalent to $readn) like you suggested in the first post in this thread is a feature request, but having difficulty handling large files is not... it's a script help request.

That said, if you have that large of a file for a seen script, then convert the seen script to use hash tables. Alternatively, use /fwrite as it's much faster than /write.

Of course, I don't really see why you're inserting a line if it's a seen script. You can always read the last line(s) of a file by using $lines and $read together. If you need more than just the last line, a loop that decrements a counter variable would do it. Of course, I still strongly recommend a hash table method.
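
For example, a quick sketch of that loop (reading the last 3 lines of seen.txt, newest first):

Code:
var %i = $lines(seen.txt)
var %stop = $calc(%i - 2)
while ((%i >= %stop) && (%i >= 1)) {
  echo -a $read(seen.txt,n,%i)
  dec %i
}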

If you want to delete a line, just check $lines, and if $lines > 5, use /write -dl6. If you start out with that in the code, you'll never have more than 5 lines.
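
In other words, something like this right after the /write that adds a line:

Code:
if ($lines(seen.txt) > 5) write -dl6 seen.txt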

EDIT:
Originally Posted By: LostShadow
So if I have:

Nick:Ctime:Channel:text in seen.txt.

And I want to replace the new text to old text on a on text event, I would want to..

/write -ds $+ $nick seen.txt $nick etc.

But the file will never find $nick, it will find $nick:$ctime:$chan, etc., so that I have too...

/write -ds $+ $gettok(<the /write line>,1,58) seen.txt $nick etc..


No, you wouldn't.

Just do this:
Code:
if ($read(seen.txt,s,$nick $+ :)) { write -dl $+ $readn seen.txt }
else write seen.txt Whatever you're writing when there isn't something to delete.

Last edited by Riamus2; 10/07/07 05:10 AM.

Invision Support
#Invision on irc.irchighway.net
Joined: Jan 2004
Posts: 509
Fjord artisan
OP Offline
Originally Posted By: Riamus2

Just do this:
Code:
if ($read(seen.txt,s,$nick $+ :)) { write -dl $+ $readn seen.txt }
else write seen.txt Whatever you're writing when there isn't something to delete.


Perfect, except that only finds the first time a word appears in the file. I kind of wanted it to count how many times a line-starting word is in there, and once it's in there 5 times, keep overwriting the oldest (5th) one.

Thanks.

-Neal.

Joined: Oct 2004
Posts: 8,330
Hoopy frood
Offline
If you have to use a text file, then why not consider an INI file?

[nick]
line1=jdfjds
line2=jdfjds
line3=jdfjds
line4=jdfjds
line5=jdfjds

You can then do 5 writeini's... copy 4 to 5, 3 to 4, 2 to 3, 1 to 2, and create a new 1. Whatever you do, a hash table will still be considerably faster.
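
A rough sketch of that shifting, assuming a seen.ini file and running inside the on TEXT event (so $nick, $chan and $1- are filled in):

Code:
; move line4 -> line5, line3 -> line4, ... line1 -> line2
var %i = 4
while (%i >= 1) {
  if ($readini(seen.ini,n,$nick,$+(line,%i)) != $null) writeini seen.ini $nick $+(line,$calc(%i + 1)) $v1
  dec %i
}
; then write the newest entry as line1
writeini seen.ini $nick line1 $ctime $chan $1-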

Note that you can do the same writeini idea by including a counter number on every nick... $nick1, $nick2, etc. Then, copy 4 to 5, 3 to 4, etc. That will probably be slower than the INI, though. And both are slower than the hash table.


Invision Support
#Invision on irc.irchighway.net
Joined: Dec 2002
Posts: 2,031
Hoopy frood
Offline
...or just use the nick as the item, $ctime as the data and throw ordering out the window. I would just use a hash table or at least use a @window.
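
A bare-bones version of the hash table approach (the table and file names are just placeholders) could be:

Code:
on *:START: {
  if (!$hget(seen)) hmake seen 100
  if ($exists(seen.hsh)) hload seen seen.hsh
}
on *:TEXT:*:#: hadd -m seen $nick $ctime $chan $1-
on *:EXIT: if ($hget(seen)) hsave seen seen.hsh

A lookup is then just $hget(seen,somenick), with no file scanning at all.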

Joined: Oct 2005
Posts: 1,741
Hoopy frood
Offline
The fact that /write is slow cannot be changed. If you opened a 15M file in notepad, altered it, and then saved it again, notepad would be just as slow as mIRC.

By the looks of your comments, it seems like your problems are because your 'seen' script is either inefficient or incorrectly adding names to your list. Assuming the average length of a Nick:ctime:channel:text line is 100-150 characters, that means you have 100000-150000 individual nicks listed. I find that hard to believe. I don't know how your script is working, but I don't think that it should have more than a few thousand entries at most. Without seeing your script, I can't judge whether it is working properly or not, but I would suspect not.

There are many ways that you could make the script more efficient. Including the methods that were mentioned above (hash tables being the most efficient), you could also split your single file into several smaller files. This could be done in several ways as well.

1. You could separate the lists by the first letter of the nickname, i.e. you could have seen.a.txt, seen.b.txt, seen.c.txt, etc. Since you always know the first character of the nick you are searching/updating, you always know which file you need to $read or /write. By splitting the large file into multiple files, you significantly speed up the search/update process.

var %file = $+(seen.,$lower($left($nick,1)),.txt)
var %read = $read(%file,w,$+($nick,:*))


2. Another option would be to create a SEEN folder and make a separate .ini or .txt file for each nickname. Obviously, this would speed up the search/update process because you wouldn't have to search anything. Personally, I would use .ini files for this method. Using ini files would also allow you to expand the seen information to include more data (last x quit messages, last x hosts, etc). You could then use that information to expand your seen script's features, or expand other parts of your scripts to use that same information. You could also create separate folders or separate headers within the ini to represent different networks if you use more than one network.

var %file = $+(seen/,$nick,.txt)
if ($exists(%file)) {
writeini %file $network lastseen $ctime
writeini %file $network lastmsg $1-
}

-genius_at_work

Joined: Jan 2004
Posts: 509
Fjord artisan
OP Offline
Wow, thanks Genius, I actually like your alphabet idea. Made a small correction to change seen.txt to $+(seen.,$lower($left($nick,1)),.txt) for the /write and $read(), and the /seen alias.

That just cuts my file into 26 uneven pieces. That means I can expect around 26x the time before I lag. I did think about making a file for every nick but, meh. Maybe someday if I join thousands of channels... wink

Joined: Jan 2004
Posts: 509
Fjord artisan
OP Offline
Just some minor problems, such as users starting their nick with the character |.

* /write: unable to open 'C:\mIRC\seen.|.txt' (line 786, seen.ini)

Then again, it's a very small issue. Maybe I could use $chr(124) or $(|) or something.

So I was wrong about the 26 parts, heh.

-Neal.

Joined: Dec 2002
Posts: 2,962
Hoopy frood
Offline
The simplest fix would be to use $mkfn($lower($left($nick,1))) - that way if the character is invalid in a filename it will be converted to an underscore. You'd have all chars like _|/\: in the same file, but those characters are relatively rare at the start of a nickname so it shouldn't make that file any larger than the others.
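
So the filename line from earlier would become:

Code:
var %file = $+(seen.,$mkfn($lower($left($nick,1))),.txt)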


Spelling mistakes, grammatical errors, and stupid comments are intentional.
Joined: Oct 2005
Posts: 1,741
Hoopy frood
Offline
I meant to make that change before I posted, but I guess I forgot. What can I say, it was 7am.

@OP

Even though that has 'fixed' your problem, you should ensure that your script isn't still making duplicate entries in your txt files. There may be bugs in your code that need to be fixed as well.

-genius_at_work

Joined: Jan 2004
Posts: 509
Fjord artisan
OP Offline
Okay I used the $mkfn().

And now there's an exploit in the seen script.

Good news is, my seen script provides wildcard matches.

I/anyone can do !seen *

Or !seen * *

Or !seen *?*?* *?*?* *?*?*

And it works instantly.

But if I/someone does...

!seen *?*?*?*?*?*?*?*?*?*?*?*?*?*?*?*?*?*?*?*?* (the exact amount I don't recall), my mIRC freezes for, say, hours.

And I don't know why... Ctrl Breaking does nothing.

-Neal.

Joined: Oct 2004
Posts: 8,330
Hoopy frood
Offline
The more wildcard matches you do, the more work mIRC has to do to find any matches. If you're concerned over that, then put a limit on the number of *'s or ?'s that are allowed.


Invision Support
#Invision on irc.irchighway.net
Joined: Oct 2005
Posts: 1,741
Hoopy frood
Offline
Put this in your !seen event:

Code:
if ($count($2-,*,?) > 5) { msg $nick Too many wildcards | return }



-genius_at_work

Joined: Apr 2003
Posts: 342
Fjord artisan
Offline
Use file handles!

/fopen
/fseek
/fwrite
/fclose

Use them! Use them! Use them!
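
For the writing side, appending a line with handles is roughly this (untested sketch, assumed to run inside an on TEXT event so $nick, $chan and $1- are available):

Code:
fopen seenfile seen.txt
if (!$fopen(seenfile)) return
; move the pointer to the end of the file, then write one line
fseek seenfile $file(seen.txt).size
fwrite -n seenfile $+($nick,:,$ctime,:,$chan,:,$1-)
fclose seenfile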


Beware of MeStinkBAD! He knows more than he actually does!

Link Copied to Clipboard