#196712 22/03/08 04:36 AM
Mpot (OP) · Fjord artisan · Joined: Apr 2007 · Posts: 228
I wrote a script that basically writes numbers in sequential order to a text file. It's only doing around 50 lines a second. Can I make it go faster?

Code:
alias hunderedthousand {
  while (100000 > $lines(C:\IcyBot2\gazillion.txt)) { /write C:\IcyBot2\gazillion.txt $calc($lines(C:\IcyBot2\gazillion.txt) + 1) }
}


Also, is there a way to clock just exactly how many lines it's doing per second?

Last edited by Mpot; 22/03/08 04:41 AM.
Thrull · Vogon poet · Joined: Aug 2006 · Posts: 183
A bit faster, maybe. Stop making it evaluate $lines(C:\IcyBot2\gazillion.txt) all the time.

Code:
alias hunderedthousand {
  var %looplength = $lines(C:\IcyBot2\gazillion.txt)
  while (100000 > %looplength) {
    inc %looplength
    write C:\IcyBot2\gazillion.txt %looplength
  }
}


However, reading from and writing to a hard disk is usually the slowest part of a program. The problem is compounded because mIRC's scripting language is fairly slow to begin with.

I doubt you'll be able to do what you want within a decent timeframe.

To clock something, just assign $ticks to a variable when the script starts and again when it's done. Subtract the two and that's how many milliseconds it took.

Code:
var %time = $ticks
var %i = 1
while (%i < 100000) { inc %i }
echo -a $calc($ticks - %time)


Adjust as you see fit.

Blast, I totally forgot about $fread and its ilk. See below for a much better version. (I blame lack of sleep for this entirely.)

Last edited by Thrull; 22/03/08 05:35 AM.

Yar
Starbucks · Hoopy frood · Joined: Dec 2002 · Posts: 2,962
As Thrull mentioned, file access is a slow process in computing terms. Accessing a file involves three basic underlying steps: the file must first be opened, then you can perform any number of reads and/or writes on it, and finally it should be closed when you're done. /write is a self-contained command - it automatically opens the file, writes the necessary data, and closes the file afterwards. For simplicity that's great; for efficiency with repeated writes of multiple lines it's very bad, because it has to open and close the file once for each write. The $lines() identifier also opens the file, reads the number of lines, and closes it again. That means your current code is actually opening and closing the file three times for each line it writes.

As Thrull pointed out, using a counter variable instead of $lines() will be a great deal more efficient, since it removes the two unnecessary opens and closes per line that $lines() causes.

mIRC also has lower level file access commands that allow you to perform multiple writes more efficiently by opening and closing the file yourself. This way you open the file only once, perform all the writes, and close the file when you're done.

I've converted the code into three simple benchmark aliases to highlight the performance difference for each change to the code:
Code:
alias bm1 {
  var %start = $ticks
  while ($lines(gazillion.txt) < 10000) {
    write gazillion.txt $calc($lines(gazillion.txt) + 1)
  }
  echo -a Original benchmark took $calc(($ticks - %start) / 1000) seconds
}

alias bm2 {
  var %start = $ticks
  var %i = 1
  while (%i <= 10000) {
    write gazillion.txt %i
    inc %i
  }
  echo -a Write with counter variable benchmark took $calc(($ticks - %start) / 1000) seconds
}

alias bm3 {
  var %start = $ticks
  var %i = 1
  .fopen -o gaz gazillion.txt
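  ; the file is opened once under the handle "gaz" and stays open until .fclose; the loop bails out early if $ferr reports a file error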
  while (%i <= 10000) && (!$ferr) {
    .fwrite gaz %i $+ $crlf
    inc %i
  }
  .fclose gaz
  echo -a File handling benchmark took $calc(($ticks - %start) / 1000) seconds
}


(I used only 10,000 writes in the benchmarks - I didn't want to wait 5 minutes for the original to complete!)

Here's a sample of the results from my computer:
> /bm1
Original benchmark took 28.328 seconds
> /bm2
Write with counter variable benchmark took 4.031 seconds
> /bm3
File handling benchmark took 0.609 seconds

Obviously your figures will be slightly different but proportionately they should be roughly the same. Using file handling functions works out approximately 45x faster than your original code.


Spelling mistakes, grammatical errors, and stupid comments are intentional.
hixxy · Hoopy frood · Joined: Sep 2005 · Posts: 2,881
I was going to try to speed it up a little more by using /fwrite -n instead of $+ $crlf, but interestingly enough your method seems to be faster.

You can speed it up a little more by relying on an :error label instead of checking $ferr every time though.

Riamus2 · Hoopy frood · Joined: Oct 2004 · Posts: 8,330
Additionally, you can just use a custom window and /savebuf to write to the drive only once. Just make a hidden custom window and write all of the numbers to that, then /savebuf it to your file.

EDIT: Don't forget to close your custom window when completed.

Last edited by Riamus2; 22/03/08 12:16 PM.

Invision Support
#Invision on irc.irchighway.net
qwerty · Hoopy frood · Joined: Jan 2003 · Posts: 2,523
Originally Posted By: hixxy
I was going to try and speed it up a little more by using /fwrite -n instead of $+ $crlf but interestingly enough it seems faster to use your method.

-n is consistently faster here. Have you run the benchmarks multiple times to account for odd cases (e.g. CPU spikes caused by other programs), caching, etc.?


/.timerQ 1 0 echo /.timerQ 1 0 $timer(Q).com
hixxy · Hoopy frood · Joined: Sep 2005 · Posts: 2,881
I even made sure that my alias was called before Starbucks' so that my system wasn't recovering from running his alias.

Code:
alias bm3 {
  var %start = $ticks
  var %i = 1
  .fopen -o gaz gazillion.txt
  while (%i <= 10000) && (!$ferr) {
    .fwrite gaz %i $+ $crlf
    inc %i
  }
  .fclose gaz
  echo -a (Starbucks) File handling benchmark took $calc(($ticks - %start) / 1000) seconds
}

alias bm4 {
  var %start = $ticks
  var %i = 1
  .fopen -o gaz gazillion.txt
  while (%i <= 10000) && (!$ferr) {
    .fwrite -n gaz %i
    inc %i
  }
  .fclose gaz
  echo -a (hixxy) File handling benchmark took $calc(($ticks - %start) / 1000) seconds
}

alias bm {
  var %i = 10
  while (%i) {
    bm4
    bm3
    dec %i
  }
}


...but I consistently get slower results with /fwrite -n:

Quote:
(hixxy) File handling benchmark took 1.016 seconds
(Starbucks) File handling benchmark took 0.843 seconds
(hixxy) File handling benchmark took 1.016 seconds
(Starbucks) File handling benchmark took 0.844 seconds
(hixxy) File handling benchmark took 1.031 seconds
(Starbucks) File handling benchmark took 0.844 seconds
(hixxy) File handling benchmark took 1.031 seconds
(Starbucks) File handling benchmark took 0.828 seconds
(hixxy) File handling benchmark took 1.015 seconds
(Starbucks) File handling benchmark took 0.844 seconds
(hixxy) File handling benchmark took 1.031 seconds
(Starbucks) File handling benchmark took 0.828 seconds
(hixxy) File handling benchmark took 1.016 seconds
(Starbucks) File handling benchmark took 0.844 seconds
(hixxy) File handling benchmark took 1.015 seconds
(Starbucks) File handling benchmark took 0.844 seconds
(hixxy) File handling benchmark took 1.016 seconds
(Starbucks) File handling benchmark took 0.844 seconds
(hixxy) File handling benchmark took 1.031 seconds
(Starbucks) File handling benchmark took 0.828 seconds


And then ones that use an error label:

Code:
alias bm3 {
  var %start = $ticks
  var %i = 1
  .fopen -o gaz gazillion.txt
  while (%i <= 10000) {
    .fwrite -n gaz %i
    inc %i
  }
  .fclose gaz
  echo -a (fwrite -n) File handling benchmark took $calc(($ticks - %start) / 1000) seconds
  return
  :error
  reseterror
  .fclose gaz
}

alias bm4 {
  var %start = $ticks
  var %i = 1
  .fopen -o gaz gazillion.txt
  while (%i <= 10000) {
    .fwrite gaz %i $+ $crlf
    inc %i
  }
  .fclose gaz
  echo -a (CRLF) File handling benchmark took $calc(($ticks - %start) / 1000) seconds
  return
  :error
  reseterror
  .fclose gaz
}

alias bm {
  var %i = 10
  while (%i) {
    bm4
    bm3
    dec %i
  }
}


Quote:
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.921 seconds
(CRLF) File handling benchmark took 0.766 seconds
(fwrite -n) File handling benchmark took 0.953 seconds
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.937 seconds
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.953 seconds
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.938 seconds
(CRLF) File handling benchmark took 0.765 seconds
(fwrite -n) File handling benchmark took 0.938 seconds
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.953 seconds
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.922 seconds
(CRLF) File handling benchmark took 0.75 seconds
(fwrite -n) File handling benchmark took 0.937 seconds
(CRLF) File handling benchmark took 0.766 seconds
(fwrite -n) File handling benchmark took 0.937 seconds


My computer isn't under any serious amount of stress, but considering the number of times I've run the benchmarks it's very unlikely that -n would get unlucky that many times.

Mpot (OP) · Fjord artisan · Joined: Apr 2007 · Posts: 228
0_0

Thanks for the advice. It looks like benchmarking is a highly debated subject.

Anyway, Riamus: would that considerably lower the time taken, since it wouldn't have to continually write to disk? I've never really dealt with custom windows or the buffer. How would I do that? Something like /savebuf?

Edit: /window -e @hunderedthousand

I used the -e switch so I could type in the /savebuf command afterwards. However, I'm still not sure how to edit the script.

Code:
alias hunderedthousand {
  var %looplength = $lines(@hunderedthousand)
  while (100000 > %looplength) { /write @hunderedthousand $calc($lines(@hunderedthousand) + 1) }
}


Would that work? I've really got no idea.

Last edited by Mpot; 22/03/08 03:29 PM.
jaytea · Fjord artisan · Joined: Feb 2006 · Posts: 546
hey there guys! i'm afraid this rather ridiculous adaptation of your code has you both ousted ;D

Code:
alias bm5 {
  var %start = $ticks
  var %i = 0
  .fopen -o gaz gazillion.txt
  while (%i < 10000) && (!$ferr) {
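    ; $str($crlf,100) is 100 blank lines; /^/gm matches the start of each one, and \n (the match number) lets $calc() prepend %i + 1 through %i + 100, so a single .fwrite emits 100 numbered lines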
    .fwrite gaz $regsubex($str($crlf,100),/^/gm,$calc(%i + \n))
    inc %i 100
  }
  .fclose gaz
  echo -a (jaytea) File handling benchmark took $calc(($ticks - %start) / 1000) seconds
}


Quote:

(hixxy) File handling benchmark took 0.5 seconds
(Starbucks) File handling benchmark took 0.469 seconds
(jaytea) File handling benchmark took 0.265 seconds


long live $regsubex()!


"The only excuse for making a useless script is that one admires it intensely" - Oscar Wilde
hixxy · Hoopy frood · Joined: Sep 2005 · Posts: 2,881
I need to learn how to think outside the box more :D

Even cheating can't top that :(

Code:
alias bm6 {
  var %start = $ticks, %i = 0
  .fopen gaz gazillion.txt
  while (%i < 10000) {
    .fwrite gaz $regsubex($str($crlf,100),/^/gm,$calc(%i + \n))
    inc %i 100
  }
  echo -a (hixxy+jaytea) File handling benchmark took $calc(($ticks - %start) / 1000) seconds
  .fclose gaz
  .remove gazillion.txt
  return
  :error
  reseterror
  .fclose gaz
  .remove gazillion.txt
}

qwerty · Hoopy frood · Joined: Jan 2003 · Posts: 2,523
Code:
alias bm7 {
  var %start = $ticks
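  ; build &a as 10,000 $lf bytes: 900 to start, doubled onto itself until it holds 7200, with the final 2800 appended below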
  bset -t &a 1 $str($lf,900)
  while ($bvar(&a,0) < 7200) bcopy &a $calc($v1 + 1) &a 1 -1
  bcopy &a 7201 &a 1 2800
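  ; wipe the file and write all 10,000 blank lines to it in one go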
  btrunc gazillion.txt 0
  bwrite gazillion.txt 0 -1 &a
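  ; /filter -ffnc runs the file back onto itself: -n prefixes each line with its line number, so the blank lines become 1 through 10000 (-c clears the output first)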
  filter -ffnc gazillion.txt gazillion.txt
  echo -a (qwerty) Filter benchmark took $calc(($ticks - %start) / 1000) seconds
}

Quote:
(jaytea) File handling benchmark took 0.703 seconds
(qwerty) Filter benchmark took 0.078 seconds

:D


/.timerQ 1 0 echo /.timerQ 1 0 $timer(Q).com
jaytea · Fjord artisan · Joined: Feb 2006 · Posts: 546
hoooh, magnanimous!


"The only excuse for making a useless script is that one admires it intensely" - Oscar Wilde
jaytea · Fjord artisan · Joined: Feb 2006 · Posts: 546
something just came to mind, i think it can be perfected with:

Code:
bset &a 10000 0
breplace &a 0 10
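; setting byte 10,000 grows &a to that size (the new bytes are 0), and breplace then turns every 0 into 10 ($lf), giving the 10,000 blank lines without a loop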


to fill the binvar ;D


"The only excuse for making a useless script is that one admires it intensely" - Oscar Wilde
qwerty · Hoopy frood · Joined: Jan 2003 · Posts: 2,523
Nice one!


/.timerQ 1 0 echo /.timerQ 1 0 $timer(Q).com
Riamus2 · Hoopy frood · Joined: Oct 2004 · Posts: 8,330
Here's the /savebuf test:

Code:
alias bm9 {
  window -h @bm
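  ; -h creates the custom window hidden; each number is added to its buffer with /aline, then the whole buffer is written to disk once with /savebuf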
  var %start = $ticks
  var %i = 0
  while (%i < 10000) {
    aline @bm %i
    inc %i 1
  }
  savebuf @bm bm.txt
  echo -a File handling benchmark took $calc(($ticks - %start) / 1000) seconds
  window -c @bm
}


After multiple tests, times range from 0.312 to 0.343 seconds.

Mpot, if you use this method, just remove the echo and rename the window, the filename, and the alias as desired.


Invision Support
#Invision on irc.irchighway.net
Thrull · Vogon poet · Joined: Aug 2006 · Posts: 183
You guys scare me sometimes. Ever think of combining your brain power and curing cancer or something? :)


Yar
Riamus2 · Hoopy frood · Joined: Oct 2004 · Posts: 8,330
I know nothing of medicine. It wouldn't be much use. :)


Invision Support
#Invision on irc.irchighway.net
