Hello,

I've found an issue with High Resolution Timers in mIRC.

The minimum delay value that a timer can actually reach is 15.3846 ms on Windows XP. That means a timer -h can launch an alias at most 65 times per second (tested on many WinXP PCs, with Athlons and Pentiums up to 3 GHz, so it is not a matter of CPU speed).
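
A minimal counter script along these lines can be used to check the firing rate (alias and variable names are only an example):

alias hrtest {
  set %hr.count 0
  ; endless -h timer with a 1 ms interval, calling the counting alias
  .timerhr -h 0 1 hrtick
  ; a normal millisecond timer stops the test and reports after 1000 ms
  .timerhrstop -m 1 1000 hrreport
}
alias hrtick { inc %hr.count }
alias hrreport {
  .timerhr off
  echo -a The -h timer fired %hr.count times in one second
  unset %hr.count
}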

So this 15 ms resolution is the same as the millisecond timer (timer -m), which means there is no advantage in using -h.

But on Windows 98, on a slow Pentium II, a timer -h can launch an alias twice as often as on WinXP, so up to 130 times per second.

So High Resolution Timers, which are supposed to be performance related, are much better on a PII 366 MHz with Win98 than on a 3 GHz Pentium 4 with WinXP!

I didn't test this issue on Win2k or Me.

This is really a problem when using /timer -h 0 0 (an endless timer with a 0 ms delay), because it is actually more efficient on old PCs running Win9x!

So the resolution of the -h timers looks like this:

Windows XP : 15 ms = max. 65x/sec
Windows 98 : 7.5 ms = max. 130x/sec

Timer -h is perfect for picwin animations and games, because it can launch the main drawing alias many times per second without using while loops, which eat 100% of the CPU. With /timer -h a demo can "eat" only 2% of the CPU.

But with this 15 ms resolution limitation, the animation is capped at 65 FPS, even on the fastest machines available.
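
For reference, here is a rough sketch of the kind of picwin animation loop I mean (the window name, alias names and drawing code are only placeholders):

alias demostart {
  ; open a 300x200 picture window
  window -p @demo -1 -1 300 200
  set %demo.x 0
  ; fire the drawing alias as often as the timer resolution allows
  .timerdemo -h 0 1 demoframe
}
alias demoframe {
  inc %demo.x
  if (%demo.x > 300) set %demo.x 0
  ; clear the window and draw a moving red square
  clear @demo
  drawrect -rf @demo 255 1 %demo.x 90 20 20
}
alias demostop { .timerdemo off | window -c @demo | unset %demo.x }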