Thanks once again Raccoon!

Since it doesn't sound like there's any real advantage to splitting everything into multiple ON TEXT events, I'll probably join the crowd that prefers keeping it all in one and try to minimize lines.

On a kinda similar note, is it better to have one hash table with long data for each item (say 100-300 characters), or a few hash tables with the data split between them (say 3 hash tables where each item holds 100 characters)? I don't know how mIRC manages hash table keys/items internally, but I'd assume a single $hget of 300 chars would be faster than 3 $hget calls of 100 chars each. I don't care about memory, but I do care about execution speed. Nobody likes a slow script. smile
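
To illustrate what I'm comparing, here's a rough sketch I could throw in an alias. The table names, item name, and loop count are just made up for the example, and $ticks is only a crude way to time it, but it shows the two layouts side by side:

alias hash.layout.demo {
  ; Option A: one table, one long value per item (~300 chars here)
  if ($hget(data)) hfree data
  hmake data 100
  hadd data SomeNick $str(a,100) $str(b,100) $str(c,100)

  ; Option B: three tables, each holding a ~100 char chunk
  if ($hget(data1)) hfree data1
  if ($hget(data2)) hfree data2
  if ($hget(data3)) hfree data3
  hmake data1 100 | hmake data2 100 | hmake data3 100
  hadd data1 SomeNick $str(a,100)
  hadd data2 SomeNick $str(b,100)
  hadd data3 SomeNick $str(c,100)

  ; crude timing: 10,000 lookups each way
  var %i = 1, %t = $ticks
  while (%i <= 10000) {
    var %x = $hget(data,SomeNick)
    inc %i
  }
  echo -a single table, 10000 lookups: $calc($ticks - %t) ms
  var %i = 1, %t = $ticks
  while (%i <= 10000) {
    var %x = $hget(data1,SomeNick) $hget(data2,SomeNick) $hget(data3,SomeNick)
    inc %i
  }
  echo -a three tables, 10000 lookups: $calc($ticks - %t) ms
}

My gut says the single-table version wins simply because it's one lookup instead of three, but I'd be happy to be corrected if the value length matters more than the number of calls.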