Ok, I've been converting chunks of script that read from large files over to hash tables, and I've got this little piece I've been wondering about:
on $*:text:$($catchURLex):#:{
  ; ops and nicks with the per-channel permission variable set are exempt
  if ($nick isop $chan) { return }
  if ($($+(%,permission.,$nick,.,$chan),2) == On) { return }
  var %urls $extractURL($1-)
  if (!%urls) { return }
  ; scan the per-channel whitelist file (one nick per line) for the nick; kick if not found
  if ($read($($+(Storage\whitelist.,$chan,.txt)), w, $nick)) { return }
  else { addcommand kick # $nick Kicked for url while not whitelisted | halt }
}
In some cases the whitelist for a channel can become quite large. Should I just make a hash table for this and destroy it afterwards? Would that actually be faster?
It seems to me that, as it is now, it starts at the top of the list and works its way down until it finds a match, then stops. But if I loaded a whole hash table, it would have to go over the whole list first to build the table, and then compare? I'm not really familiar with the inner workings of this, so any insight would be nice.
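For reference, this is roughly what I was picturing, just a rough sketch and untested. The loadwhitelist alias name, the whitelist.<#channel> table name, and the bucket count are my own guesses; the one-nick-per-line file format is the same as what I have now:

alias loadwhitelist {
  ; $1 = channel name; (re)builds a hash table named whitelist.<#channel> from the file
  var %file = $+(Storage\whitelist.,$1,.txt)
  var %table = $+(whitelist.,$1)
  if ($hget(%table)) { hfree %table }
  hmake %table 100
  var %i = 1, %total = $lines(%file)
  while (%i <= %total) {
    ; each line of the file is one whitelisted nick; the data value is just a placeholder
    hadd %table $read(%file, n, %i) 1
    inc %i
  }
}

and then in the on TEXT event the file scan would become a single lookup:

  if ($hget($+(whitelist.,$chan), $nick)) { return }

with an hfree $+(whitelist.,$chan) right after the check if I go with the build-and-destroy approach, or I could just keep the table loaded and only rebuild it when the whitelist file changes.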