Thanks for the information. I will have to look into hash tables in general a bit more and do a little testing with large sets to find my respective answers.

Good idea with the timer and alias. I haven't started scripting anything yet; I like to plan the basic workings and order of coding before I actually start the code, in any language. I did consider using SQL tables due to the potentially large number of entries, and even though I always use MySQL for other things myself, I decided this requirement would act as nothing more than a deterrent for others who may wish to use the script.

Thanks again for the information. I have used hash tables in the past, but nothing of this size; I just wanted to do a little research into possible limitations and memory usage.

I am also considering using separate tables for different categories of data, such as mytable1 and mytable2, where 1 and 2 represent something in the data that can be checked with an if statement to determine which table to search with $hfind. Of course I would prefer not to take this route, but this depends entirely on the results of speed and CPU testing for very large tables. I assume the memory usage would in fact be less for a single table, since the same data is there but with less overhead to track (details of each table etc.), so that should not be a factor in this decision.
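For what it's worth, the per-category dispatch I have in mind would look something like this rough mIRC sketch. The table names (mytable1, mytable2) and the idea that the category number arrives as the first parameter are just assumptions for illustration, not settled design:

```
; Rough sketch only: route a lookup to a per-category hash table.
; Assumes tables were already created with /hmake mytable1, /hmake mytable2, etc.
alias lookup {
  ; $1 = category number (e.g. 1 or 2), $2- = text to search for
  var %table = mytable $+ $1
  ; $hget(name) returns the table name if it exists, $null otherwise
  if ($hget(%table)) {
    ; w flag: treat $2- as a wildcard pattern matched against item names
    return $hfind(%table, $2-, 1, w)
  }
  return $null
}
```

The if statement here only guards against a missing table; the actual category check could equally be an inline comparison on the data itself, which is what I'd want to benchmark against a single large table.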