Hi,
From what I gather, it's good practice for speed to keep the number of $hget calls to a minimum. If I have a hash with, say, 10 sub-values but I'd only access 2-3 of them at a time, would it be better to keep them all as separate hash table entries, or would it be faster to store them all in a single entry with a field separator and then $gettok out the fields I need? The total data for each entry plus its sub-entries would be about 500 characters or less, with probably no more than 250 different datasets. That works out to 11 table entries per dataset * 250 datasets = 2,750 total table entries, and 250 datasets * 500 characters = 125,000 characters for a full table.
So for example, the hash table might look like:
item1: item description
item1.data1: some value
item1.data2: some value
item1.data3: some value
...
item1.data10: some value
var %data3 = $hget(table,item1.data3), %data5 = $hget(table,item1.data5), %data10 = $hget(table,item1.data10)
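(For reference, I'd be populating the separate-entry layout with something like this; the table and item names are just from the example above:)

```
hmake table 10
hadd table item1 item description
hadd table item1.data1 some value
hadd table item1.data2 some value
```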
vs.
item1: description|value1|value2|value3|...|value10
var %temp = $hget(table,item1)
var %field3 = $gettok(%temp,3,124), %field5 = $gettok(%temp,5,124), %field10 = $gettok(%temp,10,124)
I'm guessing the single $hget into the %temp var followed by multiple $gettok's through 500 characters would be faster than multiple $hget's against a table holding up to 125,000 characters. But that's only a guess, and at this table/data size, is speed even much of a factor?
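In case it helps, here's a rough benchmark sketch I could run to time the two approaches with $ticks (the alias name benchlookups and the 10,000-iteration count are just placeholders I made up; it assumes the table is already populated with both layouts shown above):

```
alias benchlookups {
  ; time the separate-entry layout: 3 $hget calls per pass
  var %i = 1, %t = $ticks
  while (%i <= 10000) {
    var %a = $hget(table,item1.data3), %b = $hget(table,item1.data5), %c = $hget(table,item1.data10)
    inc %i
  }
  echo -a separate entries: $calc($ticks - %t) ms

  ; time the single-entry layout: 1 $hget plus 3 $gettok calls per pass
  var %i = 1, %t = $ticks
  while (%i <= 10000) {
    var %temp = $hget(table,item1)
    var %a = $gettok(%temp,3,124), %b = $gettok(%temp,5,124), %c = $gettok(%temp,10,124)
    inc %i
  }
  echo -a single entry + $gettok: $calc($ticks - %t) ms
}
```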
Thanks!