Hash table/binary file size limit #187396 04/10/07 09:07 PM
hero12 (OP)
Pikka bird
Joined: Oct 2006
Posts: 14
Hash tables saved with the -b (binary) option skip item data longer than 65535 bytes (instead of truncating it) but still save the item name (instead of omitting the entire entry). When the table is read back, the next item's name is read as the missing data, so the rest of the table is useless: the loaded data is corrupt.

It is not seen often, because anything larger than 64 KB (such as an image) would normally be written to its own file rather than stored in a hash table.

I can see this is potentially hard to fix; the problem lies in the design of hash table saves. Correcting it would require a length field wider than 16 bits, and the resulting file format would be incompatible with other versions.
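mIRC's actual on-disk format for /hsave -b isn't documented in this thread, so the following Python sketch is only a toy model of the failure mode described above: a writer with a 16-bit length field that skips oversized data but still emits the item name leaves the reader desynchronized, and every entry after the oversized one is mangled or lost.

```python
import io
import struct


def save(entries, buf):
    # Toy length-prefixed format (NOT mIRC's real format): for each
    # (name, data) pair, write a 16-bit name length, the name, a
    # 16-bit data length, then the data.
    for name, data in entries:
        buf.write(struct.pack(">H", len(name)) + name)
        if len(data) > 0xFFFF:
            # Mimic the reported bug: the name has already been
            # written, but the oversized data record is skipped.
            continue
        buf.write(struct.pack(">H", len(data)) + data)


def load(buf):
    entries = []
    while True:
        hdr = buf.read(2)
        if len(hdr) < 2:
            break
        name = buf.read(struct.unpack(">H", hdr)[0])
        dhdr = buf.read(2)
        if len(dhdr) < 2:
            break
        entries.append((name, buf.read(struct.unpack(">H", dhdr)[0])))
    return entries


buf = io.BytesIO()
save([(b"big", b"x" * 70000), (b"next", b"ok")], buf)
buf.seek(0)
print(load(buf))  # the reader swallows b"next" as b"big"'s data
```

Reading back, the orphaned name header for "big" consumes the next entry's name as its data, exactly as reported: one oversized item silently destroys every entry that follows it.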

Re: Hash table/binary file size limit [Re: hero12] #187403 04/10/07 09:50 PM
Rand
Fjord artisan
Joined: Feb 2005
Posts: 342
Could you supply an example of how you're "corrupting" a file with your code via hash table usage?

Because I can copy a very large (well over 2 MB) image into a binvar, put it into a hash table, extract it from the hash table, and then output the image just fine. See code below.

Code:
alias makenew {
  ; prompt for a file, then read its entire contents into &in
  var %s = $$sfile($mircdir) , %qs = $qt(%s)
  var %path = $nofile(%s) , %file = $replace($nopath(%s),$chr(32),_)

  echo -a *** Makenew: Reading from: %qs
  bread %qs 0 $file(%qs) &in

  ; store the binvar in a hash table, then hand off to the signal
  hadd -mb bin test &in
  .signal createfile %file %path
}
on *:signal:createfile:{
  ; pull the binvar back out of the hash table and write it to disk
  noop $hget(bin,test,&out)
  var %f = $qt($+($2-,$ticks,.,$1))

  echo -a *** Makenew: Writing to: %f
  bwrite %f 0 -1 &out

  hfree bin
}


Edit: I've tested this on a 100 MB video as well. Works fine. So I'm not sure why you think there's a file size limit. As far as I know, hash table binary size is limited -only- by memory.

Last edited by Rand; 04/10/07 10:09 PM.
Re: Hash table/binary file size limit [Re: Rand] #187419 05/10/07 02:37 AM
hero12 (OP)
Pikka bird
Joined: Oct 2006
Posts: 14
I did stress that the corruption happens when saving the hash table with the -b option and subsequently loading it again, i.e. the /hsave -b command.
The original hash table in memory is left unaffected.

Re: Hash table/binary file size limit [Re: Rand] #187458 05/10/07 06:16 PM
Sais
Fjord artisan
Joined: Oct 2003
Posts: 313
I think I've reproduced the error:
Code:
alias largehash.write {
  var %n = $$1, %i = 1

  ; fill &b with %n alternating test bytes (0xA5 / 0x5A)
  while (%i <= %n) {
    bset &b %i $base($iif($isbit(%i,1),A5,5A),16,10)
    inc %i
  }
  if ($hget(largehash)) { hfree largehash }
  hmake largehash
  hadd -sb largehash item &b
  noop $hget(largehash,item,&c)
  echo -a largehash:item is $bvar(&c,0) in size
  hsave -sb largehash largehash.hash
}

alias largehash.read {
  if ($hget(largehash)) { hfree largehash }
  hmake largehash
  hload -sb largehash largehash.hash
  noop $hget(largehash,item,&b)
  echo -a largehash:item is $bvar(&b,0) in size
}


Code:
/largehash.write 65535
-> * Added item 'item' to hash table 'largehash'
-> largehash:item is 65535 in size
-> * Saved hash table 'largehash' to 'largehash.hash'

/largehash.read
-> * Loaded hash table 'largehash' from 'largehash.hash'
-> largehash:item is 65535 in size

/largehash.write 65536
-> * Added item 'item' to hash table 'largehash'
-> largehash:item is 65536 in size
-> * /hsave: error saving hash table 'largehash' to 'largehash.hash' (line 13, test.mrc)


It would appear that there is some issue with /hsave'ing binary items larger than 65535 bytes.

Last edited by Sais; 05/10/07 06:26 PM.

Sais
Re: Hash table/binary file size limit [Re: Sais] #187459 05/10/07 06:30 PM
Sais
Fjord artisan
Joined: Oct 2003
Posts: 313
As qwerty pointed out:
Quote:
32. /hsave -b now displays a warning when it is unable to save data that is longer than 65535 bytes.

(from versions.txt for 6.17)

... presumably because there is a length field in the data format that is 2 bytes long (and hence can only encode lengths of 0-65535 bytes).
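That matches the arithmetic of a 2-byte unsigned length field. A quick Python check (purely illustrative, using the struct module; nothing here is mIRC's actual code) shows both the hard cap and what a silent wrap-around would look like:

```python
import struct

# An unsigned 16-bit field tops out at 65535.
assert struct.pack(">H", 65535) == b"\xff\xff"

# 65536 simply cannot be encoded in 2 bytes; struct refuses.
try:
    struct.pack(">H", 65536)
    raise AssertionError("65536 should not fit in a 16-bit field")
except struct.error:
    pass

# A writer that masked the length instead of erroring would wrap
# around, recording a 65536-byte item as 0 bytes long.
assert 65536 & 0xFFFF == 0
assert 70000 & 0xFFFF == 4464
```

Either behaviour (refusing, as 6.17's warning does, or wrapping) is consistent with the 65535-byte ceiling seen in the test output above.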

Last edited by Sais; 05/10/07 06:32 PM.

Sais
Re: Hash table/binary file size limit [Re: Sais] #187470 05/10/07 11:42 PM
hero12 (OP)
Pikka bird
Joined: Oct 2006
Posts: 14
I can now reproduce the warning message, but I can also get the error without a warning. The only difference is in the contents of the table. When I work out what that difference is, I shall get back.

Re: Hash table/binary file size limit [Re: hero12] #187492 06/10/07 06:46 AM
Rand
Fjord artisan
Joined: Feb 2005
Posts: 342
Ah, I was assuming you meant you were saving the binvar into the hash table with -b.

I don't think I've ever had a reason to /hsave a large binary hash table. :|