Could you supply an example of how your code is "corrupting" a file via hash table usage?

Because I can copy a very large (well over 2 MB) image into a binvar, put it into a hash table, extract it from the hash table, and then output the image just fine. See the code below.

Code:
alias makenew {
  ; prompt for a file; %qs is the quoted path for commands that need it
  var %s = $$sfile($mircdir) , %qs = $qt(%s)
  ; swap spaces in the filename for underscores so it stays a single token when passed to the signal
  var %path = $nofile(%s) , %file = $replace($nopath(%s),$chr(32),_)

  echo -a *** Makenew: Reading from: %qs
  ; read the entire file into the &in binvar
  bread %qs 0 $file(%qs) &in

  ; store the binvar as a hash table item (-b = binary data)
  hadd -mb bin test &in
  .signal createfile %file %path
}
on *:signal:createfile:{
  ; pull the binary data back out of the hash table into &out
  noop $hget(bin,test,&out)
  ; build the output name: path + $ticks + . + original filename
  var %f = $qt($+($2-,$ticks,.,$1))

  echo -a *** Makenew: Writing to: %f
  ; write the whole binvar (length -1) to the new file
  bwrite %f 0 -1 &out

  hfree bin
}

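A note on the $replace in there: the spaces in the filename are swapped for underscores so that, when the name is passed through .signal, it arrives as the single token $1, and the directory (which may itself contain spaces) can be grabbed with $2-.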

Edit: I've tested this on a 100 MB video as well, and it works fine. So I'm not sure why you think there's a file size limit. As far as I know, hash table binary size is limited -only- by available memory.
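If it helps narrow things down on your end, here's a rough sketch for checking whether the written copy actually matches the original. It assumes $md5 with the filename mode (,2) is available in your mIRC version and hashes the file's contents; the checkcopy name is just something I made up for the example.

Code:
alias checkcopy {
  ; pick the original file, then the copy the script wrote out
  var %a = $$sfile($mircdir,Select the original file)
  var %b = $$sfile($mircdir,Select the written copy)

  ; $md5(filename,2) should hash the file contents directly
  if ($md5(%a,2) == $md5(%b,2)) echo -a *** Checkcopy: checksums match - no corruption
  else echo -a *** Checkcopy: checksums differ
}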
