
swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

I tried to make a hash table of size 10000 so that I could store 100,000 items from $findfile, but it failed. Does that mean the size is limited to 9999?

R - Hoopy frood - Joined: Aug 2004 - Posts: 7,252

Remember that hash tables are stored in RAM, so the amount of RAM you have is also a factor.

With 216MB of available RAM, I was able to make and fill a hash table with 10,000 slots (100,000 entries), the entries being of random lengths between 10 and 200 characters.

Note: This appears to have caused my available RAM to drop to 100MB.

Hoopy frood - Joined: Oct 2004 - Posts: 8,330

If you give us your code, perhaps we can help further.


Invision Support
#Invision on irc.irchighway.net
swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

If RAM is the factor, I have 1300MB of free RAM, so why can't I make a table of size 20000? //hmake -s sample 20000

* /hmake: invalid parameters

But I can create a table of size 10000.

S - Hoopy frood - Joined: Dec 2002 - Posts: 2,962

The highest number of slots (the N parameter to /hmake) you can assign to a hash table is 10000. Regardless of that, there is no limit on the number of items that can be stored in a hash table. The overall size of a hash table is limited by system memory, but that doesn't affect the slot limit in any way.

If you want an explanation of what the slots parameter actually is and how it affects the efficiency of a hash table in relation to the number of items, see this post.
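For example (an untested illustration; the table names are throwaway placeholders), you can see the boundary from the editbox:

//hmake -s slotcap1 10000
//hmake -s slotcap2 10001

The first command should succeed, while the second should be rejected with the same "* /hmake: invalid parameters" error you got for 20000.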


Spelling mistakes, grammatical errors, and stupid comments are intentional.
swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

Well, thanks for those replies. Now I have a problem implementing $findfile with hash tables.

$findfile(dir,wildcard,N,depth,@window | command)

I have tried with my limited scripting knowledge and failed... can someone demo the code?

D - Fjord artisan - Joined: Jun 2006 - Posts: 508

We would need to know what you plan to use as the item, and what as the data. Basically, it's going to be $findfile(dir,filemask,N,[depth,]hadd table item data)
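For instance, a minimal one-line sketch (the table name "songs", the path and the filemask are made up for illustration; -m creates the table if it doesn't exist yet):

Code:
noop $findfile(C:\Music\,*.mp3,0,hadd -m songs $nopath($1-) $1-)


Here the item is the bare filename and the data is the full path. As the next replies point out, an item name gets cut off at the first space, so a counter or a space substitution is often the safer choice.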

R - Hoopy frood - Joined: Aug 2004 - Posts: 7,252

Note that if you plan to use $1- as the data, it will be the full name of the file found, including the directory information.

If you want to use $1- as the item, then you're only going to get up to the first space character. Putting the information into quotes does not override this.
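A quick way to see this from the editbox (the table name here is a throwaway placeholder):

//hadd -m spacedemo My Documents.txt | echo -a Item is $hget(spacedemo,1).item and data is $hget(spacedemo,1).data | hfree spacedemo

The item comes out as just "My", while "Documents.txt" ends up as the data.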

R - Fjord artisan - Joined: Feb 2005 - Posts: 342

Although, if you really need to have spaces in the item name, for the path or filename: paths and filenames can't contain certain characters such as ">", so all you have to do is replace spaces with ">", and then when you want to use that data (display it), change the ">" back to a space.

$replace($1-,$chr(32),>)

(This isn't just for RusselB, mind you.)
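Putting that together, a rough, untested sketch (the alias name, the table "paths", and the directory are all placeholders):

Code:
alias addpaths {
  ; store: spaces in each full path become ">" so the whole path survives as the item name
  noop $findfile(C:\SOMEPATH\,*,0,hadd -m paths $replace($1-,$chr(32),>) 1)
  ; display: turn ">" back into spaces when reading an item out
  echo -a First stored path: $replace($hget(paths,1).item,>,$chr(32))
}


The data here is just a dummy 1, since the full path is carried in the item name itself.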

swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

Sorry guys, what are item and data? Can I skip the item and just save everything as data, like /var %a = $findfile(c:,*,0,hadd sample $1-)? But when testing with //echo -a $hfind(database,*files*,w), it returns 0...


R - Fjord artisan - Joined: Feb 2005 - Posts: 342

If you can reply to this quickly, maybe I can help you (I need to sleep real soon).

What are you trying to store? I'm not sure what you're trying to put into the hash table.

Try to be specific. I need the "what and why" before I can really help you.

Edit:

I guess I'll just give you this since I need to sleep; not sure if it's what you want or not:

Code:
alias getfiles {
  ; start fresh if the table already exists
  if ($hget(file)) { hfree file }
  hmake file 10000
  ; item = running counter, data = full path of each file found
  noop $findfile(C:\SOMEPATH\,*,0,hadd file $calc($hget(file,0).item + 1) $1-)
  echo -a Total amount of files added: $hget(file,0).item
}


Change "C:\SOMEPATH\" to the path that you want.

You may wish to specify a "depth" parameter.

Put that in your remote script, then type: /getfiles

You can then use: $hget(file,1) to get the filename of the first file. $hget(file,3) to get the third filename, etc.
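If it helps, a tiny helper along the same lines (the alias name is just a suggestion; it assumes the "file" table built by /getfiles):

Code:
alias showfile {
  ; /showfile 25 echoes the filename stored under item 25
  if ($hget(file,$$1)) { echo -a $v1 }
  else { echo -a No file stored under item $1 }
}
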

Last edited by Rand; 30/09/07 08:53 AM.
swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

Hello, thanks for the code. Actually, I chose $findfile to create a large virtual database, since apart from $findfile I don't have any other source for building a large database.

Your code does what I want, but how do I search through the hash table? Let's say I want to know how many DLL files are in my hash table; how do I implement that?

S - Hoopy frood - Joined: Dec 2002 - Posts: 3,547

Code:
alias getfile {
  ; /getfile dll -- lists every stored filename matching *.dll, then shows the total
  var %total = $hfind(file, $+(*.,$$1), 0, w).data, %x = 1
  while (%x <= %total) {
    ; the %x-th match returns an item name; $hget turns it back into the stored filename
    echo -a $hget(file, $hfind(file, $+(*.,$$1), %x, w).data)
    inc %x
  }
  echo -a Total amount of $1 files: %total
}


/getfile dll

D - Fjord artisan - Joined: Jun 2006 - Posts: 508

This should be a LOT quicker...
Code:
alias loadfiles {
  ; free any previous table and close any leftover window
  if $hget(@files) { hfree @files }
  close -@ @files
  ; collect the filenames into a hidden listbox window and save it to disk
  window -hl @files
  noop $findfile(C:\,*,0,@files)
  savebuf @files @files.txt
  close -@ @files
  ; bulk-load the saved list straight into a fresh hash table, then tidy up
  hmake @files 10000
  hload -n @files @files.txt
  .remove @files.txt
}


Originally Posted By: swgiant
Your code does what I want, but how do I search through the hash table? Let's say I want to know how many DLL files are in my hash table; how do I implement that?

Code:
alias getfile {
  ; $getfile(filespec,N) -- N = 0 returns the match count, N >= 1 returns the Nth matching filename
  if ($2 == 0) { return $hfind(@files,$+(*,$1,*),0,w).data }
  elseif ($2 isnum) && ($hfind(@files,$+(*,$1,*),$2,w).data) { return $hget(@files,$v1) }
}

//echo -a $getfile(.dll,0)
//echo -a $getfile(.dll,1245)

swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

Thanks for the replies. Now I have a better picture of hash tables. Although I don't have a real database like SQL, I'm still enjoying retrieving files from the hash tables...

swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

Oh yes, I want to ask: since the size is limited to 10000, which in short means it can store up to 100,000 items, what happens if there are more than 100,000 items? How will the hash table handle it? Will it crash?

S - Hoopy frood - Joined: Dec 2002 - Posts: 2,962

No, a hash table can hold any number of items regardless of the number of slots, it just becomes less efficient at accessing each item the further you go over the slot count.

When using $hfind(), though, the slot count has no bearing on efficiency at all, so the getfile alias you've been given will work exactly the same no matter how many items you put in the hash table.
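If you want to see it for yourself, a small throwaway test (the alias and table names, and the numbers, are just for illustration):

Code:
alias slotdemo {
  ; 10 slots, 1000 items: holding more items than slots is perfectly legal
  if ($hget(slotdemo)) { hfree slotdemo }
  hmake slotdemo 10
  var %i = 1
  while (%i <= 1000) { hadd slotdemo item $+ %i value %i | inc %i }
  echo -a Items stored: $hget(slotdemo,0).item
  hfree slotdemo
}
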


Spelling mistakes, grammatical errors, and stupid comments are intentional.
swgiant (OP) - Ameglian cow - Joined: Jan 2005 - Posts: 41

Thanks for clearing that up...

