
#133249 18/10/05 12:18 PM
Vogon poet (OP), Joined: Nov 2004, Posts: 148
/hdel -w hash_table something*
will delete all items matching the something* wildcard.

I was wondering if it's possible to do the same thing with data (so it will delete all items whose data matches something*). Maybe it could be done with a trick (by using /hsave and /filter or something).

Thanks.

I currently want to remove all items which have something* in their data, but I have to use a while loop which scans the table using $hfind (in order to get the item) and then deletes it.
This, I must say, is very slow due to the while loop.

Thanks again

#133250 18/10/05 02:18 PM
Fjord artisan, Joined: Feb 2005, Posts: 342
I know: /hdel -w <hash table> <item> <data>*
doesn't work. (Obviously, I know.)

However, if you know which items are going to have something* in their data, you could put these on a timer/unset mode, i.e.:

hadd -mu50000 <table> <item> <data>

Now, since these are on an unset timer, they will not be saved to a file when you /hsave (unless you specify -u).

So if you needed to get rid of the something* items, and you had put them on an unset timer, you could /hsave -o, then /hdel -w <hashtable> *, and then /hload the information.
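For example, assuming a table named mytable and a filename mytable.dat (both made up for illustration), that sequence would look like:

Code:
; /hsave without -u skips the items that are on unset timers,
; so after the flush-and-reload only the permanent items remain
/hsave -o mytable mytable.dat
/hdel -w mytable *
/hload mytable mytable.dat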

Instead of using the standard /hadd command in your script, you could write a separate alias like so:
Code:
alias _hadd {
  var %table = $1 , %item = $2 , %data = $3-
  if (something* iswm %data) { /hadd -mu50000 %table %item %data }
  else { /hadd -m %table %item %data }
}


Just use that to add stuff to your hash tables; it'll automatically detect the "something*" data and put the item on the unset list.

Hope this helps. Good luck.

#133251 18/10/05 02:25 PM
Hoopy frood, Joined: Sep 2005, Posts: 2,881
I'm thinking that dumping the hashtable to disk, deleting it and then reloading is going to be somewhat slower than the solution he had in the first place. Worth a shot though I suppose.

#133252 18/10/05 02:39 PM
Hoopy frood, Joined: Oct 2004, Posts: 8,330
That depends on the size. Saving/loading a hash table is very quick, even with a large table. A while loop, on the other hand, will be much slower if the table is large. Still, I have a feeling he probably doesn't know the items when he makes the table, and that he probably wants to keep it saved with data until he does choose to remove things. Using /filter on a saved table probably would be much faster.


Invision Support
#Invision on irc.irchighway.net
#133253 18/10/05 03:19 PM
Hoopy frood, Joined: Jan 2003, Posts: 2,523
It would be faster indeed, if it could be applied. Unfortunately, that's not possible, because mIRC separates the items from the data with LFs, so the saved file looks to /filter like this:

item1
data1
item2
data2
item3
data3
....

Things aren't any better if you /hsave with the -b switch either: mIRC uses the bytes 04 00 and 05 00 to separate items from data.

If only $regsub supported binvars...


/.timerQ 1 0 echo /.timerQ 1 0 $timer(Q).com
#133254 18/10/05 03:24 PM
Vogon poet (OP), Joined: Nov 2004, Posts: 148
Yes, when I'm adding the items I don't know that I want to remove them, and some of them are being updated.

Unfortunately, I can't use /filter, since I need to clean the hash table itself.

If I save the hash table and then use /filter, it won't work. After saving, the items appear on odd lines and the data on even lines, so /filter would only remove the data lines, not the item lines they belong to.

confused frown
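One conceivable workaround for that odd/even pairing could be sketched like this (a hypothetical, untested alias; the name, file handling, and switches are made up for illustration): walk the saved dump two lines at a time and rewrite only the item/data pairs whose data does not match the pattern, then /hload the result.

Code:
alias hclean {
  ; $1 = dump file from a plain /hsave, $2- = wildcard pattern to purge
  var %i = 1 , %out = $+($1,.tmp)
  while ($read($1, nt, %i)) {
    var %item = $v1 , %data = $read($1, nt, $calc(%i + 1))
    if ($2- !iswm %data) {
      write %out %item
      write %out %data
    }
    var %i = $calc(%i + 2)
  }
}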

#133255 18/10/05 03:31 PM
Fjord artisan, Joined: Feb 2005, Posts: 342
I just used the -u50000 as an example... it could be lengthened if need be to compensate. One more zero would add another 5 days. But that was more of a "this is an option" type thing.

Completely depends on what the script is for. If the hash table items don't really need to be saved upon mIRC's exit and what not, then this won't really be a problem. One can always specify the -u option with /hsave to keep everything.

/filter might do the trick, though I haven't actually played with that much myself.

Edit: Lots of new posts that I didn't have the chance to read before I posted this..

Last edited by Rand; 18/10/05 03:35 PM.
#133256 18/10/05 04:01 PM
Fjord artisan, Joined: Feb 2005, Posts: 342
Solution #2.

/hsave -i <table> <filename> testing_hash

Code:
alias del_line {
  var %i = 1 , %file = testing.ini , %section = testing_hash
  while ($read(%file, r, /^[^=]+=.*delete.*/, %i)) {
    var %v1 = $v1
    if ($regex(%v1,/^([^=]+)=.+/)) { var %here = $regml(1) }
    remini %file %section %here
    var %i = $calc($readn + 1)
  }
}


That's about as quick as it's going to get without looping through the entire file. You can modify the alias and change the regex part of the $read() to whatever you need.

I tested this using:

/hadd -m testing one keep this one
/hadd -m testing two should delete this
/hadd -m testing three we'll keep this one, it's neat.
/hadd -m testing four this one isn't as cool as three, let's delete it.

/hsave -i testing testing.ini testing_hash
/hdel -w testing *
/del_line
/hload -i testing testing.ini testing_hash


Just in case some of you would like to play around with it.

Edit: On second thought, the code can be simplified to:
Code:
alias del_line {
  var %i = 1 , %file = testing.ini , %section = testing_hash
  while ($read(%file, r, /^([^=]+)=.*delete.*/, %i)) {
    remini %file %section $regml(1)
    var %i = $calc($readn + 1)
  }
}

Last edited by Rand; 18/10/05 04:14 PM.
#133257 18/10/05 04:28 PM
Vogon poet (OP), Joined: Nov 2004, Posts: 148
No one even talked about saving the hash table.
I need to delete the items in real time. When I'm adding the data to the hash table, I have no idea that I'm going to remove it at some point, and I have no idea which items will be removed or when.

When I do decide to remove data, I say something like... let's remove all items whose data contains the word Hello (just an example).

Your solution is badly coded... you take a hash table, save it to a file, scan it using a while loop, and delete the data from the file! This is the slowest way to do it.
I'm currently doing the same thing on the hash table itself using a while loop too, and it takes ages because of it. From experience, working with files is much slower and takes more resources.

If it could be done without any while loops, then it would take a few seconds at most...

#133258 18/10/05 04:44 PM
Hoopy frood, Joined: Feb 2004, Posts: 2,019
When saving a hash table to a file with the -i switch, so that it is saved in .ini file format, any item that contains a [ or ] will be corrupted, as those characters are changed to a ~ tilde. When /hload-ing the table, the tilde does not revert to its original state, and you cannot know whether it was a [ or a ].

One way for him to get around that is to $encode his items.


Gone.
#133259 18/10/05 04:45 PM
Fjord artisan, Joined: Feb 2005, Posts: 342
Uh, just to point out a few things you've said.

Quote:
Maybe it could be done with a trick (by using /hsave and /filter or something).


Quote:
No one even talked about saving the hash table.


Quote:
I currently want to remove all items which have something* in their data, but I have to use a while loop which scans the table using $hfind (in order to get the item) and then deletes it.


So I'm quite aware of what you *were* trying, and I offered other ways to do it. You said maybe with an /hsave and /filter trick, so I looked into /hsave.

Now, since I've never messed with /filter, you've made me look into it in the time it took to write this post.

/hsave -io hash_table_name file.ini section
/hdel -w hash_table_name *
/filter -xff file.ini newfile.ini *delete*
/hload -i hash_table_name newfile.ini section

This will exclude any lines containing "delete" (so make sure your item names don't match the stuff you want to delete in the data)
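Wrapped into a single hypothetical alias (the alias, file, and section names here are examples, and this is untested): since -i saves each entry as item=data on one line, /filter can drop whole entries at once.

Code:
alias hpurge {
  ; $1 = table name, $2 = word to purge from the data
  ; the [section] header line survives the filter since it won't match
  hsave -io $1 purge.ini section
  hdel -w $1 *
  filter -xff purge.ini purged.ini $+(*,$2,*)
  hload -i $1 purged.ini section
}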


Happy?

Edit: Sheesh... typos galore.

Last edited by Rand; 18/10/05 04:53 PM.
#133260 18/10/05 04:54 PM
Hoopy frood, Joined: Apr 2003, Posts: 701
It might be a good idea to check if your data structure is suitable for what you want to do. Since you don't give any real examples, it's impossible to suggest a better alternative.
Maybe switching data and keys might help, since you can then just use /hdel -w.
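A rough, untested sketch of such a swap (the alias and table names are made up; it assumes each data value is a single word and unique, since item names must be single unique tokens):

Code:
alias hswap {
  ; $1 = existing table, $2 = new table with data as the item names
  var %i = 1
  while ($hget($1, %i).item) {
    hadd -m $2 $hget($1, $v1) $v1
    var %i = $calc(%i + 1)
  }
}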

#133261 18/10/05 04:57 PM
Vogon poet (OP), Joined: Nov 2004, Posts: 148
I didn't mean to insult you or anything...

I thought you meant something else with saving to files (now that I've read your post again, I understand that you meant save and then load, which will cause the -u items to be deleted). Anyway, it is not possible, since I don't know in advance what I'm going to delete.


FiberOPtics wrote a point about why your new idea can't work, but it is still a nice idea (even though I'm not sure it will work with a huge hash table, due to the 64KB .ini file limit).

Thank you Rand.

#133262 18/10/05 04:59 PM
Hoopy frood, Joined: Oct 2004, Posts: 8,330
I don't think any current way will work very well (fast) for what you want to do. Obviously a new feature for /hdel would make it faster, but that won't be added until at least the next version, which could be months down the road. Perhaps a DLL could be made to make it faster... I'm not sure.

Anyhow, because it is slower than you want and a new feature could take months or years before it's added, you may want to consider changing how you're dealing with data. Obviously, your current data setup isn't appropriate for what you want to do with it. Hash tables are great for many things, but aren't always the best option. And, sometimes just changing the hash table format can help out. As I have no idea what you're really using the table for, or how you have it formatted, I can't suggest anything to do with it. I also can't be certain that there even is a better way to deal with the data. I'm just offering this as a suggestion. smile


#133263 18/10/05 05:03 PM
Vogon poet (OP), Joined: Nov 2004, Posts: 148
Quote:
It might be a good idea to check if your data structure is suitable for what you want to do. Since you don't give any real examples, it's impossible to suggest a better alternative.
Maybe switching data and keys might help, since you can then just use /hdel -w.


How can I switch between data and items after the data has already been written?
Your idea sounds nice, since it has potential (if the switching is possible).

#133264 18/10/05 05:07 PM
Fjord artisan, Joined: Feb 2005, Posts: 342
It's not really a problem. shocked

I'm just cranky since I haven't slept yet >.> 12pm..

But as Kelder said, it's kind of hard to tell exactly what someone wants without having any knowledge or examples of how the script functions.

#133265 18/10/05 05:11 PM
Hoopy frood, Joined: Oct 2004, Posts: 8,330
If you can give us perhaps 5-10 items with data that you're working with (you can always edit out any personal information if you don't want us to see it), we can probably offer better options (including how you might be able to swap data with item names, even after it was already added to the hash table). The initial swapping may take a good amount of time on a large table, but afterwards it wouldn't be needed at all, since you can change the saving of data to use the new format.

Oh, although using "/hsave -i" will corrupt item names that have []'s, that's only a problem if your item names actually contain []'s. If they don't, it's not an issue. I'm unsure about the INI size limit for /hload: variables load fine over the 64k limit, but $readini can't read past it... so /hload may or may not be able to load past that 64k limit. The only way to know is to test. laugh

Last edited by Riamus2; 18/10/05 05:14 PM.

#133266 18/10/05 05:20 PM
Hoopy frood, Joined: Oct 2004, Posts: 8,330
Ok, I tested /hsave -i ... you can save/load ini files larger than 64k. I tested 107k and it worked fine. That may be an option for you if you don't want to mess with your data structure and if you don't use []'s in the item names.


#133267 18/10/05 05:44 PM
Fjord artisan, Joined: Feb 2005, Posts: 342
Heh, I've been using ini files over 64k for a long time now. Had to do that ages ago. *used to save everything to ini files*

Note that the /writeini help says the "-n switch will attempt to write to the .ini file even if it is larger than 64k". It's been like that for quite a while blush

Edit: Figure I'd go ahead and tell you guys why I had ini files that big. Ages ago, when I attempted to make my first "Ragnarok Online mIRC Database" I was using ini files. Shortly after I realised that was a gigantic lagfest when looping through monster info, and moved to SQLite.dll

Last edited by Rand; 18/10/05 05:46 PM.
#133268 18/10/05 05:49 PM
Hoopy frood, Joined: Sep 2003, Posts: 4,230
Can I ask, on average, how often this would need to be done?
Also, each time, how many items are we talking about that would likely need to be removed?
And how big is the hash table in total?

Have you tried optimizing your hash table removal loop?
e.g.:
while ($hfind(hash_table,something*,1,w).data) { hdel hash_table $v1 }
or even:
while (1) { hdel hash_table $hfind(hash_table,something*,1,w).data } | :error | reseterror
