Posted By: Dr_Brom_sung_ /hdel data... - 18/10/05 12:18 PM
/hdel -w hash_table something*
This will delete all items whose names match the wildcard something*.

I was wondering if it's possible to do the same thing with data (so that it deletes every item whose data matches something*). Maybe it can be done with a trick (using /hsave and /filter or something)

Thanks.

I currently want to remove all items that have something* in their data, but I have to use a while loop that scans the table using $hfind (in order to get the item name) and then deletes it.
This, I must say, is very slow because of the while loop.
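For reference, this is roughly the kind of loop I mean (the alias name, table name and pattern are just placeholders):
Code:
; rough sketch of the slow approach: repeatedly find the first item whose
; data matches the wildcard, then delete that item by name
alias hdel_bydata_loop {
  ; $1 = hash table name, $2- = wildcard pattern to match against the data
  while ($hfind($1,$2-,1,w).data) { hdel $1 $v1 }
}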

Thanks again
Posted By: Rand Re: /hdel data... - 18/10/05 02:18 PM
I know: /hdel -w <hash table> <item> <data>*
doesn't work (obviously, I know).

However, if you know which items are going to have something* in them, you could add those with an unset timer, e.g.:

hadd -mu50000 <table> <item> <data>

Now, since these are on an unset timer, they will not be saved to a file when you /hsave (unless you specify -u).

So if you needed to get rid of the something* items, and you had put them on an unset timer, you could /hsave -o, then /hdel -w <hashtable> *, and then /hload the information.
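Roughly, something like this (the table and file names are just examples):
Code:
; sketch of the save/clear/reload idea: items added with -u are skipped by a
; plain /hsave, so after clearing and reloading they are simply gone
/hsave -o <table> backup.dat
/hdel -w <table> *
/hload <table> backup.dat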

Instead of using the standard /hadd command in your script, you could write a separate alias like so:
Code:
alias _hadd {
  var %table = $1 , %item = $2 , %data = $3-
  if (something* iswm %data) { /hadd -mu50000 %table %item %data }
  else { /hadd -m %table %item %data }
}


Just use that to add stuff to your hash tables; it'll automatically detect data matching "something*" and add it with the unset timer.

Hope this helps. Good luck.
Posted By: hixxy Re: /hdel data... - 18/10/05 02:25 PM
I'm thinking that dumping the hash table to disk, deleting it and then reloading it is going to be somewhat slower than the solution he had in the first place. Worth a shot though, I suppose.
Posted By: Riamus2 Re: /hdel data... - 18/10/05 02:39 PM
That depends on the size. Saving/loading a hash table is very quick, even with a large table. A while loop, on the other hand, will be much slower if the table is large. Still, I have a feeling he probably doesn't know the items when he makes the table and that he probably wants to keep it saved with data until he chooses to remove things. Using /filter on a saved table would probably be much faster.
Posted By: qwerty Re: /hdel data... - 18/10/05 03:19 PM
It would be faster indeed, if it could be applied. Unfortunately, that's not possible, because mIRC separates the items from the data with LFs, so the saved file looks to /filter like this:

item1
data1
item2
data2
item3
data3
....

Things aren't any better if you /hsave with the -b switch either: mIRC uses the bytes 04 00 and 05 00 to separate items from data.

If only $regsub supported binvars...
Posted By: Dr_Brom_sung_ Re: /hdel data... - 18/10/05 03:24 PM
Yes, when I'm adding the items I don't know that I want to remove them, and some of them are being updated.

Unfortunately, I can't use /filter, since I need to clean the hash table.

If I save the hash table and then use /filter, it won't work: after saving, the items appear on odd lines and the data on even lines, so /filter would only remove the data lines and not the items they belong to.

confused frown
Posted By: Rand Re: /hdel data... - 18/10/05 03:31 PM
I just used -u50000 as an example; you could lengthen it if need be. One more zero would add roughly another 5 days. But that was more of a "this is an option" type of thing.

Completely depends on what the script is for. If the hash table items don't really need to be saved upon mIRC's exit and what not, then this won't really be a problem. One can always specify the -u option with /hsave to keep everything.

/filter might do the trick, though I haven't actually played with that much myself.

Edit: Lots of new posts that I didn't have the chance to read before I posted this..
Posted By: Rand Re: /hdel data... - 18/10/05 04:01 PM
Solution #2.

/hsave -i <table> <file> testing_hash

Code:
alias del_line {
  var %i = 0 , %file = testing.ini , %section = testing_hash
  while ($read(%file, r, /^[^=]+=.*delete.*/, %i)) {
    var %v1 = $v1
    if ($regex(%v1,/^([^=]+)=.+/)) { var %here = $regml(1) }
    remini %file %section %here
    var %i = $calc($readn + 1)
  }
}


That's about as quick as it's going to get without looping through the entire file. You can modify the alias and change the regex part of the $read() to whatever you need.

I tested this using:

/hadd -m testing one keep this one
/hadd -m testing two should delete this
/hadd -m testing three we'll keep this one, it's neat.
/hadd -m testing four this one isn't as cool as three, let's delete it.

/hsave -i testing testing.ini testing_hash
/hdel -w testing *
/del_line
/hload -i testing testing.ini testing_hash


Just in case some of you would like to play around with it.

Edit: On second thought, the code can be simplified to:
Code:
alias del_line {
  var %i = 0 , %file = testing.ini , %section = testing_hash
  while ($read(%file, r, /^([^=]+)=.*delete.*/, %i)) {
    remini %file %section $regml(1)
    var %i = $calc($readn + 1)
  }
}
Posted By: Dr_Brom_sung_ Re: /hdel data... - 18/10/05 04:28 PM
No one even talked about saving the hash table.
I need to delete the items in real time. When I'm adding the data to the hash table, I have no idea that I'm going to remove it at some point, nor which items will be removed or when.

When I do decide to remove data, I say something like... let's remove all items whose data contains the word Hello (just an example).

Your solution is badly coded... you take a hash table, save it to a file, scan the file with a while loop, and delete the data from the file! This is the slowest way to do it.
I'm currently doing the same thing on the hash table itself with a while loop, and it takes ages because of it. From experience, working with files is much slower and takes more resources.

If it could be done without any while loops, it would take a few seconds at most...
Posted By: FiberOPtics Re: /hdel data... - 18/10/05 04:44 PM
When saving a hash table to a file with the -i switch, so that it is saved in .ini file format, any item that contains an [ or ] will be corrupted, as those characters are changed to a ~ tilde. When hloading the table, the tilde does not revert to its original state, and you cannot know if it was a [ or a ].

One way for him to get around that is to $encode his items.
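Something along these lines, for example (just a sketch with made-up alias names; 'm' is mime encoding, which never produces [ or ]):
Code:
; sketch: store mime-encoded item names so [ and ] never end up in the .ini
; file, and encode the name again when looking an item up
alias hadd_safe {
  ; $1 = table, $2 = item name (may contain [ or ]), $3- = data
  hadd -m $1 $encode($2,m) $3-
}
alias hget_safe {
  ; $1 = table, $2 = original (unencoded) item name
  return $hget($1,$encode($2,m))
}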
Posted By: Rand Re: /hdel data... - 18/10/05 04:45 PM
Uh, just to point out a few things you've said.

Quote:
Maybe it can be done with a trick (using /hsave and /filter or something)


Quote:
No one even talked about saving the hash table.


Quote:
I currently want to remove all items that have something* in their data, but I have to use a while loop that scans the table using $hfind (in order to get the item name) and then deletes it.


So I'm quite aware of what you *were* trying to do, and I offered other ways to do it. You said maybe with an /hsave and /filter trick, so I looked into /hsave.

Now, since I've never messed with /filter, you've made me look into it while writing this post.

/hsave -io hash_table_name file.ini section
/hdel -w hash_table_name *
/filter -xff file.ini newfile.ini *delete*
/hload -i hash_table_name newfile.ini section

This will exclude any lines containing "delete" (so make sure your item names don't match the stuff you want to delete in the data)


Happy?

Edit: Sheesh.. typos galore..
Posted By: Kelder Re: /hdel data... - 18/10/05 04:54 PM
It might be a good idea to check if your data structure is suitable for what you want to do. Since you don't give any real examples, it's impossible to suggest a better alternative.
Switching the data and keys might help, since you could then just use /hdel -w.
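As a very rough sketch of what I mean (this only works cleanly if the data values are single words and unique, since they become the new item names):
Code:
; sketch: copy a table into a new one with item names and data swapped
alias hswap {
  ; $1 = source table, $2 = destination table
  var %i = $hget($1,0).item
  while (%i) {
    hadd -m $2 $hget($1,%i).data $hget($1,%i).item
    dec %i
  }
}
After that, something like /hdel -w newtable something* would remove the entries by data.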
Posted By: Dr_Brom_sung_ Re: /hdel data... - 18/10/05 04:57 PM
I didn't mean to insult you or anything.....

I thought you meant something else by saving the files (now that I've read your post again, I understand that you meant save and then load, which would cause the -u items to be deleted). Anyway, it's not possible, since I don't know in advance what I'm going to delete.


FiberOPtics has made a point about why your new idea can't work, but it is still a nice idea (even though I'm not sure it would work with a huge hash table, due to the 64KB .ini file limit).

Thank you Rand.
Posted By: Riamus2 Re: /hdel data... - 18/10/05 04:59 PM
I don't think any current way will work very well (fast) for what you want to do. Obviously a new feature for /hdel would make it faster, but that won't be added until at least the next version, which could be months down the road. Perhaps a DLL could be made to make it faster... I'm not sure.

Anyhow, because it is slower than you want and a new feature could take months or years before it's added, you may want to consider changing how you're dealing with data. Obviously, your current data setup isn't appropriate for what you want to do with it. Hash tables are great for many things, but aren't always the best option. And, sometimes just changing the hash table format can help out. As I have no idea what you're really using the table for, or how you have it formatted, I can't suggest anything to do with it. I also can't be certain that there even is a better way to deal with the data. I'm just offering this as a suggestion. smile
Posted By: Dr_Brom_sung_ Re: /hdel data... - 18/10/05 05:03 PM
Quote:
It might be a good idea to check if your data structure is suitable for what you want to do. Since you don't give any real examples, it's impossible to suggest a better alternative.
Switching the data and keys might help, since you could then just use /hdel -w.


How can I switch between data and items after the data has already been written?
Your idea sounds nice since it has potential (if the switching is possible).
Posted By: Rand Re: /hdel data... - 18/10/05 05:07 PM
It's not really a problem. shocked

I'm just cranky since I haven't slept yet >.> 12pm..

But as Kelder said, it's kind of hard to tell exactly what someone wants without having any knowledge or examples of how the script functions.
Posted By: Riamus2 Re: /hdel data... - 18/10/05 05:11 PM
If you can give us perhaps 5-10 items with data that you're working with (you can always edit out any personal information if you don't want us to see it), we can probably offer better options (including how you might be able to swap data with item names, even after it was already added to the hash table). The initial swapping may take a good amount of time on a large table, but afterwards it wouldn't be needed at all, since you can change how data is saved to use the new format.

Oh, and although using /hsave -i will corrupt item names that contain []'s, that's only a problem if your item names actually contain them. If they don't, it's not an issue. I'm unsure about the INI size limit for /hload. Variables load fine over the 64KB limit, but $readini can't read past it... so /hload may or may not be able to load past that limit. The only way to know is to test. laugh
Posted By: Riamus2 Re: /hdel data... - 18/10/05 05:20 PM
Ok, I tested /hsave -i ... you can save/load ini files larger than 64k. I tested 107k and it worked fine. That may be an option for you if you don't want to mess with your data structure and if you don't use []'s in the item names.
Posted By: Rand Re: /hdel data... - 18/10/05 05:44 PM
Heh, I've been using ini files over 64k for a long time now. Had to do that ages ago. *used to save everything to ini files*

Note that /writeini says the "-n switch will attempt to write to the .ini file even if it is larger than 64k". It's been like that for quite a while blush

Edit: Figured I'd go ahead and tell you guys why I had ini files that big. Ages ago, when I attempted to make my first "Ragnarok Online mIRC Database", I was using ini files. Shortly after, I realised that was a gigantic lagfest when looping through monster info, and moved to SQLite.dll
Posted By: DaveC Re: /hdel data... - 18/10/05 05:49 PM
Can I ask, on average, how often this would need to be done?
Also, each time, roughly how many items are we talking about needing to be removed?
And how big is the hash table in total?

Have you tried optimizing your hash table removal loop?
e.g.:
while ($hfind(hash_table,something *,1,w).data) { hdel hash_table $v1 }
or even....
while (1) { hdel hash_table $hfind(hash_table,something *,1,w).data } | :error | reseterror
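The second one only makes sense inside an alias, since :error needs a script to jump to. Roughly like this (the table name and pattern are just examples):
Code:
; sketch: keep deleting until $hfind returns nothing, at which point hdel
; fails with missing parameters and execution lands on :error
alias hdel_until_done {
  while (1) { hdel hash_table $hfind(hash_table,something *,1,w).data }
  :error
  reseterror
}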
Posted By: Riamus2 Re: /hdel data... - 18/10/05 06:02 PM
Yes, I know writeini will work. It's the $read that I thought was a problem with larger files... though I admit I've never tried as I don't need ini files that are that large.
Posted By: DaveC Re: /hdel data... - 18/10/05 06:28 PM
Replied to you only because yours was the last post mentioning /hsave -i.

I think everyone should just forget the -i option. Try doing a tick test on saving the file with -i compared to without it. I tried it on a 10,000-item table, and OMG, does it take a long time, while a plain 10,000-item /hsave hash_table temp.txt took 16 or 32 ticks (i.e. a time too small to measure accurately).
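For anyone who wants to reproduce the comparison, a rough tick test could look something like this (the alias and file names are just examples):
Code:
; sketch: time a plain /hsave against /hsave -i using $ticks
alias hsave_ticktest {
  ; $1 = hash table name
  var %t = $ticks
  hsave $1 ticktest_plain.txt
  echo -a plain /hsave took $calc($ticks - %t) ticks
  var %t = $ticks
  hsave -i $1 ticktest_ini.ini $1
  echo -a /hsave -i took $calc($ticks - %t) ticks
}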
Posted By: Riamus2 Re: /hdel data... - 18/10/05 07:21 PM
Yeah, I did notice it took a little time for even that 107k test file. I guess it's really a matter of what is faster (probably his current while loop method), and whether or not he wants to consider changing how he's handling/storing the data so that it may work without dealing with this problem.
Posted By: DaveC Re: /hdel data... - 18/10/05 09:32 PM
OK, here ya go, try this out...
Code:
;Usage: /hdel.wilddata -bN <name> <matchtext>
;
; Deletes hash table items whose data matches the given matchtext.
;
; -bN : N being a number 1 or greater (default 10),
;       N represents the point where a simple loop is no longer used to remove matched data items
;       ie: if N is 5 and there are only 3 data matched items then a simple loop is used
;           if N is 10 and there are 15 data matched items then the hsave/filter method is used.
;
alias hdel.wilddata {
  ;
  ; Parse command line phase
  ;
  var %breakpoint = 10
  if (-* !iswm $1) {
    var %hdel.wilddata.hashtable = $1
    var %hdel.wilddata.matchtext = $2-
  }
  else {
    if ($calc($mid($1,$calc($poscs($1,b) + 1))) isnum 1-) { var %breakpoint = $v1 }
    var %hdel.wilddata.hashtable = $2
    var %hdel.wilddata.matchtext = $3-
  }
  ;
  ; Do the hdel phase
  ;
  if (((%hdel.wilddata.hashtable != $null) && (%hdel.wilddata.matchtext != $null)) && ($hfind(%hdel.wilddata.hashtable,%hdel.wilddata.matchtext,0,w).data)) {
    if ($v1 > 0) {
      if ($v1 == $hget(%hdel.wilddata.hashtable,0).item) {
        ;
        ; Matchtext matched entire table 
        ;
        hdel -w %hdel.wilddata.hashtable *
      }
      elseif ($v1 < %breakpoint) {
        ;
        ; Simple loop method
        ;
        while ($hfind(%hdel.wilddata.hashtable,%hdel.wilddata.matchtext,1,w).data) { hdel %hdel.wilddata.hashtable $v1 }
      }
      else {
        ;
        ; Hsave/filter method
        ;
        set %hdel.wilddata.hashtable %hdel.wilddata.hashtable
        hsave -ou %hdel.wilddata.hashtable hdel.wilddata.tempfile.txt
        window -hn @hdel.wilddata.hidden.window
        loadbuf -r @hdel.wilddata.hidden.window hdel.wilddata.tempfile.txt
        hsave -oun %hdel.wilddata.hashtable hdel.wilddata.tempfile.txt
        filter -fkn hdel.wilddata.tempfile.txt hdel.wilddata.filter.alias %hdel.wilddata.matchtext
        window -c @hdel.wilddata.hidden.window
        unset %hdel.wilddata.hashtable
      }
    }
  }
}
alias -l hdel.wilddata.filter.alias { hdel %hdel.wilddata.hashtable $line(@hdel.wilddata.hidden.window,$calc(-1 + 2 * $1)) }


It uses 3 possible methods:
(1) If it notices you wildcarded the whole hash table, it just erases the whole table.
(2) It uses a simple loop if there aren't many to delete (by default fewer than 10); with -bN you can set that number yourself.
(3) Otherwise it uses an /hsave & /filter combo.

If anyone is interested but can't follow it, I can explain.
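For example, a call like this should remove every item whose data matches something *, switching to the /hsave & /filter method once there are 20 or more matches:

/hdel.wilddata -b20 hash_table something *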
Posted By: Dr_Brom_sung_ Re: /hdel data... - 19/10/05 07:53 AM
Ok, thanks DaveC, you are the man...
The process is not called often, but when it is, it has to remove, let's say, 100 items or more from a table of roughly 5000 items. This is the reason a while loop takes ages.

I'm going to try some different methods. From what I have tested so far, using goto runs twice as fast as doing the same thing with a while loop (I have no idea why).

I'm going to try doing it with a /timer 0 0 that calls an alias which removes the first match it finds each time, until no matches are left, and then stops the timer. I will then check how many ticks it takes..... :tongue:
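Roughly what I have in mind (the names are placeholders):
Code:
; sketch: delete one matching item per timer tick, stop once nothing matches
alias hdel_step {
  ; $1 = table, $2- = wildcard pattern for the data
  if ($hfind($1,$2-,1,w).data) { hdel $1 $v1 }
  else { .timerhdelstep off }
}
; started with something like: /timerhdelstep 0 0 hdel_step mytable something*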
Posted By: FiberOPtics Re: /hdel data... - 19/10/05 07:57 AM
Quote:
Ok, thanks DaveC, you are the man...

Dave Dave, he's our man, if he can't do it, no one can! grin
Posted By: Dr_Brom_sung_ Re: /hdel data... - 19/10/05 08:54 AM
Ok, using a while loop takes exactly twice as long as doing the same loop with goto, which breaks out once the hash table has been cleaned of the data I want removed.

It also appears that /timer 0 0 is as fast as using goto.

I'm going to try DaveC's idea and see if it speeds things up and works well with my data.
Posted By: DaveC Re: /hdel data... - 19/10/05 09:02 AM
mahahahahaha

/me gets his Dave Flag out and starts waving it, then wonders why everyone is rolling their eyes ?¿?¿?¿?¿?¿
Posted By: Dr_Brom_sung_ Re: /hdel data... - 19/10/05 09:37 AM
Dave, your method of using /filter is working amazingly fast. WOW!!! grin


Thanks.
© mIRC Discussion Forums