#207565 22/12/08 11:09 PM
nok (OP)
Ameglian cow
Joined: Mar 2008
Posts: 47
I am creating a protection for my server. I have a list in a txt file, but the problem is that it contains several repeated lines. How can I delete them?

Format of the list:
000.000.000.000:0000

Last edited by nok; 22/12/08 11:31 PM.
nok #207569 23/12/08 12:08 AM
Joined: Aug 2004
Posts: 7,252
R
Hoopy frood
Offline
Hoopy frood
R
Joined: Aug 2004
Posts: 7,252
1) Call up the text file in a text editor (Notepad, Wordpad, WordPerfect, Microsoft Word, etc.) and manually delete the lines that you don't want, then re-save the file when done.

2) Use the /write command with the -d switch to delete the duplicate lines.

See /help /write

Personally, I would recommend the first option, as you will be able to see each change as it is made and save the file only once, rather than having the file re-saved each time a line is deleted.
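If you do go the /write route, deleting a single line by number looks like this (list.txt and the line number are just placeholders here):
Code:
//write -dl5 list.txt

That deletes line 5 of list.txt; -d deletes and l5 selects the line.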

nok #207579 23/12/08 05:44 AM
Joined: Mar 2008
Posts: 47
N
nok Offline OP
Ameglian cow
OP Offline
Ameglian cow
N
Joined: Mar 2008
Posts: 47
How can I display just the IP number?

$gettok($gettok($1-,-1,91),1,93)

It works in the first example but not in the second.

Example 1
Nick1 (-@host171.190-30-83.host.net => Host-71DFF749.host.net) (asd) [xxx.xx.xx.xxx] connected to the network

Example 2
Nick (-@109-243-231-201.host.com => Host-247F741D.host.com) (xxcap[8.32] • wxw.xxcap.com) [xxx.xxx.xxx.xxx] connected to the network

nok #207580 23/12/08 06:13 AM
Joined: Jun 2007
Posts: 933
5
Hoopy frood
Offline
Hoopy frood
5
Joined: Jun 2007
Posts: 933
$gettok($gettok($1-,-1,91),1,93)

*notices you either already used -1 or changed your post*

Anyway, that works fine.
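Since -1 grabs the last [-delimited token, the [8.32] earlier in the line never gets in the way. A quick way to check it against your second example (the alias name is made up, the test line is yours):
Code:
alias testip {
  var %line = Nick (-@109-243-231-201.host.com => Host-247F741D.host.com) (xxcap[8.32] • wxw.xxcap.com) [xxx.xxx.xxx.xxx] connected to the network
  ; take everything after the last [ and cut it off at the first ]
  echo -a $gettok($gettok(%line,-1,91),1,93)
}

/testip echoes xxx.xxx.xxx.xxx.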

Last edited by 5618; 23/12/08 06:16 AM.
nok #207584 23/12/08 10:26 AM
Joined: Jun 2008
Posts: 48
T
Ameglian cow
Offline
Ameglian cow
T
Joined: Jun 2008
Posts: 48
I'm thinking that your list could be pretty long, and sorting through a list looking for duplicates manually is both very time-consuming and prone to missed duplicates, so the first idea that comes to mind is a couple of while loops that scan the file for matching lines.

I did briefly test it using:
Quote:
000.000.000.000:0000
111.111.111.111:1111
222.222.222.222:2222
333.333.333.333:3333
111.111.111.111:1111
222.222.222.222:2222
000.000.000.000:0000
111.111.111.111:1111
222.222.222.222:2222
000.000.000.000:0000
111.111.111.111:1111
222.222.222.222:2222
000.000.000.000:0000
111.111.111.111:1111
222.222.222.222:2222
000.000.000.000:0000
111.111.111.111:1111
222.222.222.222:2222
333.333.333.333:3333

And the file was left with only:
Quote:
000.000.000.000:0000
111.111.111.111:1111
222.222.222.222:2222
333.333.333.333:3333

The code I tossed together is:
Code:
alias sortlist {
  ; outer loop: step through the file one line at a time
  var %v1 1
  while (%v1 !> $lines(1.txt)) {
    ; inner loop: compare the current line against every line below it
    var %v2 $calc(%v1 + 1)
    while (%v2 !> $lines(1.txt)) {
      ; on a match, delete the duplicate; %v2 is not advanced because
      ; the lines below shift up to fill the gap
      if ($read(1.txt,%v1) == $read(1.txt,%v2)) { write -dl $+ %v2 1.txt }
      else { inc %v2 }
    }
    inc %v1
  }
}

It's really very simple: it just goes through the file one line at a time, comparing the current line to all the lines below it.

Good luck.


I've gone to look for myself. If I should return before I get back, please keep me here.
nok #207587 23/12/08 02:24 PM
Joined: Sep 2007
Posts: 109
K
Vogon poet
Offline
Vogon poet
K
Joined: Sep 2007
Posts: 109
Another option:
Code:
alias sortlist {
  var %v1 = 1
  while (%v1 <= $lines(1.txt)) {
    ; -s<text> overwrites the line that starts with <text>, or writes the
    ; line to the file if no match exists, so proxy.txt collects unique lines;
    ; the scan text has to be glued to the -s switch, hence the $+
    write -s $+ $read(1.txt,%v1) proxy.txt $read(1.txt,%v1)
    inc %v1
  }
  write -c 1.txt
}
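This leaves the unique lines in proxy.txt and clears 1.txt. If you want the result back in the original file, something like this should work afterwards (just a sketch, I haven't tested it):
Code:
//copy -o proxy.txt 1.txt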


