#8797 29/01/03 08:12 PM
Joined: Jan 2003
Posts: 3
Self-satisified door
OP Offline
I have a folder...

This folder contains (multiple) *.txt files...

These *.txt files 'may' contain text that I would like to search for when I connect to my server (e.g. sample.text).

If the specified text (sample.text) is found during a search, I would like to echo a response to a particular #channel: 'Sample.Text Found!'


Is this possible? To search the contents of a 'folder' for certain 'text' that may be contained within multiple *.txt documents?

If this is possible, I would also like to incorporate this search into an 'on connect' script. Nothing fancy, the simpler the better.

A final note: new *.txt files are automatically created in the folder daily, with file names corresponding to the date created, e.g. 030129.txt. If need be, I could settle for searching through the most recently created *.txt file within that folder on connect.

Thanks.

#8798 29/01/03 09:13 PM
Joined: Dec 2002
Posts: 395
Fjord artisan
Offline
Sure, it's possible, but you'd have to search all the files one by one (with /filter or $read).
You might also consider using .ini files or hash tables.
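For example, a rough sketch of the hash table approach might look something like this (the alias and table names are just examples, it handles one file per call, and the switches are from memory, so check them against the help file):
Code:
alias HashSearch {
  ; usage: /HashSearch <text> <full path to file>  (example names only, text with no spaces)
  ; recreate the table from scratch each time
  if ($hget(textsearch)) hfree textsearch
  hmake textsearch 10
  ; -n loads a plain text file, each line becoming the data of items named 1, 2, 3, ...
  hload -n textsearch $+(",$2-,")
  ; the W property makes $hfind wildcard-match against item data
  if ($hfind(textsearch,$+(*,$1,*),1,W)) echo -a Found $1 in $nopath($2-)
  hfree textsearch
}
Then /HashSearch sample.text c:\logs\030129.txt should report whether that text appears anywhere in the file.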

#8799 29/01/03 09:27 PM
Joined: Dec 2002
Posts: 169
Vogon poet
Offline
You can use $findfile and $read to do this. Here is an example of something similar.
Code:
alias TextSearch {
  var %dir = $sdir($mircdir,Select A Search Directory)
  var %stext = $input(Text to Find,1)
  !.echo -q junk $findfile(%dir,*.txt,0,0,SearchFile %stext $+ $chr(1) $+ $1-)
}

alias SearchFile {
  tokenize 1 $1-
  var %n = 1
  var %line = $read($2,nw,$+(*,$1,*),%n)
  while ($readn) {
    echo 5 -st $+([,$nopath($2),:,$readn,]) %line 
    %n = $readn + 1
    var %line = $read($2,nw,$+(*,$1,*),%n) 
  }
}
Both $findfile and looping through a text file using $read can be very slow. You can avoid looping $read calls if you do not need to find EVERY occurrence in each file, just the first.
Code:
alias SearchFile {
  tokenize 1 $1-
  if ($read($2,nw,$+(*,$1,*)) != $null) { echo 5 -st $+([,$nopath($2),:,$readn,]) $ifmatch }
}
You can also use /filter instead of $read.
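If I remember right, the -k switch lets /filter send each matched line to an alias, so SearchFile could be rewritten along these lines (double-check that your version supports it; the FoundLine alias name is just an example):
Code:
alias SearchFile {
  tokenize 1 $1-
  ; -f reads from a file, -k calls the FoundLine alias once per matching line
  filter -fk $+(",$2,") FoundLine $+(*,$1,*)
  if ($filtered) echo 5 -st $nopath($2) contained $filtered matching line(s)
}
alias FoundLine {
  ; $1- is the matched line passed in by /filter
  echo 5 -st $1-
}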

#8800 29/01/03 10:01 PM
Joined: Dec 2002
Posts: 1,922
Hoopy frood
Offline
Thanks to Jerk's example, I hope mine will be readable. This one lets you scan only files whose date is more recent than the specified number of days, and uses /filter instead of a $read loop for faster performance.

%date is optional. You can press Cancel instead of providing a number to make it scan every file. Also, it doesn't scan subdirectories, since the depth parameter for $findfile is set to 1.

Code:
alias scan {
  set -u %dir $$?="which folder?"
  set -u %find $$?="wildcard text to find?"
  set -u %date $?="how many days ago?"
  window -h @scan
  !.echo -q $findfile(%dir,*.txt,0,1,check_file $1-)
  window -c @scan
}
alias -l check_file {
  if %date isnum {
    var %a
    if !$regsub($nopath($1-),^(\d{2})(\d{2})(\d{2})\..+$,\3/\2/\1,%a) {
      return
    }
    if $calc($ctime($date) - $ctime(%a)) > $calc(60 * 60 * 24 * %date) {
      return
    }
  }
  filter -fwc $+(",$1-,") @scan %find
  if $filtered {
    echo -a Found $ifmatch lines matching %find in $1-
  }
}
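To tie this back to the original question, the search could be kicked off from an on connect event against the most recent daily file. Something like the following, where the path, filename format, search text and channel are all placeholders you'd adapt:
Code:
on *:CONNECT: {
  ; build today's filename, e.g. 030129.txt (assumes $date(format) is available in your version)
  var %file = $+(C:\logs\,$date(yymmdd),.txt)
  ; only report if the file exists and contains the text
  if ($isfile(%file)) && ($read(%file,nw,*sample.text*)) {
    ; note: the channel window may not be joined yet at connect time,
    ; so you may prefer echo -s here, or a short timer before msg'ing the channel
    msg #yourchannel Sample.Text Found!
  }
}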

