You mean with appending to the hash file? The only problem is that the order would be lost. Per "room", I store 1 hash entry containing the number of lines, and 2 hash entries per line: one storing the $gmt value at which the line was generated, and one storing the actual line.
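For reference, the layout looks roughly like this (the channel name and text are just made up for illustration):

Code:
;One counter entry per "room", plus two entries per logged line,
;keyed by the line number.
hadd -m Logs #channel_entries 2
hadd Logs #channel_time_1 $gmt
hadd Logs #channel_entry_1 <someone> first line of text
hadd Logs #channel_time_2 $gmt
hadd Logs #channel_entry_2 <someone> second line of text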

Actually, ignore that. I could just keep the total number of lines in the hash file and keep on counting.

However, I should probably get around to changing it so that I append the individual log entries to the end of the file(s), instead of just dumping the hash table into a file.
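As a sketch of what I mean (the event and path are just illustrative, not my actual script) — /write appends a line to the end of a file, so each line could go straight to disk as it happens:

Code:
;Rough sketch: append each channel line directly as it's generated.
on *:text:*:#: {
  write $qt($+($mircdir, logs\, $chan, .log)) $asctime($gmt, yyyy-mm-dd HH:nn:ss) < $+ $nick $+ > $1-
}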

TBH, it's on my to-do list, and I should get around to it one of these days. I just needed a quick and dirty way to start logging back when I started, and was planning to convert them to proper logs later.

What's a proper way to write data to files, assuming I'll probably write more than one line at a time? Would the following example code be decent, or would you recommend doing it differently?

Code:
savelogs {
  .timersavelogs -io 1 60 savelogs

  ;Close the logfile if it is open for whatever reason.
  if $fopen(logfile) {
    fclose logfile
  }

  ;Check if the hash table exists.
  if $hget(Logs) {

    ;Create the logs directory if it doesn't exist.
    if !$isdir($qt($+($mircdir, logs\))) {
      mkdir $qt($+($mircdir, logs\))
    }

    ;Save the different log tables to variables, so they remain consistent as entries are deleted.
    var %logs $hfind(Logs, *_entries, 0, w)
    var %logcount 0
    while %logcount < %logs {
      var %logcount $calc(%logcount + 1)
      var %log [ $+ [ %logcount ] ] $left($hfind(Logs, *_entries, %logcount, w), -8)
    }

    ;Cycle through the different log tables.
    var %logcount 0
    while %logcount < %logs {
      var %logcount $calc(%logcount + 1)
      var %logname %log [ $+ [ %logcount ] ]

      ;Open the logfile, creating it if it doesn't exist. Check if this generates an error.
      ;fopen has no append mode, so seek to the end of the file before writing new entries.
      if !$isfile($+($mircdir, logs\, %logname, .log)) {
        fopen -n logfile $qt($+($mircdir, logs\, %logname, .log))
      }
      else {
        fopen logfile $qt($+($mircdir, logs\, %logname, .log))
      }
      if !$ferr {
        fseek logfile $file($+($mircdir, logs\, %logname, .log)).size

        ;Cycle through the different log entries.
        var %entrycount 0
        while %entrycount < $hget(Logs, $+(%logname, _entries)) {
          var %entrycount $calc(%entrycount + 1)

          ;Write an entry to disk. If this causes an error, skip writing further entries for this logfile.
          if $hget(Logs, $+(%logname, _entry_, %entrycount)) {
            fwrite -n logfile $asctime($hget(Logs, $+(%logname, _time_, %entrycount)), yyyy-mm-dd HH:nn:ss) $hget(Logs, $+(%logname, _entry_, %entrycount))
            if $ferr {
              goto writefail
            }
          }

          ;Remove the entry from memory.
          hdel Logs $+(%logname, _entry_, %entrycount)
          hdel Logs $+(%logname, _time_, %entrycount)
        }

        ;Remove the log table from memory.
        hdel Logs $+(%logname, _entries)
        :writefail
      }

      ;Close the file.
      if $fopen(logfile) {
        fclose logfile
      }
    }
  }
}


Would savebuf be a faster alternative if I displayed the logs in a @window instead of storing them in a hash table? If so, is there a way to "mark" how far the window had been saved the last time, or would I have to keep a counter per window?
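Something along these lines is what I have in mind for the counter approach (savebuf switches from memory, so double-check /help /savebuf; SavedUpTo is just a made-up table name):

Code:
;Rough sketch: save only the lines added to a @window since the last save.
;$1 = window name. SavedUpTo tracks the last line number already written out.
alias savewindow {
  var %last $hget(SavedUpTo, $1)
  if !%last { var %last 0 }
  var %total $line($1, 0)
  if %total > %last {
    ;-a should append the given line range to the file.
    savebuf -a $+($calc(%last + 1), -, %total) $1 $qt($+($mircdir, logs\, $right($1, -1), .log))
    hadd -m SavedUpTo $1 %total
  }
}

One caveat: the counter would break if the window buffer gets trimmed or cleared, since the line numbers shift.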

Last edited by Thels; 03/09/10 12:33 PM.
