If the historical behavior is desired for backwards compatibility, at least people can find this post by searching for /play or /dcc send, since the forum's search tool is useless for searching for ..\ and I have trouble seeing whether this has been addressed before.

Other than navigating through an FSERVE window, which the last time I checked did defend against someone escaping the sandbox, there doesn't seem to be a legitimate purpose for any of the disk commands or identifiers to support paths containing ..\ , for the same reasons given in my $zip bug report a while ago.

I recently helped someone with a script that unknowingly had such a vulnerability with the /play command, though it could just as easily be a problem for /dcc send or even /write, $read, etc.

When a script tries to limit file access to only those files within a specific folder tree, by prefixing a base foldername as a hard-coded string or a %variable, someone can still use one or more ..\ sequences to escape the confines of the folder 'sandbox' in order to read or write any file whose exact path\filename they happen to know.

Sometimes it's intentional that people be able to request a relative path\file, either to avoid needing to churn the disk looking for a file somewhere in a folder tree below the base folder, or to differentiate which samefilename.filetype is wanted, so it's not as simple as just using $nopath() against the outsider-created string. An option to apply the dcc ignore settings to outgoing file sends wouldn't help either, since that wouldn't defend against requests for .txt files.
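For a script that still wants to allow relative subfolder requests, a minimal defensive sketch might look something like the following (the !ftp_files trigger and c:\ftp_folder\ base are just placeholder names, and the 2nd check relates to the drive letter issue described further below):

on *:TEXT:!ftp_files *:#: {
  ; normalize forward slashes into backslashes before checking, since Windows accepts either
  var %base = c:\ftp_folder\
  var %req = $replace($2-,/,\)
  ; reject attempts to climb out of the base folder, or to name a drive letter directly
  if (..\ isin %req) || (?:* iswm %req) { return }
  ; only send when the combined path really is an existing file at or beneath the base folder
  if ($isfile(%base $+ %req)) { dcc send $nick $qt(%base $+ %req) }
}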

--

Because of the way Windows and MSDOS have historically treated \ and / as equivalents within a path, any script trying to block this stuff would need to defend against both.
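For example, both of these should return $true on a typical installation, because the underlying Windows file APIs accept either separator:

//echo -a $isfile($mircdirmirc.ini) and $isfile($replace($mircdirmirc.ini,\,/))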

--

Another related exploit means that some scripts, in addition to checking for ../ and ..\ , also need to disallow a drive letter followed by a colon that is not followed by a slash. Since there's no /cd command to change the default folder of the various drive letters, there's no way to change the active directory for any drive letter away from the root folder.

Because the original MSDOS functions return pathnames without the leading slash and represent the root folder as a null string, when the root folder is the active folder d:file.txt and d:\file.txt are equivalent, as are d:path\file.txt and d:\path\file.txt

This 2nd issue isn't any more of a problem than accepting input for /dcc send $nick $2- without any validation of the path or no-path filename, but anyone defending a flash drive or other drive letter by checking for paths containing drive letters needs to check for the wildmatch ?:* and not just ?:\* or ?:/*
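In other words, the check needs to be something like this to catch d:file.txt as well as d:\file.txt and d:/file.txt:

if (?:* iswm $2-) goto error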

It doesn't appear to be a problem for $zip, which correctly doesn't want to extract c:path names.

--

Example:

!ftp_files ..\users\owner\documents\filename.txt
!ftp_files ..\users\known_username\appdata\roaming\mirc\client.pem

bad:
if ($isfile(c:\foldername\ $+ $2-)) dcc send $nick $qt(c:\foldername\ $+ $2-)
fix: precede dcc command with:
if (..\ isin $replace($2-,/,\)) goto error

If they don't know where the base %folder is located, they could then just stack multiple ..\ together until it works.
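For example, without knowing how many levels deep the base folder sits, a request like this hypothetical one still reaches the same file, because Windows ignores any extra ..\ once the path has climbed to the root folder:

!ftp_files ..\..\..\..\..\users\owner\documents\filename.txt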

There doesn't seem to be an easy way to convert one of these indirect paths to the true canonical filename once it contains ..\ , so you can't use "if (c:\ftp_folder\* iswm $longfn(%filename))" to ensure that the variable points to a file in or beneath c:\ftp_folder\ , because someone could always include 1 or more subfolders before using several double-dots to escape back out of them. Even using $longfn or $shortfn or $file().longfn doesn't get rid of the double dots. The other usefulness of getting rid of the .\ and ..\ in paths is when a script wants to make sure there aren't 2 significantly different strings causing something like /copy to refer to the same filename twice, such as where the destination is equivalent to the short-fn of the long-fn.

The only time I see .. without ..\ being a problem is when it's allowed to be the folder parameter in $finddir or $findfile, where if the client is installed into c:\mirc\ then this shows the files in the root folder:

//echo -a $findfile(..,*.*,0,1,echo -ag $1-)

Using . as a path should be fine, such as the default used for $sdir and $sfile, and a doubledot is sadly more common inside non-path filenames than it should be, as in image..jpg

Update:
To clarify, we were trying several ways to get the actual pathname, but all of these still return the string containing the doubledots:

//var -s %a c:..\ $+ $gettok($mircdir,-1,92) $+ \mirc.ini | echo -a $longfn(%a) vs $file(%a).longfn vs $nofile(%a) vs $findfile($nofile(%a),mirc.ini,1) vs $file(%a).path $+ $file(%a).longfn

The function for the true canonical filename I referred to goes back to the MSDOS world: Interrupt 21h function AH=60h, where you can feed it any string, with or without a path, absolute or relative, and it returns the full c:\path\file.ext or \\place\file.ext
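In the absence of a built-in identifier doing the same thing, one possible workaround is asking the Scripting.FileSystemObject COM object to resolve the path. This is only a sketch, and the $fullpath alias name is made up here:

alias fullpath {
  ; GetAbsolutePathName collapses .\ and ..\ into a full path, without requiring the file to exist
  .comopen fsobj Scripting.FileSystemObject
  if ($comerr) { return $1- }
  noop $com(fsobj,GetAbsolutePathName,1,bstr,$1-)
  var %result = $com(fsobj).result
  .comclose fsobj
  return %result
}

//echo -a $fullpath(c:\mirc\..\users\owner\documents\filename.txt)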

