You seem to have missed the point of the idea: if you can read from the end of the file backwards, you don't need the HUGE number of disk accesses that would be required to read from the beginning of the file.
An example:
Assume the file has 50MB of data with an average line length of 50 characters (including $crlf), so 1,000,000 lines.
You want to read the last 10, so:
var %m = $lines(filename) | var %i = %m - 9 | while (%i <= %m) { echo -a . $read(filename,nt,%i) | inc %i }
OK, $lines takes 1,000,000 line reads just to get the number of lines, so %m is 1000000 and %i is 999991.
$read(filename,999991) reads 999,991 lines, 999992 reads 999,992 lines, etc., up to 1000000 which reads 1,000,000 lines.
Total lines read: 10,999,955, all to get 10 lines out.
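If you want to check that total yourself, paste this into mIRC (1,000,000 for the $lines call plus the sum of 999,991 through 1,000,000 for the reads):
//echo -a $calc(1000000 + (999991 + 1000000) * 10 / 2)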
Now imagine $endread:
var %i = 10 | while (%i) { echo -a . $endread(filename,nt,%i) | dec %i }
$endread(filename,10) reads the last 10 lines, 9 the last 9, 8 the last 8, etc., down to 1 which reads the last 1.
Total lines read: 55.
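Again, easy to check (the sum 10 + 9 + ... + 1):
//echo -a $calc((10 + 1) * 10 / 2)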
10,999,955 vs 55 (well, I know which one I would use).
* I don't make any assumptions about how the file is internally read; I know it's not read line by line from the disk but rather in blocks, but the same number of physical disk accesses would be required as the file is read through to reach the lines requested. *
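To illustrate what I mean about blocks, here is a rough script-level sketch of the idea using /bread: read only the tail of the file in one block and pull the last lines out of that. The alias name (tailread), the 4096-byte block size, and the example filename are my own inventions; it assumes the wanted lines all fit in that last block and that the file uses $crlf line endings, and the exact byte offsets may need adjusting, so treat it as an illustration of the disk-access saving rather than a finished $endread.
alias tailread {
  ; $1 = filename, $2 = number of lines wanted from the end
  var %size = $file($1).size, %block = 4096
  var %start = $iif(%size > %block, $calc(%size - %block), 0)
  ; one physical read: only the tail block of the file
  bread $qt($1) %start $calc(%size - %start) &tail
  var %text = $bvar(&tail, 1, $bvar(&tail, 0)).text
  ; split on LF (ascii 10); the first token may be a partial line,
  ; which is why a real $endread would step back block by block
  ; until it had enough complete lines
  var %total = $numtok(%text, 10)
  var %i = $iif(%total > $2, $calc(%total - $2 + 1), 1)
  while (%i <= %total) {
    echo -a . $remove($gettok(%text, %i, 10), $cr)
    inc %i
  }
}
Something like /tailread data.txt 10 would then touch only the last 4096 bytes of the 50MB file instead of walking through all 1,000,000 lines.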