huge ASCII text files (one line equals one record).
The filter makes simple changes to each input
line (record) and prints it to a tmp file;
when it finishes, it replaces the original
file (after closing it, of course).
After running for an hour or so it finally
crashes with a message to the effect that it is
out of memory in function such-and-such.
I'm not using any recursive algorithms, so
I don't think the stack is running out of
space. I am running a huge number of
huge files through the filter (using "for"
loops to read the files in the archive
directory). The filter makes changes to
these archive files, which is why it can run
for hours.
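For what it's worth, the per-file loop is shaped roughly like this (a minimal C sketch, not my actual code: the real "simple change" is different, tab-to-space is just a stand-in, and the `.tmp` naming is a placeholder). With a fixed line buffer like this, memory use should stay flat no matter how many files go through:

```c
#include <stdio.h>

/* Filter one file line by line: copy each record to a tmp file,
 * applying a simple change, then replace the original after
 * both files are closed. Nothing here grows per line, so memory
 * use should be constant across millions of records. */
static int filter_file(const char *path)
{
    char tmp[4096];
    snprintf(tmp, sizeof tmp, "%s.tmp", path);

    FILE *in = fopen(path, "r");
    if (!in) return -1;
    FILE *out = fopen(tmp, "w");
    if (!out) { fclose(in); return -1; }

    char line[8192];                      /* one record per line */
    while (fgets(line, sizeof line, in)) {
        /* stand-in for the real edit: convert tabs to spaces */
        for (char *p = line; *p; ++p)
            if (*p == '\t') *p = ' ';
        fputs(line, out);
    }

    fclose(in);
    fclose(out);                          /* close both before replacing */
    return rename(tmp, path);             /* then swap in the tmp file */
}
```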
I didn't save the exact error message, but
I can start the job again and get it
verbatim if you need it.