OPENDIR number of files problem
Bruce Easton
bruce at stn.com
Wed Oct 8 15:55:18 PDT 2008
John Esak wrote on Wednesday, October 08, 2008 at 6:12 PM:
>
> Yes, this is really strange. Ken and I worked this out on my system last
> week. You can get around the opendir truncation to 3 digits by just using
> @dirlist_rfilename["0"] (or any of the @dirlist variants) as the correct
> number of opened files.
>
> The *fact* that something can lie dormant like this for a decade... then
> hits me... and within the same couple of days or a week it hits someone
> else. Absolutely amazing. Who had this problem after I found it a couple
> of weeks ago? Was it you, Bruce?
>
[..]
>
> John
It really is weird, John - we experience something similar on certain days
here, and we wonder: do the clients and vendors have a club that we don't
know about? :):) I don't believe it was me a couple of weeks ago, but then
my memory is getting bad. :) No, today Marcia was working on this RFID
thing and I had to look into why she wasn't getting as many records
processed as she expected. I wrote the code she was using, but when I had
tested it a while back it was only against a handful of records. When I
looked in the directory being read, I saw she had over 7000 files in there
that needed processing (can you say *ARRRG!!* [list too long]? :)). But I
worked around it by changing the code to write out the list of files with ls
instead of using opendir at all - fortunately this particular program is
intended to be single-user.
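
Roughly, the workaround amounts to something like the shell-side sketch
below - the paths and file names are just placeholders, not the actual
processing code:

    # Dump the directory listing to a temp file so the program can read
    # file names from the list instead of relying on opendir's count.
    DATADIR=/appl/rfid/incoming        # placeholder data directory
    LISTFILE=/tmp/rfid_filelist.txt    # placeholder list file

    ls "$DATADIR" > "$LISTFILE"        # one name per line, no 3-digit limit

    # Quick sanity check: how many files are waiting to be processed?
    wc -l < "$LISTFILE"

The program then walks the list file one line at a time, which avoids the
opendir count problem entirely for this single-user case.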
I should say that for client systems where fp files are involved, even
remotely, I can recall only one other case where I had to work with one or
more single directories holding files numbering in the thousands. And even
in that case it is a temporary situation - the intention is for the
directory to hold only a day's data, not several weeks' worth.
Bruce
Bruce Easton
STN, Inc.