OPENDIR number of files problem

John Esak john at valar.com
Wed Oct 8 16:16:10 PDT 2008


Wow! And I'm always complaining about our filepro folders with 2,800 files
in them!  But 7,000!  Wow!

For everyone who doesn't know: the file systems used on Unix and Windows
are not good (in retrieval speed, etc.) with folders that have more than 999
files in them.  No errors really, just terrible performance.... and not that
I know of any file systems that would do this better.  Who knows? Maybe some
mainframe file systems can do better with over 1,000 files... but, boy, you
should see how slowly our very fast servers open a directory with 2,800
files, and create files in directories that start with zero files and
eventually have 350 folders created in them, with some of them holding 2,500+
files each.  Some message boxes show the "slowness".  A lot of it is
network bandwidth, but the fact that it appears to get slower as the number
of files in each directory gets higher... bears out the old-timey
consensus that going over 999 is not a desirable thing to do.  But, simply
put, what do you do when your filePro app is actually used by hundreds of
users, all creating their own selection sets, browse formats, etc.?  Very
soon you have hundreds and hundreds and even thousands of files in your
filePro folder.  Just a fact of life.
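If you want to see the numbers for yourself, here is a small shell sketch
(the directory is a temporary stand-in, not a real filePro folder) that
builds a 2,800-file directory and counts it by reading the directory with
ls rather than expanding a glob on the command line:

```shell
# A stand-in directory with 2,800 empty files (hypothetical; real
# filePro folders live under the application tree).
dir=$(mktemp -d)
for i in $(seq 1 2800); do : > "$dir/f$i"; done

# Read the directory itself with ls instead of expanding "$dir"/* on the
# command line, which can fail with "arg list too long" at these sizes.
count=$(ls "$dir" | wc -l)
count=$((count))          # strip any padding wc adds
echo "files: $count"
rm -r "$dir"
```

This is the same idea as Bruce's ls-to-a-file workaround quoted below: ls
reads the directory entries directly, so the shell's argument-length limit
never comes into play.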

All I can say is, thank God Ken had the workaround of the sub-zero element
of @dirlist holding (and returning) the correct number.  It was crucial to
my program(s) working correctly.
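For anyone following along, the shape of the fix is roughly this... a
sketch only, from memory: the directory path and field name are made up,
and the @dirlist variant is whichever one you are already using:

```
   Then: cnt = OPENDIR("/appl/filepro/somedir", "*")
   Then: cnt = @dirlist_rfilename["0"]
```

The first line is where the problem shows up: OPENDIR's return value can
come back truncated to 3 digits.  The second line reads the sub-zero
element instead, which still holds the full count of files found.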

John



> -----Original Message-----
> From: filepro-list-bounces+john=valar.com at lists.celestial.com
> [mailto:filepro-list-bounces+john=valar.com at lists.celestial.com] On Behalf
> Of Bruce Easton
> Sent: Wednesday, October 08, 2008 6:55 PM
> To: filepro list
> Subject: RE: OPENDIR number of files problem
> 
> John Esak wrote Wednesday, October 08, 2008 6:12 PM:
> >
> > Yes, this is really strange.  Ken and I worked this out on my system
> last
> > week.  You can get around the opendir truncation to 3 digits by just
> using
> > @dirlist_rfilename["0"] (or any of the @dirlist variants) as the correct
> > number of opened files.
> >
> > The *fact* that something can lie dormant like this for a decade... then
> > hits me... and within the same couple of days it hits someone else.
> > Absolutely amazing.  Who had this problem after I found it a couple
> weeks
> > ago?  Was it you Bruce?
> >
> [..]
> >
> > John
> 
> It really is weird, John - we experience something similar on certain days
> here - and we wonder  - do the clients and vendors have a club that we
> don't
> know about? :):).  I don't believe it was me a couple of weeks ago, but
> then
> my memory is getting bad.:)  No, today, Marcia was working on this RFID
> thing and I had to look into why she wasn't getting as many records
> processed as she expected.  I wrote the code she was using, but when I had
> tested it a while back - it was only against a handful of records - when I
> looked in the directory being read, I saw she had over 7000 files in there
> that needed processing (can you say *ARRRG!!* [list too long]:))  But I
> worked around it by changing the code to write out the list of files with
> ls
> instead of using opendir at all - fortunately this particular program is
> intended as single-user.
> 
> I should say for client systems where fp files are involved or remotely
> involved I can only recall one other case where I had to work with one or
> more single directories with the number of files in the thousands.  And
> even
> in this case, this is a temporary situation - the intention is for the
> directory to hold only a day's data, not several weeks' data.
> 
> Bruce
> 
> Bruce Easton
> STN, Inc.
> 
> 
> 
> _______________________________________________
> Filepro-list mailing list
> Filepro-list at lists.celestial.com
> http://mailman.celestial.com/mailman/listinfo/filepro-list


