Too Many Open Files (was Re: The CLOSE command)
Fairlight
fairlite at fairlite.com
Fri Feb 17 15:09:43 PST 2006
Yo, homey, in case you don' be listenin', John Esak done said:
>
> As to this, should you want to be even more precise about open lookups, (in
> many instances this is not very critical), you need to close lookups that
> failed as well as those that didn't.
I see too many programmers who don't even check for a failed lookup.
Notably, people just don't check for free record lookup failures. I
questioned one about the lack of a sanity check and was told it would never
happen. I said they'd think differently if they ran out of disk space.
The response was that if that happened they had far larger problems than
failing to check for a failed free record lookup. I suppose I can't argue
with that response, but it's still poor form, IMHO.
Now, to me, good programming -mandates- that you check for a possible error
condition when one is possible, and handle things accordingly. But perhaps
that's just the way I was taught--that, and my obsessive/compulsive nature.
I was told it doesn't matter how unlikely the error is
to occur, you check for it anyway. Skipping simple bounds and error checks
is what leads to things like buffer overruns in other languages--arguably
one of the biggest stability and security scourges in the industry. filePro
is thankfully more forgiving, but that's no excuse for propagating lax coding
style.
But it doesn't surprise me that someone would fail to close lookups. A lot
of times people don't even check whether they succeeded, much less bother
to close them.
Me personally? I not only close them, I do a write on them beforehand
if I've changed any data. I've been told by a few people that this is
superfluous, but I prefer knowing that it's -absolutely-, -explicitly-
written with no room for doubt or interpretation.
mark->