number of records answer...sort of :)

Bill Vermillion fp at wjv.com
Mon Dec 20 22:07:02 PST 2004


On Tue, Dec 21 00:05, Fairlight moved his mouse, rebooted for the 
change to take effect, and then said:

> Well, I created an alien file on SCO OSR 5.0.6 of just single
> line Y records. I got a file that was 1999999998 records long
> before I hit the 2GB file limitation. This is with a single
> character field with a single character between records, where
> fP would have a 20-byte header before each record.

> After an alien overlay, I got the following from dclerk:
> 
>          Enter Record Number (1-1999999998):

> Problems:

>      1) You can only enter 8 digits.  You can only go to the 99,999,999th 
>         record via IUA "Select by Record Number", and @rn apparently is 
>         limited to 8 places as well, as you found.  You can enter up to the
>         eight 9's, and then down arrow and go manually, and it works--and
>         even displays the correct record number in the bottom-right corner.
>         You just can't get there manually or use @rn to reference them.
> 
>      2) You'll never be able to generate more than that number of records
>         even with one character per field, one field per record, on a 32-bit
>         fs that suffers from the 2-gig file limit.  
>                int(2000000000 / 21) = 95,238,095


Actually the 2GB file size allows a bit more than that.  Use that
handy-dandy binary/oct/hex/dec calculator you have in your Windows
system and you'll find that 2^31/21 = 102,261,126 + a fraction.

So it's larger than 95,238,095, but if the limit in the field
is 99,999,999 then the largest file you can access with that
would be 2,099,999,979 bytes.
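A quick sanity check of the arithmetic above (the 21-byte figure is the
quoted assumption of a 1-byte field plus fP's 20-byte record header):

```python
RECORD_SIZE = 21            # 20-byte fP header + 1-byte field (assumed above)
FILE_LIMIT = 2**31          # 2GB limit on a 32-bit filesystem

# Records that fit under the true 2^31-byte limit
print(FILE_LIMIT // RECORD_SIZE)        # 102261126

# The rounder "2,000,000,000" figure from the earlier post
print(2_000_000_000 // RECORD_SIZE)     # 95238095

# File size needed to reach the 8-digit @rn ceiling
print(99_999_999 * RECORD_SIZE)         # 2099999979
```

So the 8-digit record-number field, not the filesystem, is the first
limit you hit at this record size.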

> So basically it doesn't make a bit of difference on 2-gig
> file limited versions (or at least platforms, assuming fP is
> adjusted for UnixWare, AIX, Solaris and such). You can't create
> more than that many records, and that's a totally useless file
> due to the lack of distinctive data. Not being able to select
> or discriminate data past the @rn limitation is superseded by
> the 2-gig file limit currently in force.

But you should be able to split the key file into multiple
segments, each under 2GB long, shouldn't you?  I believe you can
have the original key and 3 more extents.

> Personally, I think if you have over a hundred million records,
> you probably need to be on something enterprise-grade like DB2,
> for various other considerations besides this stuff.

Do you dislike Ellison and all he stands for :-)

Bill
 
-- 
Bill Vermillion - bv @ wjv . com


More information about the Filepro-list mailing list