Index too big error
Jose Lerebours
fpgroups at gmail.com
Tue Sep 6 08:37:32 PDT 2016
On 09/06/2016 11:27 AM, Richard Kreiss via Filepro-list wrote:
> This client has a number of extremely large files.
> The largest, just over 21 million records and growing (2 of these) - 6.6 GB and 3.3 GB.
> There are at least 2 files of between 1 and 2 million records - under 1 GB in size.
Just what kind of application needs to have over 21M records in a
table? Even the IRS allows you to destroy records after 5 years
(or is it 3?) - of course, if you are a Clinton, you can destroy
them at will ;-)
I am sure the problem of size is not due to the number of records
but to the size of each record. filePro users often fall into the
trap of "one single table with as many fixed fields as it takes to
get it done in less than 5 minutes" and rarely employ a normalized
data structure.
Perhaps it is time for you to:
a) look at the map, split your table, and normalize your data
b) purge records and shrink your data segments to the minimum
possible (I hope you are not using @rn as a key - see the sketch
below)
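On point b), here is a minimal sketch of why @rn (the physical
record number) makes a poor key: a purge-and-rebuild renumbers the
surviving records, while a key stored in the record itself does
not move. This is plain Python with made-up records, only to
illustrate the failure mode, not filePro code:

    # Physical positions (what @rn amounts to) are valid only
    # until a purge and rebuild renumbers the surviving records.
    records = [{"rn": i, "key": f"CUST{i:05d}", "name": n}
               for i, n in enumerate(["Ann", "Bob", "Carol"], start=1)]

    # Purge Bob, then rebuild: positions shift, stored keys do not.
    survivors = [r for r in records if r["name"] != "Bob"]
    rebuilt = [dict(r, rn=i) for i, r in enumerate(survivors, start=1)]

    for r in rebuilt:
        # Carol's "@rn" changed from 3 to 2; her stored key did not.
        print(r["rn"], r["key"], r["name"])

Any index built on @rn would now point at the wrong records; an
index built on the stored key would still be correct.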