old data/archive records

Jean-Pierre A. Radley appl at jpr.com
Mon May 10 12:06:07 PDT 2004


Butch Ammon propounded (on Mon, May 10, 2004 at 02:16:29PM -0400):

| Good afternoon everyone.  It's been a while since I posted... but I'm
| still around lurking in the background.  Still busy with filePro on
| SCO Unix here at Super Radiator Coils in Richmond, VA.
|
| Anyway, here is a question regarding record sizes and how filePro
| searches and reads records.  We have an archive file that dates back
| to 1983 and is simply called "oldorder".  At the end of every month,
| the current records from the main "order" file get archived to "oldorder".  No
| biggie...  But there are well over 82,000 records in "oldorder" right
| now.  It takes quite a while for filePro to generate a report, because
| it has to read all 82,000+ records first.  Is there a faster way to
| read records in large files?  Do you or anyone else have archive files
| of archive files or year by year individual files?
|
| I'm just curious... No biggie, no rush...  I wonder if an 82,000+
| record database is small compared to what others have!

Nothing unusual about that size even for an active file, let alone an
archive.

| Thanks for any tips on archiving old records and/or enabling filePro
| to quickly search through tens of thousands of records when using
| dreport with indexes.

What's slow?  Aren't you using the classic sort-select method with
'lookup -' to find only the records you need, then quitting the search?
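The sort-select method JP describes amounts to an indexed range scan with
early termination: use the index to position on the first matching key, read
forward in key order, and stop as soon as the key passes the end of the range,
so the report never touches the rest of the 82,000+ records.  Here is a
minimal sketch of that idea in Python (illustrative only; in filePro itself
you would build an index on the selection field and use 'lookup -' against it,
and all names and data below are made up):

```python
import bisect

# Records kept in key order, as a filePro index would keep them.
# Each entry is (key, record); the key here is an order month.
records = [
    ("2003-11", "order 101"),
    ("2003-12", "order 102"),
    ("2004-01", "order 103"),
    ("2004-01", "order 104"),
    ("2004-02", "order 105"),
]
keys = [k for k, _ in records]  # the "index": keys in sorted order

def range_scan(lo, hi):
    """Yield records with lo <= key <= hi, touching only that slice."""
    start = bisect.bisect_left(keys, lo)   # position on first matching key
    for key, rec in records[start:]:
        if key > hi:                       # past the range: quit searching
            break
        yield rec

print(list(range_scan("2004-01", "2004-01")))
```

The payoff is that the cost is proportional to the number of matching
records, not the size of the archive, which is why an indexed dreport over a
narrow date range stays fast no matter how large "oldorder" grows.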

-- 
JP
