old data/archive records
GCC Consulting
gcc at optonline.net
Mon May 10 17:08:25 PDT 2004
-----Original Message-----
From: filepro-list-bounces at lists.celestial.com
[mailto:filepro-list-bounces at lists.celestial.com] On Behalf Of Butch Ammon
Sent: Monday, May 10, 2004 2:16 PM
To: filepro-list at lists.celestial.com
Subject: old data/archive records
Good afternoon everyone. It's been a while since I posted... but I'm still
around lurking in the background. Still busy with filePro on SCO Unix here at
Super Radiator Coils in Richmond, VA.
Anyway, here is a question regarding record sizes and how filePro searches and
reads records. We have an archive file that dates back to 1983 and is simply
called "oldorder". At the end of every month, the current records from the main
"order" file get archived to "oldorder". No biggie... But there are well over 82,000
records in "oldorder" right now. It takes quite a while for filePro to generate
a report, because it has to read all 82,000+ records first. Is there a faster
way to read records in large files? Do you or anyone else have archive files of
archive files, or year-by-year individual files?
I'm just curious... No biggie, no rush... I wonder if an 82,000+ record
database is small compared to what others have!
Thanks for any tips on archiving old records and/or enabling filePro to quickly
search through tens of thousands of records when using dreport with indexes.
Butch Ammon
Super Radiator Coils
Richmond, VA
-----------------------------------------------
Butch,
I have a client with an edit-order file of over 400,000 records. Using -0 lookups
to print reports or order details takes just seconds. Before I changed to -
lookups, it could take 15 minutes or more to select records for printing.
Once I did this for that file, they had me convert most of the sort/selection
programs to use - lookups where possible.
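
For list readers who haven't tried this: the speedup comes from letting an index
find the matching record directly instead of reading all 82,000+ records. A rough
sketch of an indexed lookup in filePro processing follows -- the alias name, key
field number, and index letter here are made up for illustration, so adjust them
to your own file layout:

```
If:
Then: lookup old = oldorder k=1 i=A -nx
If: not old
Then: end
```

If memory serves, the -nx flag makes the lookup fail cleanly when no exact match
exists on the index, rather than positioning to the nearest record, so processing
only ever touches the records it actually needs.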
Richard Kreiss
GCC Consulting