old data/archive records
Butch Ammon
butch at rich.srcoils.com
Mon May 10 11:16:29 PDT 2004
Good afternoon, everyone. It's been a while since I posted, but I'm still
around, lurking in the background, and still busy with filePro on SCO Unix here
at Super Radiator Coils in Richmond, VA.
Anyway, here is a question about record sizes and how filePro searches and
reads records. We have an archive file that dates back to 1983, simply called
"oldorder". At the end of each month, the current records from the main "order"
file get archived to "oldorder". No biggie... but there are well over 82,000
records in "oldorder" right now, and it takes quite a while for filePro to
generate a report because it has to read all 82,000+ records first. Is there a
faster way to read records in large files? Does anyone keep archives of their
archive files, or split them into year-by-year individual files?
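For what it's worth, the reason an index helps so much here is the usual scan-versus-lookup tradeoff: without a matching index, a report has to read every record in the file, while an index lets it jump straight to the records that qualify. Here is a generic sketch of that idea in plain Python (this is not filePro code; the record layout and field names are made up purely for illustration):

```python
import bisect

# Simulate an archive of ~82,000 records keyed by order number.
records = [{"order_no": n, "year": 1983 + (n % 22)} for n in range(82000)]

def sequential_lookup(recs, order_no):
    """Full-file scan: reads every record until a match is found."""
    for rec in recs:
        if rec["order_no"] == order_no:
            return rec
    return None

# Build a sorted (key, position) list, playing the role of an index file.
index = sorted((rec["order_no"], pos) for pos, rec in enumerate(records))
keys = [k for k, _ in index]

def indexed_lookup(recs, order_no):
    """Index lookup: binary search on the key, then one record read."""
    i = bisect.bisect_left(keys, order_no)
    if i < len(keys) and keys[i] == order_no:
        return recs[index[i][1]]
    return None

# Both paths find the same record; the indexed path touches far fewer records.
assert sequential_lookup(records, 81999) == indexed_lookup(records, 81999)
```

The sequential path touches all 82,000 records in the worst case; the indexed path does about 17 comparisons (log2 of 82,000) plus one record read. The same logic is why keeping indexes maintained on the archive file, and selecting through them, matters more and more as the file grows.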
I'm just curious... No biggie, no rush... I wonder if an 82,000+ record
database is small compared to what others have!
Thanks for any tips on archiving old records, and/or on getting filePro to
search tens of thousands of records quickly when using dreport with indexes.
Butch Ammon
Super Radiator Coils
Richmond, VA