Indexing a file with 3,000,000 records FAST ?

GCC Consulting gccconsulting at comcast.net
Sat Mar 11 19:01:24 PST 2006


 
_______________________________

	From: filepro-list-bounces at lists.celestial.com
[mailto:filepro-list-bounces at lists.celestial.com] On Behalf Of Mike Fedkiw
	Sent: Friday, March 10, 2006 10:40 AM
	To: filepro-list at lists.celestial.com
	Subject: Indexing a file with 3,000,000 records FAST ?
	
	
	system
	filePro 5.0.09
	Windows 2000 Advanced Server
	P4 2.4 GHz
	2 GB of RAM
	
	BTW, I don't want to upgrade to 5.0.14; I am waiting for the next
	version. Ray said that the bug slowing browse window scrolling and
	causing flickering when using lots of show commands would be fixed
	in version 5.0.15 and above.
	
	I have a file with about three million records. Recently I deleted
	some records from this file, and now I am getting node errors on a
	particular index. I rebuilt one index this morning and it took over
	an hour. Add that up (a-p times 1 hour) and that's 16 hours. There
	has to be a faster/better way. I did try setting PFNUMIXBUF=15000,
	which helped a little. What about PFNUMIXBUILD?
	
	Any help will be greatly appreciated. I am not looking forward to
	sixteen hours of indexing for one file.
	
	Thanks in advance, Mike Fedkiw
	
Mike,

Just a "dumb" question, are you rebuilding these indexes from a workstation
or directly on the server?

Most of the responses I have read appear to assume a *nix system.

If you are building from a workstation, create a start-up script locally on
the server and rebuild from there. No network contention.
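
Something along the lines of the sketch below might do it. It is only a rough
outline: the paths, the "myfile" name, and the dxmaint invocation are
assumptions you would adjust for your own installation; PFNUMIXBUF is the
variable you already tried.

    @echo off
    rem Rough sketch of a rebuild script run locally on the server.
    rem PFPROG/PFDATA and the program directory below are assumed locations;
    rem point them at your actual filePro install.
    set PFPROG=C:\fp
    set PFDATA=C:\fp
    set PATH=%PFPROG%\fp;%PATH%
    rem Larger index-buffer count, as in your PFNUMIXBUF=15000 test.
    set PFNUMIXBUF=15000
    rem Rebuild the indexes for the problem file with filePro's index
    rem maintenance program (dxmaint). "myfile" is a placeholder, and the
    rem automatic/batch options differ by version, so check the docs for
    rem 5.0.09 or simply run dxmaint interactively from this console.
    dxmaint myfile

Run from a console session on the server itself, so all of the index I/O
stays local to that machine instead of crossing the network.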

Now, if there is a workstation with a very fast processor and fast drives, you
might consider using that machine. But I would first try Ken's suggestion and
run the rebuild directly on your server.

Richard Kreiss
GCC Consulting
 



