Indexing a file with 3,000,000 records FAST ?
Jeff Harrison
jeffaharrison at yahoo.com
Fri Mar 10 11:06:46 PST 2006
--- Kenneth Brody <kenbrody at bestweb.net> wrote:
[snip]
> From the "technical papers" on fPTech's website:
>
>     Tells dxmaint to use 4.1 (multi-pass) style index build. Use this
>     on HUGE (i.e.: multi-million record) files to improve performance
>     (will cause poor performance on smaller files).
>
> In other words, the "old" style of building an index was to read a
> chunk of the records, sort that chunk, save it to disk, and then
> continue on to the next chunk. When the entire file had been read, a
> second pass merged the pre-sorted chunks into the final sorted index.
> (Even older versions of filePro would then take a third pass to
> convert this sorted list into auto-index format.)
>
> With the "new" style indexes, it is possible to build the index in a
> single pass.
>
> However, the single-pass algorithm is slightly less efficient per
> record than the multi-pass method, though it eliminates the overhead
> of the second pass entirely. As the number of records gets into the
> millions, the accumulated loss of efficiency overtakes the savings of
> skipping a pass, and the single-pass method becomes less efficient
> overall than the old multi-pass method.
>
> [...]
>
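For anyone curious, the multi-pass build Kenneth describes is essentially
an external merge sort. A rough sketch in Python, assuming simple string
keys held in memory (the real dxmaint works on filePro's own index
structures, of course):

```python
import heapq
import tempfile

def multipass_build(keys, chunk_size):
    # Pass 1: sort each chunk of keys and spill it to a temp file,
    # just as the "old" style build sorts one chunk at a time.
    runs = []
    for start in range(0, len(keys), chunk_size):
        chunk = sorted(keys[start:start + chunk_size])
        run = tempfile.TemporaryFile(mode="w+")
        run.writelines(key + "\n" for key in chunk)
        run.seek(0)
        runs.append(run)
    # Pass 2: k-way merge of the pre-sorted runs into the final
    # sorted index order.
    merged = [line.rstrip("\n") for line in heapq.merge(*runs)]
    for run in runs:
        run.close()
    return merged
```

The second pass only ever compares the head of each run, which is why
the per-record work stays cheap even with millions of records.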
In addition to trying PFBIXBUILD=2, you may want to try building
multiple indexes at the same time.
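Since each index is built independently of the others, the separate
builds can run concurrently instead of one after another. A toy sketch
of the idea, using threads and a stand-in sort in place of actual
dxmaint invocations (build_index and its key_field parameter are
hypothetical helpers, not filePro calls):

```python
from concurrent.futures import ThreadPoolExecutor

def build_index(key_field, records):
    # Stand-in for one index build: sort the records on one key field.
    # (A real build would invoke dxmaint for that index instead.)
    return key_field, sorted(records, key=lambda rec: rec[key_field])

def build_indexes_in_parallel(records, key_fields):
    # Kick off one build per index and let them run at the same time,
    # rather than building the indexes sequentially.
    with ThreadPoolExecutor() as pool:
        results = pool.map(lambda f: build_index(f, records), key_fields)
        return dict(results)
```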
Jeff Harrison
jeffaharrison at yahoo.com
Author of JHExport and JHImport. The easiest and
fastest ways to generate code for filePro exports and imports.