Tokenizing

Kenneth Brody kenbrody at spamcop.net
Thu Jun 30 07:36:12 PDT 2011


On 6/28/2011 6:21 PM, Richard Kreiss wrote:
> This is probably a Ken question.
>
> Why is it recommended that one tokenize processing tables with the auto
> processing table that will be used with them?

If you are using filePro's ability to keep multiple copies of dummy fields 
(one for each break level), and you are using that ability to get subtotals 
(for example, a simple "xx=xx+3" to total field 3, with "xx" placed on 
different break levels in the output format), then it matters which automatic 
table the output processing is tokenized against.  If the totaling field is 
defined in "regular" automatic, but not in the table you will be using as 
automatic processing in rreport, and you compile the output processing with 
"regular" automatic rather than the one you actually run with, the totals 
will not be correct.
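
To make the scenario concrete, here is a minimal sketch of the kind of 
setup being described.  The field name "xx" and the declaration syntax are 
illustrative assumptions, not taken from any particular application:

```
  ::: Automatic processing table used with rreport :::
  If:
Then: xx(10,.2)=xx+3        ' accumulate field 3 into dummy field xx

  ::: Output processing table :::
  ' "xx" is then placed on different break levels in the output
  ' format, so each break level keeps its own copy of the subtotal.
```

If "xx" is instead declared only in the "regular" automatic table, tokenizing 
the output processing against that table resolves "xx" differently than the 
table rreport runs with, which is what produces the wrong totals.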

-- 
Kenneth Brody

