Import question

Mike Dawson mikedawson at bellsouth.net
Thu Mar 11 06:16:38 PST 2021


A former fP programmer of 25 years says: why not make it dynamic?

In the IoT world of MQTT we broadcast large chunks of variably defined data
in JSON bundles, and those bundles can be encrypted at that.
One of the methods I developed (Python & MySQL) is to build the JSON bundle
dynamically and then save the entire bundle on the
receiving end in a SQL BLOB.  This way you can transfer large volumes of
data (compressed, encrypted, whatever) and then extract and store it in
a remote SQL BLOB after transmission.  As a use case:
each transmission refreshes the last known value in the remote BLOB field
of the SQL record, so you always have the last known values available.
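
A minimal sketch of that receive-and-refresh step in Python & MySQL (the
device_state table, the credentials, and the field names are all invented
for illustration):

import json, zlib
import mysql.connector  # assumes mysql-connector-python is installed

def store_bundle(conn, device_id, bundle):
    """Refresh the last-known-value BLOB for one device."""
    # Build the JSON bundle dynamically, then compress it for the BLOB
    blob = zlib.compress(json.dumps(bundle).encode("utf-8"))
    cur = conn.cursor()
    # device_id is the unique key, so each new transmission simply
    # replaces the previous payload for that device
    cur.execute(
        "INSERT INTO device_state (device_id, payload) VALUES (%s, %s) "
        "ON DUPLICATE KEY UPDATE payload = VALUES(payload)",
        (device_id, blob),
    )
    conn.commit()

conn = mysql.connector.connect(user="iot", password="secret",
                               host="localhost", database="telemetry")
store_bundle(conn, "pump-07", {"flow_rate": "100.00", "pressure": "21.20"})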

I believe that fP has BLOB ability, so you could apply the same method.
This way you completely avoid static field definitions: the field
names themselves are carried in the JSON bundle alongside each
value.  All you have to know for display purposes is the originating
field name.
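
For example, a self-describing bundle might carry its own field names
(everything here is invented for illustration):

{
  "origin": "pump-07",
  "fields": {
    "flow_rate": "100.00",
    "pressure": "21.20",
    "temp_f": "62.50"
  }
}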

Use Case Process (both orders are sketched in Python after the lists):

In Build Order:
1. Export the JSON array.
2. Compress it.
3. Store it in an fP BLOB field.

In Recover Order:
1. Locate the fP record by index identifier.
2. Read the BLOB field, decompress, and store to an array.
3. Parse the recovered array and query it.
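
In Python terms (the field names are made up, and fP's BLOB read/write
would stand in for actually storing the bytes), the round trip is just:

import json, zlib

def build_blob(fields):
    # Build order: dump the array to JSON, compress, hand back BLOB bytes
    return zlib.compress(json.dumps(fields).encode("utf-8"))

def recover_fields(blob):
    # Recover order: decompress the BLOB and parse it back into an array
    return json.loads(zlib.decompress(blob).decode("utf-8"))

fields = {"flow_rate": "100.00", "pressure": "21.20"}
blob = build_blob(fields)            # store this in the fP BLOB field
recovered = recover_fields(blob)     # after locating the record by index
print(recovered["pressure"])         # query the array by field name: 21.20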

Hope this helps.....and removes the burden of static field definition
limitations in fP

<MD>


-----Original Message-----
From: Filepro-list
[mailto:filepro-list-bounces+mikedawson=bellsouth.net at lists.celestial.com]
On Behalf Of scooter6--- via Filepro-list
Sent: Wednesday, March 10, 2021 3:58 PM
To: filePro Mailing List
Subject: Import question

So I have an import routine that imports a csv of some parsed XML.
It's pretty detailed and I use a ton of variables.
My question is: suppose I have a field that has over 100 fields
within it, separated by a ^.

For example, if I'm importing a huge list of amounts, the field I need to
break down can have 100 or more values, like

100.00^21.20^50.00^62.50^33.12..........etc

Is there an easier way than using strtok to parse large fields like this?

I originally didn't think there would ever be more than 40, but some of my
brilliant clients, who don't always know what they're doing in our
integrated partner's software, have sometimes managed to have 100 or more -
so I'm trying to see if there is a better/simpler way to do this.

Otherwise, I'm going to split it off into its own file and create a lookup
screen on the records - which I can do, but didn't really want to go that
route if not necessary.

I currently parse the incoming XML with xsltproc and create a csv file, and
that all works well - I'm trying to come up with possibly a better way to
get that into my fp databases.
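
One way to sidestep strtok entirely would be to explode the ^-delimited
field into one CSV row per amount during that xsltproc/CSV step, before fP
ever sees it. A minimal Python sketch (the file names and column positions
are hypothetical):

import csv

# Hypothetical: amounts live in column 3 of the xsltproc-generated CSV
with open("parsed.csv", newline="") as src, \
     open("amounts.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    for row in csv.reader(src):
        record_id = row[0]
        for amount in row[3].split("^"):  # works for 40 or 400 values
            if amount:
                writer.writerow([record_id, amount])

Each amount then lands as its own record, which also fits the "own file
with a lookup screen" route mentioned above.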

thanks for any input

Scott
PDM