
Recommended Posts

Posted (edited)

I am syncing data from 10 tables from the host to an iPad

The tables range from 20 fields up to 190 fields

I have denormalised these tables so that they contain an absolute minimum of calculation fields, and I have removed indexing from those fields that do not need it

The collection of the data to load onto the iPad is quick, but it then takes several hours to process the approx. 5k records received as a payload!

I have potential datasets which could reach 15k-20k records in total!

The iPad(s) are remote and cannot be preloaded, as they manage data for tours and the tours change, which means the user has to wipe the device and then go and get the new tours which have been activated for him/her

This processing time is unacceptable for the users, so what should I be looking at to optimise how the data is processed once it has been brought down to the iPad?

Cheers

Harry

Edited by Harry Catharell
Posted

Are you preserving any records on the iPad? If not, one option is to run a process that overwrites the old file with a new file. That would be faster.

How large (in MB) is the file once it's been downloaded...how much data is being moved in bits?

Posted

Hi David

 

Thanks for the reply

The file comes in at around 15MB with data in it

I have striven hard to keep the footprint down

What do you mean by 'how much data is being moved in bits' ?

Do you mean the payload ?

Cheers

Harry

Posted

An additional question to anyone out there using it in a production environment:

What sort of processing time should I be getting for, say, 10,000 records spread across 10 tables ?

Anyone got anecdotal feedback ?

 

Cheers

Harry

Posted

I don't think there is a simple answer to your questions.

If you are not already doing it, I think you should try using transactional scripting. In my experience that optimizes performance greatly, as well as having a bunch of other advantages.

There are also at least two things that have an impact: the transfer speed of the records and the processing on the device. The transfer speed is related to the connection between the device and the server, which is something you might not be able to control. Processing speed is something you can tweak to a certain degree. However, the amount of records, fields and tables you are outlining will always require some time. I think it will be important that you make that clear to your users in advance.
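
For what it's worth, here is a rough, generic illustration of the principle (Python with SQLite standing in for the local file, not FileMaker script steps or anything from EasySync) showing why committing once per batch beats committing after every record:

    import os, sqlite3, tempfile, time

    # Temporary on-disk database so each commit pays a real write cost.
    db_path = os.path.join(tempfile.mkdtemp(), "sync_demo.db")
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE tour_data (id INTEGER PRIMARY KEY, payload TEXT)")
    rows = [(i, "record %d" % i) for i in range(1000)]

    # Commit after every record: each write pays the full commit overhead.
    start = time.time()
    for row in rows:
        conn.execute("INSERT INTO tour_data VALUES (?, ?)", row)
        conn.commit()
    per_record = time.time() - start

    # Transactional: do all the writes for the batch, then commit once.
    conn.execute("DELETE FROM tour_data")
    conn.commit()
    start = time.time()
    conn.executemany("INSERT INTO tour_data VALUES (?, ?)", rows)
    conn.commit()
    batched = time.time() - start

    print("per-record commits: %.2fs, single transaction: %.2fs" % (per_record, batched))

The FileMaker analogue, as transactional scripting is usually described, is to create or edit all the related records for a batch through relationships from a single context and commit once at the end, rather than committing record by record.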

Posted

Hi Claus

Thanks for the feedback and advice

I thought that the nature of FMEasySync was transactional when it was processing the incoming data. So is there another aspect to transactional scripting, outside of the way it works now, that you can shed some light on, please?

Cheers

Harry

Posted

EasySync is transactional. What's nice about that is that once the Payload is received, you do not need to have the Host connection. If the iPad goes to sleep, it'll resume processing the payload when it awakes and your data retains its integrity.

Are you including images? That'll bog down the process significantly. Can you narrow down the record set that you are syncing? You mention that the user has to "go and get the new tours which have been activated for him/her". So is this 20K records?

We're moving to providing a portal of orders (perhaps you would provide a tour listing) so that, if the user decides to see detail, just that order and its children are retrieved via sync. The initial list is obtained via a PSOS ExecuteSQL call, but we might go to a simple import.
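
In rough pseudocode (the names and fetch_* functions below are invented, not EasySync or PSOS calls), the shape of that "list first, detail on demand" flow is:

    # Hypothetical sketch of "list first, detail on demand". The fetch_*
    # functions are stand-ins for whatever moves the data (a PSOS/ExecuteSQL
    # call for the listing, a narrow sync or import for the detail).

    def fetch_tour_listing(user_id):
        # Lightweight rows only: keys plus display text for the picker.
        return [{"id": 101, "label": "Tour A"}, {"id": 102, "label": "Tour B"}]

    def fetch_tour_detail(tour_id):
        # Pull this one tour and its child records only when the user drills in.
        return {"id": tour_id, "children": ["stop 1", "stop 2"]}

    listing = fetch_tour_listing(user_id="harry")   # small, fast payload
    chosen = listing[0]["id"]                       # user taps a row
    detail = fetch_tour_detail(chosen)              # only now move the heavy data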

Posted

Hi David

The sync can take place over Wi-Fi, but they may have to use 3G or 4G if there is no Wi-Fi where the user happens to be at the time - it could vary between the two

By 'processing' I mean that the device has successfully received the payload from the server and is then working through 'processing record x of y' to extract the data per table, per field, and write it to the file structure on the device

The collection and receipt of the data from the server is not an issue; it works fluidly and in a short time. It is only when the payload has reached the device and it is running through the 'Pull Payload' script portion, which processes the data in that payload, that the counter moves very slowly over the records (seconds per record), creating an inordinate amount of time to write the data into the local file structure

At this point in time, the host file has 5.2k records - to get that data down to a vanilla device on a first sync would be upwards of 5.5 hours!

Hope this helps to clarify a bit more

Cheers

Harry

 

  • 2 weeks later...
Posted

I'm having the same problem and wondered if other users of FM EasySync can shed light on how long (roughly) it actually takes to process a single record..? My solution is taking around 3-4 seconds per record... 

  • 1 month later...
Posted (edited)

I have worked extensively with EasySync and you are correct in your assessment. My testing indicated that the time to process each record (after the payload has been saved locally on the iOS device) increases with payload size. So say you have a payload with 1000 text-only records from one table and one record from another table with a container field: the addition of the one record with container data will have a significant impact on the processing of each of the 1000 text-only records.
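
One plausible contributor (an assumption on my part about where the time goes, not something confirmed from the EasySync code) is that extracting record N by index from one large delimited text payload re-scans the text from the start each time, so total processing grows faster than linearly with payload size. A toy Python sketch of the effect:

    import time

    # Toy payload: 5000 "records", one per line, roughly what a text-only
    # payload looks like once it has landed on the device.
    payload = "\n".join("record %d|field1|field2" % i for i in range(5000))

    # A: fetch each record by its index from the full text every time
    # (re-parsing the whole payload for every record).
    start = time.time()
    for i in range(5000):
        record = payload.split("\n")[i]
    re_parse = time.time() - start

    # B: split the payload once, then walk the resulting list.
    start = time.time()
    for record in payload.split("\n"):
        pass
    parse_once = time.time() - start

    print("re-parse per record: %.2fs, parse once: %.4fs" % (re_parse, parse_once))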

I took several approaches to this problem:

1. I added the ability for EasySync to work with subsets of the ES TOs, so I can specify a sync to work only with TOs prefixed ESA_ or ESB_, etc. This means I can pull one or more tables independently, which reduces the payload size and the number of records each payload contains. I separate TOs with containers from text-only TOs.

2. I added the ability to add custom queries to the Pull phase of a sync, which allows me to pull only those records needed on the device in question at that time (there is a rough sketch of the idea after this list).

3. I used EasyDeploy to 'upgrade' the solution whenever I needed to 'reload' a large amount of reference data. This is much, much faster than a sync.
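
To give a flavour of point 2 (the table and field names below are invented, and this is plain Python rather than the FileMaker calculation EasySync actually builds), the custom query simply narrows the pull to the rows a given device needs:

    # Hypothetical sketch of a narrowed "pull" query. Table/field names are
    # made up; the idea is simply to fetch only the rows this device needs
    # instead of the entire table.

    def build_pull_query(table, device_id, modified_since):
        # Parameterised so the filter values stay out of the SQL string.
        sql = "SELECT * FROM %s WHERE device_id = ? AND modified_ts > ?" % table
        return sql, (device_id, modified_since)

    sql, params = build_pull_query("ESA_TourStops", "iPad-07", "2015-09-01 00:00:00")
    print(sql, params)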

EasySync is a general purpose sync framework, and there is of course an overhead for the flexibility of referencing everything by name, in addition to that of parsing the significant amount of text in a large payload.

I discussed with Tim adding my enhancements to the official version, but there has never been another release...

I have some further ideas for EasySync and have thought about forking the code, but don't have the time right now...

In your specific case, if you can't preload and upgrade the file, perhaps EasySync is not the best technique.  In my scenario (with different requirements to yours) I ended up using EasySync for some of the sync and an alternative technique for the rest.

I hope this is of some use to you...

 

Paul Jansen

Edited by Paul Jansen
