How to Add Data Synchronization Tasks
Overview
In the Web POS (and in mobile technology in general), developers can add data synchronization tasks in a modular way. These tasks will automatically send to the backend the contents of a model, so that a process in the backend can then read it and save the appropriate information in the database.
There is a single entry point which is in charge of calling all data synchronization tasks. The tasks are called in a specific order, and if the synchronization of one fails, the subsequent ones are not called. This single entry point function is called automatically by all the actions in the Web POS which create data that needs to be synchronized (such as completing a ticket, saving a new business partner, or finalizing the cash up), but this function can also be called directly by modules which need it.
New data synchronization tasks automatically inherit most infrastructure features of the standard Web POS data synchronization tasks, such as proper context handling of the data, and error management.
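The fail-fast, ordered behaviour described above can be sketched in plain JavaScript. This is only an illustration of the flow, not the actual Web POS implementation; the function name and callback signature are made up:

```javascript
// Illustrative sketch of the single entry point: tasks run in order,
// and a failure in one task stops all subsequent tasks.
function runSyncTasks(tasks, done) {
  var completed = [];
  function next(i) {
    if (i >= tasks.length) {
      return done(null, completed); // all tasks finished successfully
    }
    tasks[i](function (err) {
      if (err) {
        return done(err, completed); // subsequent tasks are not called
      }
      completed.push(i);
      next(i + 1);
    });
  }
  next(0);
}
```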
Definition of a Data Synchronization Task
There are two main steps necessary to define a new data synchronization task:
- Define the data synchronization task in the client side, and add the task to the list of tasks in the proper position.
- Define the backend process responsible of managing the data received from the client, and add it to the list reference of available entities.
Definition of the task in the client side
A new task is defined by adding a plain Javascript object to the "dataSyncModels" array of the standard terminal object. This plain JS object has the following properties:
- model: the OB.Model class of the model containing the data that will be managed by this task
- className: The classname of the Java class in the backend which will manage the data for this task. This class must extend the POSDataSynchronizationProcess class.
- criteria (optional): a Javascript criteria object which will be added to the query executed to read the data from the model. This can be used if the task should synchronize only part of the table (for example, only records that have already been processed). If no criteria is provided, the clause "hasBeenProcessed=Y" is applied by default.
- isPersistent (optional, default false): If this property is set to true, then the records will be kept in the local database instead of being deleted after they have been sent to the backend.
- postProcessingFunction (optional): a function which will be executed after the records have been sent to the backend. It receives two parameters:
- data: a collection which contains the data which has just been sent to the backend
- callback: a function which must be executed after the postProcessingFunction has finished its work. If this function is not executed, the synchronization flow will be interrupted, so it is very important to call it in all relevant circumstances.
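Putting these properties together, a task definition could look like the following sketch. The model name, class name, and the MYMOD_ prefix are hypothetical, and the object is shown standalone; in a real module the model property would be the actual OB.Model class and the object would be pushed onto the terminal's dataSyncModels array:

```javascript
// Sketch of a dataSyncModels entry (hypothetical names throughout).
var syncTask = {
  model: 'OB.Model.MYMOD_MyData',          // normally the OB.Model class itself
  className: 'org.mymodule.MyDataLoader',  // backend class extending POSDataSynchronizationProcess
  criteria: { hasBeenProcessed: 'Y' },     // optional; this is also the default clause
  isPersistent: false,                     // optional; true keeps records locally after sync
  postProcessingFunction: function (data, callback) {
    // executed after the records have been sent to the backend
    console.log(data.length + ' record(s) synchronized');
    callback(); // always call back, or the synchronization flow stops
  }
};

// In a real module:
// OB.MobileApp.model.get('dataSyncModels').push(syncTask);
```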
Definition of the task in the backend side
There are two steps needed in this part. First, a new Java class must be created. This class must extend the POSDataSynchronizationProcess class, must define the "@DataSynchronization" annotation specifying the entity which is going to be synchronized, and must implement the saveRecord method (which receives a JSONObject parameter containing the data for a single record).
This class must implement, in the saveRecord method, whatever logic is necessary to synchronize a single record in the database. No other logic is needed, as the mobile infrastructure will automatically go through all the records received, set the appropriate context for them, and call the saveRecord method for each one.
If an error happens while processing a record, the saveRecord method should throw an exception with a readable message. The mobile infrastructure will automatically capture the exception and save the record in the Errors table of the Web POS, just as the standard Web POS synchronization tasks (such as ticket synchronization) do. The user will then be able to see the error in the Errors While Importing POS Data window.
Secondly, a new entry must be added to the List Reference called Types of data to be synchronized in Web POS. This entry must be created in the module the developer has added the new class to, and must have a search key which matches the name of the entity being synchronized. This search key must also match the value of the "entity" attribute set in the annotation of the Java class.
This list reference is used in the "Errors while importing POS data" window.
Example of a Data Synchronization Task
To illustrate how a synchronization task is defined, we are going to use an example. We are going to implement a very simple loyalty program module. This module will store loyalty transactions associated with a business partner every time a ticket is created. These loyalty transactions will then be synchronized to the backend, and a tab in the business partner window will show them.
Creating the model and then adding the sync task
First we need to create a model in the client and add it in a sync task:
var modelObj = {
  modelName: 'LOYAL_LoyaltyPoints',
  tableName: 'LOYAL_LoyaltyPoints',
  entityName: 'LOYAL_LoyaltyPoints',
  source: '',
  local: true
};
var LoyaltyPoints = OB.Data.ExtensibleModel.extend(modelObj);

LoyaltyPoints.addProperties([{
  name: 'id',
  column: 'id',
  primaryKey: true,
  type: 'TEXT'
}, {
  name: 'amount',
  column: 'amount',
  type: 'NUMERIC'
}, {
  name: 'c_bpartner_id',
  column: 'c_bpartner_id',
  type: 'TEXT'
}]);

OB.Data.Registry.registerModel(LoyaltyPoints);

OB.MobileApp.model.get('dataSyncModels').push({
  model: OB.Model.LOYAL_LoyaltyPoints,
  className: 'org.openbravo.retail.loyaltyExample.LoyaltyLoader'
});
This code will create a new model in the client side, with its associated local table. We would then use the appropriate hooks to fill this table every time the user completes a new ticket.
As we have added the model as a sync task by pushing an object to the dataSyncModels array, every time a synchronization is done, after all the orders, customers and cash ups have been sent to the backend, the process will automatically read the entries of our LOYAL_LoyaltyPoints local table and send them to the backend through an Ajax call to the LoyaltyLoader Java class.
Now we are going to show how this Java class can be implemented.
Creating the processing class in the backend
The class in the backend can be created this way:
@DataSynchronization(entity = "LOYAL_LoyaltyPoints")
public class LoyaltyLoader extends POSDataSynchronizationProcess {

  public JSONObject saveRecord(JSONObject json) throws Exception {
    LOYAL_LoyaltyPoints lp = OBProvider.getInstance().get(LOYAL_LoyaltyPoints.class);
    lp.setBusinessPartner(OBDal.getInstance().get(BusinessPartner.class,
        json.getString("c_bpartner_id")));
    lp.setAmount(json.getDouble("amount"));
    // persist the new record; without this the record would not be saved
    OBDal.getInstance().save(lp);

    final JSONObject jsonResponse = new JSONObject();
    jsonResponse.put(JsonConstants.RESPONSE_STATUS, JsonConstants.RPCREQUEST_STATUS_SUCCESS);
    jsonResponse.put("result", "0");
    return jsonResponse;
  }
}
Adding the list reference
Finally, we need to add a reference entry for the entity we have created. For this, we log in to the ERP as System Administrator and open the Reference window.
Using the Data Import Entry Synchronization Process - Handling High Volume Data Sync
If your business case involves high volume updates in unpredictable load situations, it can make sense to synchronize your data using the data import entry framework.
Note that the import entry framework can be configured using [[1]]. One of the properties allows you to disable asynchronous processing entirely, so that records are processed synchronously when the request arrives (not advised for high load situations).
To make use of this framework you have to implement the same artifacts as described in the previous section. In addition, the following has to be done:
- let the loader class implement a specific interface
- implement a processing controller which controls the loading in the backend
If your data is synced through the data import entry approach, it goes through a two-stage process:
- when the request from the client is received on the server, a record is first created in the data import entry table.
- a background process, which can use multiple threads, then processes these records.
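The two stages can be sketched standalone in plain Java. This is only an illustration of the pattern, not the real framework API: the class, queue, and method names are made up, and the real framework stores entries in the data import entry table rather than an in-memory queue:

```java
import java.util.concurrent.*;

// Illustrative sketch of the two-stage pattern used by the import entry
// framework: stage 1 only enqueues the record so the request returns fast;
// stage 2 processes queued records in background threads.
public class TwoStageSketch {
  private final BlockingQueue<String> importEntries = new LinkedBlockingQueue<>();
  private final ExecutorService workers = Executors.newFixedThreadPool(2);

  // Stage 1: the incoming request only stores an "import entry" and returns.
  public void onRequestReceived(String jsonPayload) {
    importEntries.add(jsonPayload);
  }

  // Stage 2: a background thread drains the queue and does the real work.
  public void startBackgroundProcessing(ConcurrentLinkedQueue<String> processed) {
    workers.submit(() -> {
      String entry;
      while ((entry = importEntries.poll()) != null) {
        processed.add("processed:" + entry); // real code would invoke the loader class
      }
    });
  }

  public void shutdown() {
    workers.shutdown();
    try {
      workers.awaitTermination(5, TimeUnit.SECONDS);
    } catch (InterruptedException e) {
      Thread.currentThread().interrupt();
    }
  }
}
```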
This approach should be used in high volume environments (100+ requests per minute).
Let's discuss how to implement the two classes mentioned above.
The Loader Class
The loader class has to be implemented in the same way as described in the previous section:
@DataSynchronization(entity = "BusinessPartner")
public class CustomerLoader extends POSDataSynchronizationProcess
    implements DataSynchronizationImportProcess {

  protected String getImportQualifier() {
    return "BusinessPartner";
  }
}
The only differences with the previous section are:
- you need to implement the DataSynchronizationImportProcess interface, and
- implement a getImportQualifier method returning a unique value; it is best to use your module prefix as part of the name. You can, for example, use the value/search key of the list reference entry added in the previous section.
The value of the import qualifier is used when processing the import record.
Note: another use of the import entry framework is in the Multi-Server_Process_Calls_Concept. For that use case there is no need to implement the DataSynchronizationImportProcess interface, but you should inherit from the MultiServerDataSynchronizationProcess class.
The ImportEntryProcessor Class
The thread-processing class requires some more specifics, but if you extend an existing class it only takes a few lines of code.
The import entry processor class consists of the following parts:
- a CDI qualifier using the same value as returned from the getImportQualifier method
- an inner class which returns the Loader class created in the previous section. This inner class can extend the MobileImportEntryProcessorRunnable class.
- a method to identify whether an import entry can be handled by this implementation. Checking the type of data of the import entry (see below) against the value returned by the getImportQualifier method (see previous section) is often enough.
- a key which helps the data import entry framework determine which entries can be processed in parallel and which must be processed sequentially; entries with the same key are processed sequentially. For example, if entries of the same store should be processed sequentially but entries of different stores can be processed in parallel, return the organization of the entry as the key.
An example makes this clearer:
@ImportEntryQualifier(entity = "BusinessPartner")
@ApplicationScoped
public class CustomerImportEntryProcessor extends ImportEntryProcessor {

  protected ImportEntryProcessRunnable createImportEntryProcessRunnable() {
    return WeldUtils.getInstanceFromStaticBeanManager(BusinessPartnerRunnable.class);
  }

  protected boolean canHandleImportEntry(ImportEntry importEntryInformation) {
    return "BusinessPartner".equals(importEntryInformation.getTypeofdata());
  }

  protected String getProcessSelectionKey(ImportEntry importEntry) {
    return (String) DalUtil.getId(importEntry.getOrganization());
  }

  private static class BusinessPartnerRunnable extends MobileImportEntryProcessorRunnable {
    protected Class<? extends DataSynchronizationProcess> getDataSynchronizationClass() {
      return CustomerLoader.class;
    }
  }
}
The above implementation shows the parts which need to be added:
- extend the default implementation: ImportEntryProcessor
- implement createImportEntryProcessRunnable, returning an instance of the inner class obtained through Weld.
- the inner class which returns the class doing the actual loading of the entry (note: from the 16Q3 release onwards you should override the getJSONProcessorClass method in this inner class).
- a method to identify whether an import entry can be handled by the processor. Here you can also add additional checks to decide whether an entry can be processed; all the relevant information is available.
- a method to help decide which entries can be processed in parallel and which should be done sequentially.
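The effect of the process-selection key can be illustrated standalone. The store IDs and class name below are made up, and the real framework performs this grouping internally; the point is simply that entries sharing a key end up in one ordered group (processed sequentially), while different groups may be handled by parallel threads:

```java
import java.util.*;

// Standalone illustration of key-based dispatching: entries with the same
// key are collected into one ordered list (sequential processing); separate
// lists (different keys) could each be given to a different worker thread.
public class KeyGrouping {
  // each entry is a pair {key, payload}; LinkedHashMap preserves arrival order
  public static Map<String, List<String>> groupByKey(List<String[]> entries) {
    Map<String, List<String>> groups = new LinkedHashMap<>();
    for (String[] e : entries) {
      groups.computeIfAbsent(e[0], k -> new ArrayList<>()).add(e[1]);
    }
    return groups;
  }
}
```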
Import Entry Post Processing
To add actions/code which are executed after an import entry has been processed, you can use the ImportEntryPostProcessor class.
See the example below. You need to set the ImportEntryQualifier annotation; its entity attribute should be set to the type of data of the import entry you want to post-process.
Note:
- the post processor is called after the import entry has been processed and committed to the database.
- if the post processor does any database actions, then it needs to commit/rollback before returning from the afterProcessing method.
@ImportEntryQualifier(entity = "Order")
public class POSImportEntryPostProcessor extends ImportEntryPostProcessor {

  @Override
  public void afterProcessing(ImportEntry importEntry) {
    System.err.println("After processing " + importEntry.getTypeofdata());
  }
}
Current Data Synchronization Tasks
The standard data synchronization tasks are:
- ChangedBusinessPartners: newly created business partners
- ChangedBPlocation: created addresses for business partners
- Order: new tickets
- CashManagement: new cash management movements
- CashUp: new cash ups
These are the default contents of the dataSyncModels array. You can add your own tasks at the end of the array, or at an earlier position if it makes sense for your entity to be synchronized before any of the standard ones.
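As an illustration of choosing the position, consider the following sketch. Plain strings are used here for clarity and the MYMOD_ task names are hypothetical; the real dataSyncModels array contains task objects:

```javascript
// Default task order, represented by names only for this illustration.
var dataSyncModels = ['ChangedBusinessPartners', 'ChangedBPlocation',
                      'Order', 'CashManagement', 'CashUp'];

// Append at the end (the most common case):
dataSyncModels.push('MYMOD_LoyaltyPoints');

// Or insert before 'Order' if the entity must reach the backend before
// tickets, e.g. because tickets reference it:
var pos = dataSyncModels.indexOf('Order');
dataSyncModels.splice(pos, 0, 'MYMOD_Coupons');
```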