
Reporting Server/Generating Data Loader App


The update and load scripts are created during development. At runtime the scripts run within a dedicated application called the data loader application. The architecture of the reporting solution in production is shown below.

[Image: Openbravo Reporting Server Dev Architecture.png]

The data loader application runs on the reporting server and is responsible for updating the reporting database/datawarehouse.

Data Loader Application

The data loader application is a standard Java server-side application which is controlled through ant commands.

The data loader app is generated from the definitions in the application dictionary. The main difference between running the individual scripts from the application dictionary and running the data loader app is that the data loader app runs the scripts in a defined order. This order is set by the script order field of the load script definition. The data loader app will run scripts and logic in parallel where possible, depending on the number of available processors.
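The ordering and parallelism described above can be sketched as follows. This is a minimal illustration, not Openbravo code: steps with the same reload order run in parallel, and a higher order only starts after all steps of the previous order have finished. The step `obrtmd_sales` is invented for illustration; the other identifiers match the step definitions on this page.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import groupby

# hypothetical step definitions; "obrtmd_sales" is invented for illustration
steps = [
    {"identifier": "obrtmd_bp_group", "reloadOrder": 20},
    {"identifier": "obrtmd_channel", "reloadOrder": 20},
    {"identifier": "obrtmd_sales", "reloadOrder": 30},
]

executed = []

def run_step(step):
    # the real app would execute the step's SQL load script here
    executed.append(step["identifier"])

# group steps by reload order; each batch runs in parallel,
# batches themselves run sequentially in ascending order
steps.sort(key=lambda s: s["reloadOrder"])
for _order, batch in groupby(steps, key=lambda s: s["reloadOrder"]):
    batch = list(batch)
    with ThreadPoolExecutor(max_workers=len(batch)) as pool:
        list(pool.map(run_step, batch))
```

Both order-20 steps finish before the order-30 step starts, regardless of which of the two runs first.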

Generating the data loader application

The data loader application is generated from the build/development environment. The generation needs a correctly set up configuration file. The generation is done with the following ant command, which needs to be executed from within the modules/ folder of the build environment.

ant generate.dataloader.application -DtargetDir=[TARGET_DIR]

Replace the [TARGET_DIR] with the relevant target folder.

After generating, check the log4j2.xml file (copying it from the log4j2.xml.template file if needed) and configure logging appropriately.

Data Loader Application Content

When you check the content of the data loader application you will see a folder structure like the one shown below.

[Image: Openbravo data loader app folder.png]

A short description of the main files: the jar file contains the data loader application itself; if you have custom load classes, these should also be included in this jar file. The data-load-definition.config file contains the load step definitions, for example:

{"steps": [
    {
        "id": "2D670B9F28304A65B1063B42D2961C10",
        "name": "bp_group",
        "identifier": "obrtmd_bp_group",
        "tableName": "obrtmd_bp_group",
        "scriptFileName": "obrtmd_bp_group.sql",
        "className": "",
        "modulePrefix": "OBRTMD",
        "reloadOrder": 20,
        "materializedView": false,
        "alwaysFullyReload": true,
        "executeOnce": false
    },
    {
        "id": "AE40DC98475D408A8761936703B543E5",
        "name": "channel",
        "identifier": "obrtmd_channel",
        "tableName": "obrtmd_channel",
        "scriptFileName": "obrtmd_channel.sql",
        "className": "",
        "modulePrefix": "OBRTMD",
        "reloadOrder": 20,
        "materializedView": false,
        "alwaysFullyReload": true,
        "executeOnce": false
    }
]}
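As a small sketch of working with these definitions, the following reads the step entries and lists each identifier with its reload order, e.g. to look up the value needed for the -Dstep property shown later. The JSON here is a trimmed copy of two step definitions; in practice you would read the data-load-definition.config file itself.

```python
import json

# trimmed copy of two step definitions; in practice, read the
# data-load-definition.config file instead of an inline string
config = json.loads("""
{"steps": [
  {"identifier": "obrtmd_bp_group", "reloadOrder": 20},
  {"identifier": "obrtmd_channel", "reloadOrder": 20}
]}
""")

# list (reloadOrder, identifier) pairs, sorted by reload order
rows = [(s["reloadOrder"], s["identifier"])
        for s in sorted(config["steps"], key=lambda s: s["reloadOrder"])]
for order, identifier in rows:
    print(order, identifier)
```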

Running the Data Loader Application

To run the data loader application execute the following ant command from within the data loader application folder:

ant

Typically the data loader application runs through crontab. For this usage the data loader application provides the update_script file which can be included in the crontab. For more details check out the wiki page on the server configuration.
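A crontab entry for this could look like the following. This is only an illustration: the install path and log location are assumptions, not Openbravo defaults.

```shell
# hypothetical crontab entry: run the full load every night at 02:00
# (install path and log file are assumptions, adapt to your setup)
0 2 * * * /opt/dataloader/update_script >> /var/log/dataloader.log 2>&1
```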

Running Individual Load Steps

It is also possible to run individual load steps. For this you need to know the identifier of the load step. You can find this in the data-load-definition.config file; the JSON field is called identifier. For example, to load/run only the bp_group script, execute:

ant -Dstep=obrtmd_bp_group


This page was last modified on 11 December 2019, at 09:57. Content is available under Creative Commons Attribution-ShareAlike 2.5 Spain License.