Openbravo POS Integration/zh cn
In an organization, more than one software application often exists to support a particular aspect of operational requirements. The result is a heterogeneous set of software applications requiring data sharing and integration. An ERP system like Openbravo ERP tries to solve this problem by offering one solution for every operational requirement. Openbravo ERP maintains one shared database for all data, integrated processes among different departments, a consistent interface for every user, and homogeneous reports/scorecards that display operational data of the whole organization.
However, there are specific corners of an organization that an ERP system cannot cover; for example, POS systems like Openbravo POS, where special user types and hardware device support are required.
The POS system user is a salesman who uses an ERP in a different manner. The interface has to be very easy to use and provide only the specific information the salesman needs. The salesman needs to operate the POS as fast as possible, as his job is to sell, not to operate the POS. For example, he does not want to deal with a mouse and a keyboard; rather, he prefers a touch screen. In addition, the POS system needs to support the wide range of POS hardware used in real deployments: receipt printers, barcode scanners, customer displays, cash drawers, scales, etc.
The focus of this integration is to create a system where Openbravo ERP is the central repository of the data: products, customers, taxes, orders, etc. Openbravo POS has the ability to operate with the master data downloaded from Openbravo ERP and to upload the orders created by its sales activity.
The Openbravo ERP REST web services operate on Openbravo ERP business objects. In Openbravo ERP, a business object can be as simple as a single entity (database table), such as Currency, with simple fields. Business objects can also be as complex as a collection of entities, for example an order header together with its order lines.
The Openbravo ERP REST web services provide the following functionality:
- Retrieval of a single business object or a list of business objects using a standard HTTP GET request.
- Update of one or more existing business objects by posting an XML document using HTTP POST/PUT operations.
- Delete operation using either a URL pointing to the specific business object to remove, or an XML document containing the business objects (a complete or partial XML document) which need to be removed.
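As a rough sketch of how these operations map onto URLs, the helper below composes retrieval URLs. The base path (/ws/dal) and entity names are assumptions drawn from typical Openbravo ERP 2.50 installations, not from this module; adjust them to your server.

```python
# Sketch of how REST retrieval URLs are typically composed in Openbravo ERP.
# BASE and the entity/id values are illustrative assumptions.
BASE = "http://localhost:8080/openbravo/ws/dal"

def get_url(entity, object_id=None):
    """URL for retrieving a list of business objects, or a single one by id."""
    if object_id is None:
        return "%s/%s" % (BASE, entity)
    return "%s/%s/%s" % (BASE, entity, object_id)

print(get_url("Product"))             # all products visible to the current user
print(get_url("Product", "1000001"))  # a single product by its internal id
```

A GET on the first URL returns the list of products the logged-in user is allowed to see; a GET on the second returns one business object.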
The Openbravo ERP REST web services apply the same security and authorization mechanisms as the standard application. Before invoking any web service, the caller must log in and have the appropriate privileges. The Openbravo ERP REST framework provides the login functionality. All REST operations are executed in the context of the current user/organization and are limited by that user's authorizations.
In this integration scenario, all product, warehouse, product category, tax and customer information is maintained in Openbravo ERP. This requires a reliable synchronization mechanism between Openbravo POS and Openbravo ERP.
Every order created in Openbravo POS must be uploaded to Openbravo ERP for further processing.
Note: Modularity is only available in Openbravo ERP 2.50 and later.
An Extension Module is a piece of additional functionality that can be optionally and independently deployed on top of Openbravo ERP.
The experience of the user deploying modules is similar to that of Firefox plugins: you are able to browse a catalog of modules, install (and uninstall), deploy and upgrade them directly from within the Openbravo ERP Administration UI.
A developer of an Extension Module can package and deliver a module independently from the rest of the Openbravo ERP content, meaning that it is possible to package modules with a delivery mechanism that only includes the files and metadata they are responsible for.
The functionality of the REST web services is very useful for standard integration scenarios and, in our case, allows us to get the desired business objects from Openbravo ERP and to update existing business objects.
The idea is to install a module (independent from the other Openbravo ERP modules) which is responsible for deploying the REST web service in charge of the synchronization.
- Log in to Openbravo ERP as system administrator.
- Go to General Setup > Application > Module Management.
- Open the Add Modules tab.
Select the POS Synchronization WebService module from the list and click the Install Now button.
Download the latest package of the module from Sourceforge.net: OpenbravoPOS: Files. A description of all the downloadable files is available here: Openbravo_POS_2.30_Release_notes#How_to_get_Openbravo_POS.
Click the Browse File System button and navigate to the org.openbravo.service.pos.obx module file. The module file can be downloaded from the Openbravo POS project page.
- In the next window, more details about the module are available. To show them, click the View Details... link. To continue the process, press the Continue button.
- The next step is to accept the license (you must accept it to proceed). Check the I accept all license agreements box and press the Continue button.
- Then you are redirected to the Installed Modules tab, where you can see all the installed modules and their current status. After installing a module, the application must be rebuilt to complete the installation, so click the rebuild now link with the yellow background, or the Changes pending, rebuild now link next to the POS Synchronization WebService module.
- In the next window, press the Yes button to confirm that you want to rebuild the application.
- The rebuild will start automatically, showing the full log in the window. This process can take some minutes depending on your computer, so please be patient.
- Finally, when the process finishes, it reports the result. Select the Reload the Openbravo application now radio button and press the Continue button. Wait for the system to restart.
To check if the module is installed correctly, point your browser to:
Note that this example assumes that Openbravo ERP runs locally on port 8080; it may be necessary to replace the localhost:8080 part with your own server name/port.
The result should be the list of all Business Objects (entityName) inside Openbravo ERP.
If you get a message similar to <error><message>No registration for name org.openbravo.service.pos.syncWs</message></error>, try restarting Tomcat.
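The error response can also be checked programmatically. A minimal sketch, assuming only the response shape (<error><message>...</message></error>) shown above:

```python
import xml.etree.ElementTree as ET

# Sample response, matching the error shape shown above.
response = ("<error><message>No registration for name "
            "org.openbravo.service.pos.syncWs</message></error>")

root = ET.fromstring(response)
if root.tag == "error":
    message = root.findtext("message")
    print("Web service error:", message)
```

In a real check, `response` would be the body returned by the web service URL instead of a hard-coded string.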
In Openbravo ERP, edit the external point of sale to define the products that will be available for every point of sale. With this, when a point of sale tries to get the product catalog from Openbravo ERP, the products defined in this window for that point of sale are sent. The options defined here are also used when processing imported orders.
To edit the external point of sale, you have to change your role to the Openbravo ERP entity administrator of the entity you are working with, and open the menu option Sales Management > Setup > External Point of Sales.
In this window you define your points of sale and include/exclude the products and product categories associated with each point of sale.
You can select the set of products to synchronize by category or by product. To select products by category, choose All Selected in the Included Product Categories selector and add the desired category records. To select products individually, choose All Selected in the Included Products selector and add the desired product records.
If you point the browser to this URL a welcome page will appear showing general information about the usage of the Web Service.
- Retrieves information about the Openbravo ERP business objects using the DAL.
- Transforms the Openbravo ERP business objects into Openbravo POS business objects.
- Renders the content of the Openbravo POS business objects as XML.
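To illustrate the last two steps, here is a minimal sketch of turning an ERP business object into POS-style XML. The field names (id, name, pricesell) and the dict representation are assumptions for illustration, not the module's real mapping:

```python
import xml.etree.ElementTree as ET

def erp_to_pos_product(erp_product):
    """Map an ERP business object (here a plain dict) to a POS-style XML string."""
    product = ET.Element("product")
    ET.SubElement(product, "id").text = erp_product["id"]
    ET.SubElement(product, "name").text = erp_product["name"]
    ET.SubElement(product, "pricesell").text = str(erp_product["price"])
    return ET.tostring(product, encoding="unicode")

pos_xml = erp_to_pos_product({"id": "1000001", "name": "Hat", "price": 12.5})
print(pos_xml)
```

The real web service performs the equivalent mapping on the server side and returns the XML to the synchronization process.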
- erp.id: The internal Openbravo ERP client identifier used by the external POS.
- erp.org: The internal Openbravo ERP organization identifier used by the external POS.
- erp.pos: The identifier (search key) of the external POS defined in Openbravo ERP.
- erp.user: The Openbravo ERP user used to invoke the integration functionality.
- erp.password: The password of the Openbravo ERP user.
Important: The internal client and organization identifiers determine the accessibility of the business objects. Only the business objects matching the defined client and organization identifiers will be synchronized.
|Inventory||STOCKCURRENT, STOCKDIARY, STOCKLEVEL||m_storage_detail|
- The product must belong to the most recent (highest validfrom date) Price List Version within the Price List chosen in the External Point of Sale.
Openbravo ERP                Openbravo POS
|                            |
|--- Warehouse               |--- Warehouse
     |                            |
     |--- Storage Bin             |--- Products
          |
          |--- Products
When the stock of a product is synchronized, the stock available in the storage bins of the selected warehouse is summed.
Main Warehouse
|
|--- First Storage Bin
|    |
|    |--- Hat = 50 Units
|
|--- Second Storage Bin
     |
     |--- Hat = 150 Units
In Openbravo POS we will have 200 units of Hat in STOCKCURRENT table.
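The aggregation can be sketched as follows, using the bin names and quantities from the example above (the tuple layout is just an illustration of the m_storage_detail data):

```python
# Stock per (storage bin, product) in the selected warehouse,
# mirroring the Main Warehouse example above.
storage_detail = [
    ("First Storage Bin", "Hat", 50),
    ("Second Storage Bin", "Hat", 150),
]

def stock_current(product):
    """Sum the stock of a product over all storage bins of the warehouse."""
    return sum(qty for _, prod, qty in storage_detail if prod == product)

print(stock_current("Hat"))  # 200
```

The result, 200 units, is the value written to the STOCKCURRENT table in Openbravo POS.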
Pentaho Data Integration is a free, open source (LGPL) ETL (Extraction, Transformation and Loading) tool provided by Pentaho. It delivers powerful ETL capabilities using an innovative, metadata-driven approach, and allows you to create web services, get data directly from a database, and store data directly in a database.
The Pentaho Data Integration suite consists of the following applications:
- Spoon: by far the most used component of the Pentaho Data Integration suite. It is a graphical tool built on the Eclipse platform that allows for the visual configuration of ETL jobs and transformations.
- Carte: is a simple web server that allows you to remotely access the Pentaho Data Integration repository, and execute, monitor, start and stop jobs and transformations that run on the server.
- Pan: allows you to execute transformations designed in Spoon, thus allowing your ETL jobs to be run from the command line.
- Kitchen: is a program that can execute jobs designed by Spoon, allowing them to be scheduled and run in batch mode.
Pentaho Data Integration is a very flexible and friendly tool which provides a very intuitive graphical application (Spoon) to create and modify transformations. It is very easy for end users to use and does not require any special programming skills. Furthermore, there is an active community behind the project, which means forum support, complete documentation, and so on.
The current architecture reduces the effort required to modify existing functionality and minimizes the impact of implementing new functionality. With the old synchronization, each time new data needed to be synchronized a developer had to add a suitable piece of code on both sides, in the POS and in the ERP. The REST web services are very useful because they allow information to be retrieved from any business object of Openbravo ERP with a simple HTTP request. So, if more data is required, the modifications only affect the Pentaho Data Integration environment.
Other applications can integrate with Openbravo POS through Pentaho Data Integration in a simple way: by creating a new transformation or job. They do not need to modify Openbravo POS code; Pentaho Data Integration provides enough mechanisms to achieve a successful synchronization between an external application and Openbravo POS.
Pentaho Data Integration makes it possible to schedule transformations or jobs to run on a remote server, or to force the synchronization after a notable event happens on the Openbravo POS side, for example when the cash is closed. This point is important because no specific user interaction is needed each time the integration runs; just prepare the schedule at a suitable time and run it once.
The newest stable version of Pentaho Data Integration is 3.1.0-826 so the file you have to download is pdi-open-3.1.0-826.zip. To install it simply unpack the zip file into a folder of your choice.
Pentaho Data Integration requires the Sun Java Runtime Environment (JRE) version 1.5 or newer.
All JDBC drivers provided in Pentaho Data Integration are located in pdi-open-3.1.0-826/libext/JDBC.
It is mandatory to use a JDBC driver version in Pentaho Data Integration that is compatible with the version of the JDBC driver used in Openbravo POS.
Suggestion: copy the JDBC driver of the database used in Openbravo POS into the pdi-open-3.1.0-826/libext/JDBC folder of PDI (Kettle), replacing the existing one.
In the particular case of Pentaho Data Integration 3.1.0-826 and Openbravo POS 2.30, the JDBC drivers provided by Pentaho Data Integration are:
- derby.jar (10.2.2.0)
- hsqldb.jar (1.8.0)
- ojdbc14.jar (10.2.0.2.0)
The Derby JDBC driver provided in Pentaho Data Integration (pdi-open-3.1.0-826/libext/JDBC/derby.jar) is version 10.2.2.0 and is not compatible with the Derby JDBC driver version (10.4.2.0) used in Openbravo POS.
So you need to replace the derby.jar (10.2.2.0) located at pdi-open-3.1.0-826/libext/JDBC/derby.jar with the newest derby.jar.
You can check the version of a .jar by unpacking it and looking at its /META-INF/MANIFEST.MF file.
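Since a .jar file is just a ZIP archive, this check can be scripted. A sketch follows; the manifest attribute names (Bundle-Version, Implementation-Version) are common conventions, not guaranteed for every jar, and a dummy jar is built in memory purely for illustration:

```python
import io
import zipfile

def jar_version(jar_bytes):
    """Read a version attribute from META-INF/MANIFEST.MF inside a jar."""
    with zipfile.ZipFile(io.BytesIO(jar_bytes)) as jar:
        manifest = jar.read("META-INF/MANIFEST.MF").decode("utf-8")
    for line in manifest.splitlines():
        if line.startswith(("Bundle-Version:", "Implementation-Version:")):
            return line.split(":", 1)[1].strip()
    return None

# Build a dummy jar with a manifest to demonstrate the check.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as jar:
    jar.writestr("META-INF/MANIFEST.MF",
                 "Manifest-Version: 1.0\nBundle-Version: 10.4.2.0\n")

print(jar_version(buf.getvalue()))  # 10.4.2.0
```

For a real driver you would read the bytes of derby.jar from disk instead of the in-memory archive.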
The transformation and job files are the Pentaho Data Integration processes that contain all the logic and steps of the data synchronization between Openbravo ERP and Openbravo POS.
Download the latest version from Sourceforge.net: OpenbravoPOS: Files. A description of all the downloadable files is available here: Openbravo_POS_2.30_Release_notes#How_to_get_Openbravo_POS.
Spoon is the integration tool we are going to use to run transformations and jobs. Spoon is also the graphical tool with which you design and test every Pentaho Data Integration process. The other Pentaho Data Integration components execute the processes designed with Spoon.
To start Spoon in Windows, execute the file spoon.bat by double-clicking on it or launching it from the command line. You can also create a shortcut to this file and place it in your preferred location: the desktop, the launch bar...
To start Spoon in Linux or other Unix-like operating systems, you will need to make the shell scripts executable by using the chmod command:
cd pdi-open-3.1.0-826
chmod +x spoon.sh
sh spoon.sh
As soon as Spoon starts, a dialog window appears asking for the repository connection data. For our purposes, we choose to use PDI without specifying a repository, so press the No repository button. You can prevent this window from appearing again by unchecking the Present this dialog at startup checkbox.
The next thing you'll see is a Spoon tips window. You can disable it by unchecking the Show tips at startup? checkbox.
Then, the Main page screen of Spoon will appear.
The top portion of the screen contains the menu options that allow you to perform a whole host of customisations and operations of the application. The left hand side panel of the screen contains two different views:
- View: describes all the elements (steps, hops, jobs, database connections...) in use.
- Design: the operations that can be used on an everyday basis within the Pentaho Data Integration.
In the main area, transformations and jobs are opened in tabs. Each file has a menu which allows you to execute useful operations such as run, debug, or explore database.
If you are interested in investigating this tool in depth, take a look at the complete User Guide.
Pan is a command line script that can execute transformations designed in Spoon, stored either in XML or in a database repository. Usually transformations are scheduled in batch mode to be run automatically at regular intervals.
To run a transformation from a file on Windows, execute:
Pan.bat /file:"folder_path/transformation.ktr" /level:Basic
On Linux and other Unix-like operating systems, execute:
cd pdi-open-3.1.0-826
chmod +x pan.sh
sh pan.sh -file="folder_path/transformation.ktr" -level=Minimal
For more information read Pan User Guide.
Kitchen is a command line script that can execute jobs designed in Spoon, stored either in XML or in a database repository. Usually jobs are scheduled in batch mode to be run automatically at regular intervals.
To run a job from a file on Windows, execute:
Kitchen.bat /file:"folder_path/job.kjb" /level:Basic
On Linux and other Unix-like operating systems, execute:
cd pdi-open-3.1.0-826
chmod +x kitchen.sh
sh kitchen.sh -file="folder_path/job.kjb" -level=Minimal
For more information read Kitchen User Guide.
C:\Documents and Settings\<username>\.kettle\kettle.properties
# Database connection
db.URL = jdbc:derby:/home/openbravo/openbravopos-database;create=true
db.driver = org.apache.derby.jdbc.ClientDriver
db.user = mikel
db.password = mikel
# Openbravo ERP
erp.URL = http://localhost:8080/openbravo/ws/org.openbravo.service.pos.syncWs
erp.id = 1000000
erp.org = 1000000
erp.pos = 1234
erp.user = Openbravo
erp.password = openbravo
- db.URL: The JDBC connection string used to locate the driver and the database.
- db.driver: The name of the Java class that implements the JDBC driver. This name is defined by the database engine vendor.
- db.user: The name of the authorized database user.
- db.password: The password of the database user.
- erp.URL: The base URL location where the Openbravo ERP Webservice endpoint is installed.
- erp.user: The Openbravo ERP user used to invoke the integration functionality.
- erp.password: The password of the Openbravo ERP user.
- erp.id: The Openbravo ERP internal client identifier of the external point of sale.
- erp.org: The Openbravo ERP internal organization identifier of the external point of sale.
- erp.pos: The search key of the external point of sale defined in Openbravo ERP used to identify the Openbravo POS system inside Openbravo ERP.
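kettle.properties uses plain key = value lines, so reading it back can be sketched in a few lines. This is a simplified parser for illustration (real Java properties files support escapes and continuations that it ignores); the sample keys come from the example file above:

```python
def parse_properties(text):
    """Parse simple 'key = value' lines, skipping comments and blank lines."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sample = """
# Openbravo ERP
erp.pos = 1234
erp.user = Openbravo
"""
props = parse_properties(sample)
print(props["erp.pos"])  # 1234
```

Pentaho Data Integration itself reads these keys as environment variables available to every transformation and job.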
Remember that the Openbravo REST web services use the same access/authorizations as the standard Openbravo application. All REST actions are then executed in the context of a client/organization and current role of the user.
After setting all the parameters, you can check whether Spoon can access and read them and whether the database connection is properly configured.
To check the variables, open one of the transformations in Spoon. In the top menu go to Edit > Show Environment Variables (Ctrl+L), and the variables and their values should appear in the dialog. If the variables do not show up, ensure that KETTLE_HOME is included in your environment variables, pointing to the directory that contains the .kettle folder.
For example: KETTLE_HOME = C:\Documents and Settings\<username>
To check the database connection, open one of the transformations in Spoon. In the top menu go to Edit > Explore DB and select the openbravoposDB database to explore. If the connection is configured properly, a database explorer will appear showing the database tables.
Open the RUN SYNCHRONIZATION.kjb job in Spoon and run it by clicking the Run button in the main menu or toolbar, or by pressing F9.
A window will appear; just click the Launch button.
The synchronization process will start automatically; you can see a brief status description of each transformation or job in the Job metrics tab.
For a detailed log about the process go to Logging tab.
Open the ORDERS.ktr transformation in Spoon and run it by clicking the Run button in the main menu or toolbar, or by pressing F9.
A window will appear; just click the Launch button.
After the orders have been synchronized, to process the imported orders you have to change your role to the Openbravo ERP entity administrator of the entity you are working with, go to the menu option Master Data Management > Import Data > Import Orders, and execute the Import Orders process. When the process ends, a dialog with the result is shown to the user.
Carte is a simple web server that allows you to execute transformations and jobs remotely. It does so by accepting XML (using a small servlet) that contains the transformation to execute and the execution configuration. It also allows you to remotely monitor, start and stop the transformations and jobs that run on the Carte server. A server that is running Carte is called a Slave Server in the Pentaho Data Integration terminology.
It is desirable to provide automatic synchronization between Openbravo ERP and Openbravo POS without user interaction. Each job in Pentaho Data Integration can be scheduled to run on a remote slave server. See the Carte User Documentation.
Carte accepts 2 command line options:
- The IP address or host name to run on. If you have a machine with multiple network cards you can choose here which interface to run on.
- The HTTP port to listen to (defaults to port 80).
To launch Carte in Windows, execute the Carte.bat file from the command line specifying the <ip> and <port>:
Carte.bat <ip> <port>
To launch Carte in Linux or other Unix-like operating systems, you will need to make the shell scripts executable by using the chmod command:
cd pdi-open-3.1.0-826
chmod +x carte.sh
And run specifying the <ip> and <port>:
sh carte.sh <ip> <port>
Point your browser to http://<ip>:<port>/ and an authentication dialog will appear.
The default user and password to use to gain control is cluster.
You can change either of these in the file pwd/kettle.pwd, in plain text.
From version 3.1 on you can also put this password file in $HOME/.kettle/ or $KETTLE_HOME/.kettle/
It is possible to obfuscate the password in the kettle.pwd file. There is a tool called Encr in the distribution that allows you to generate passwords that are obfuscated.
To obfuscate the password in Windows execute:
Encr.bat -carte test_password
OBF:1vv31vn61xtv1zlo1unf1y0s1ri71y0y1uoj1zlu1xtn1vnw1vu7
To obfuscate the password in Linux and other Unix-like operating systems execute:
sh encr.sh -carte test_password
OBF:1vv31vn61xtv1zlo1unf1y0s1ri71y0y1uoj1zlu1xtn1vnw1vu7
The string "OBF:1vv31vn61xtv1zlo1unf1y0s1ri71y0y1uoj1zlu1xtn1vnw1vu7" can then be copied into the kettle.pwd file instead of the clear-text password.
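The OBF format is reversible obfuscation, not encryption, so it only hides the password from casual inspection. The sketch below models the Jetty-style algorithm that this format appears to use (an assumption about the format, not a reimplementation of the Encr tool): each byte is combined with its mirror byte and emitted as four base-36 digits.

```python
# Assumed Jetty-style OBF obfuscation: reversible, NOT encryption.
def to_base36(n):
    digits = "0123456789abcdefghijklmnopqrstuvwxyz"
    s = ""
    while n:
        n, r = divmod(n, 36)
        s = digits[r] + s
    return s or "0"

def obfuscate(password):
    data = password.encode("utf-8")
    out = "OBF:"
    for i, b1 in enumerate(data):
        b2 = data[len(data) - (i + 1)]          # mirror byte from the other end
        i0 = (127 + b1 + b2) * 256 + (127 + b1 - b2)
        out += to_base36(i0).rjust(4, "0")      # 4 base-36 digits per byte
    return out

def deobfuscate(obf):
    s = obf[len("OBF:"):]
    raw = bytearray()
    for i in range(0, len(s), 4):
        i1, i2 = divmod(int(s[i:i + 4], 36), 256)
        raw.append((i1 + i2 - 254) // 2)        # invert the sum/difference pair
    return raw.decode("utf-8")

print(deobfuscate(obfuscate("test_password")))  # test_password
```

Because the transformation is reversible, anyone with read access to kettle.pwd can recover the password, so file permissions still matter.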
It is possible to make Carte use a Java Authentication and Authorization Service (JAAS). To do this, define an environment variable called "loginmodulename" as well as the "java.security.auth.login.config" property. Carte will pick these up to use these authentication settings.
Open in Spoon the job you want to schedule and run on a remote server. In the left panel, select the View tree control and expand all nodes. Double-click the Slave server node, or right-click 'Slave Server' and select the New option. Fill in the parameters of the configuration dialog according to the parameters defined for Carte.
- Service tab
- Server name: the friendly name of the server.
- Hostname or IP address: the address of the machine.
- Port: defines the port you wish to use for communicating with the remote server.
- Username: username credential for accessing the remote server.
- Password: password credential for accessing the remote server.
- Is the master: This setting tells Pentaho Data Integration that this server will act as the master server in any clustered executions of the transformation.
- Proxy tab
- Proxy server hostname: sets the hostname for the Proxy server you are connecting through.
- The proxy server port: sets the port number used in communication with the proxy.
- Ignore proxy for hosts: Specify the server(s) for which the proxy should not be active. This option supports specifying multiple servers using regular expressions. You can also add multiple servers and expressions separated by the ' | ' character.
- Configure scheduling
- Double-click the START step of the job and configure the desired schedule for running the job.
Open the job you want to execute remotely in Spoon and run it by clicking on the run button from the main menu, toolbar or by pressing F9.
Select the Execute remotely radio button and, in the list below, the desired remote host. Press the Launch button to start.
Point your browser to http://<ip>:<port>/ and log in. Click the Show status link and you will access the Carte server administration window (Status page). Here you can see all the transformations and jobs available and their current status:
- Waiting: loaded on the Carte server and ready to be launched.
- Running: running on the server.
- Stopped: stopped.
You can open each transformation or job to manage it and see more details.
com.openbravo.basic.BasicException: Cannot connect to database. Database not available. org.apache.derby.impl.jdbc.EmbedSQLException: Failed to start database '$HOME/openbravopos-database'