Retail:Developers Guide/How-to/How to Install, Setup and Run Retail Automated Tests


Introduction

The Retail Automated Tests are a set of JUnit test cases that use Selenium to execute actions on top of an Openbravo POS session, verifying all areas of the application. These tests are executed as part of the Retail Continuous Integration process.

Versions used in this guide: Java 7, Tomcat 7, PostgreSQL 9.3 and Oracle 11.

Quick start

If you want to start as soon as possible, follow these steps:

1. Create a new directory in your computer

2. Navigate to that directory

3. Download Eclipse (https://eclipse.org/downloads/) and copy it into that directory

4. Download Tomcat (http://tomcat.apache.org/) and copy it into that directory

5. Clone pi-mobile

 hg clone https://code.openbravo.com/tools/automation/pi-mobile

6. Clone the ERP

 hg clone https://code.openbravo.com/erp/devel/pi openbravo

7. Navigate to the openbravo/modules directory and clone the retail repositories (from the ret-test-* repository list that can be found below):

 cd openbravo/modules
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.mobile.core
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.config
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.discounts
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.pack
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.poshwmanager
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.posterminal
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.returns
 hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.sampledata

8. Execute ant setup and configure the ERP

 cd openbravo
 ant setup

9. Execute setup.py to configure pi-mobile

 cd pi-mobile
 python3 setup.py ../openbravo

10. Run install.source

 cd openbravo
 ant install.source

11. Start the Eclipse that you downloaded into the directory

12. Start the ERP/Tomcat and verify that you can navigate to a POS terminal

13. Start selenium from the command line. Open a terminal and execute:

 cd pi-mobile/seleniumTools
 sh standalone.sh

14. Run a test

Repositories and continuous integration jobs

Currently there are two test repositories:

The pi-mobile repository is used as the main repository for the tests and drives the main jobs, the ones included in the continuous integration process for Retail. The same jobs can be found in the try server.

The pi-mobile-sandbox repository, on the other hand, is used for the ret-sandbox-* jobs. The sandbox repository works as a test bed for modifications to the tests themselves, meaning that it is possible to push changes to the tests without affecting the continuous integration cycle. For developers, pi-mobile-sandbox also works as a time-saving strategy: sandbox jobs can be executed against changes to retail modules before the code passes continuous integration.

Note:   ret-test-* and ret-modules-* will not reflect code changes until previous integration steps have finished successfully.

Repositories used in the continuous integration servers

ret-test-*

ret-modules-*

Planned:

About the sampledata used:

The Retail Sampledata repository differs between the test jobs and the modules jobs. The ret-*-test-* jobs use the standard retail sampledata repository (https://code.openbravo.com/erp/pmods/org.openbravo.retail.sampledata), while the ret-*-modules-* jobs use a different branch of the repository (https://code.openbravo.com/erp/pmods-branches/org.openbravo.test.mobile.sampledata) together with (https://code.openbravo.com/erp/pmods/org.openbravo.retail.testsampledata).


Note:   org.openbravo.retail.sampledata and org.openbravo.test.mobile.sampledata are branches of the same module (they have the same AD_MODULE_ID), so they MUST NOT be used together.

Setting Up the Automated Test Environment

Assuming we will be working with the pi-mobile repository to execute the tests, the following steps must be followed:

Clone the http://code.openbravo.com/tools/automation/pi-mobile/ repository into your projects directory. Do not place it inside an openbravo directory: the automation makes use of a running ERP, not its sources.

From this point there are two ways of configuring the automation context. The script does not work for Oracle yet, so in that case you should execute it and then manually fix the database references.

1a. The scripted way:

- Navigate to the root of the local pi-mobile

- Execute the setup.py file pointing to the openbravo directory of the context that will be tested

e.g.:

 [openbravo@ManjaroPC pi-mobile]$ python3 setup.py /home/openbravo/clones/tip/openbravo
 
 *** tool to configure Retail automation context v0.3 ***
 
 copying .classpath.template » .classpath
 copying config/OpenbravoERPTest.properties.template » config/OpenbravoERPTest.properties
 copying config/log4j.properties.template » config/log4j.properties
 database = obtip
 new tomcat log path: /tmp/tip/tomcat.log
 last results path: /tmp/tip/
 
 *** end ***

1b. The Manual way:

- Rename/copy the .classpath.template file to .classpath

- Rename/copy config/OpenbravoERPTest.properties.template to config/OpenbravoERPTest.properties

- Rename/copy config/log4j.properties.template to config/log4j.properties

- Verify that config/OpenbravoERPTest.properties has the correct database credentials (bbdd.sid, ...). These credentials must match the values of the openbravo server database.
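
As an illustration, assuming the PostgreSQL setup used in this guide, that section of the properties file could look like the fragment below. Apart from bbdd.sid, which the text above mentions, the property names and all values are examples modeled on a typical Openbravo.properties file, not authoritative defaults:

```properties
# Database credentials: must match the openbravo server's database configuration
# (property names other than bbdd.sid and all values are illustrative)
bbdd.sid=openbravo
bbdd.user=tad
bbdd.password=tad
```

Compare each entry against the Openbravo.properties file of the server being tested rather than copying these values verbatim.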

- In order to run the tests on a Java 7 machine:

Open the .classpath file and find the following line:

 <classpathentry exported="true" kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7"/>

Replace the line with:

 <classpathentry exported="true" kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>

2. Add auto-discovery of the Hamcrest imports. This will add content-assist entries when writing assertThat code.

- Add these favorite imports in Window » Preferences » Java » Editor » Content Assist » Favorites

Running Selenium

There are several ways of starting a Selenium server. It can be run locally (on the same machine as the Openbravo server) or on a different computer (remotely).

Local execution of Selenium

Selenium can be run locally in two different ways. The easiest way is to change to the seleniumTools/ directory and run one of the standalone scripts. If the server is running but an exception is thrown every time a test is run, check if the exception mentions a missing chromedriver. If that’s the case, open the lib/test/ folder, copy the name of one of the chromedrivers there, edit the script used to start selenium and set the correct webdriver.chrome.driver parameter.
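
For instance, assuming the standalone script launches the server with a plain java -jar command (the actual script contents and file names vary between revisions; the names below are hypothetical), the corrected launch line would look something like:

```shell
# point webdriver.chrome.driver at a chromedriver binary that actually
# exists in lib/test/ (file names here are examples, not the real ones)
java -Dwebdriver.chrome.driver=../lib/test/chromedriver \
     -jar ../lib/test/selenium-server-standalone.jar
```

Check the file names inside lib/test/ and the existing script before editing; only the -Dwebdriver.chrome.driver value normally needs to change.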

A second option to start Selenium locally is to run ant selenium.start from the test repo root. The chromedriver and seleniumdriver versions used will be taken from the OpenbravoERPTest.properties file.

Remote execution of Selenium

Note:   We need two test repos for this execution type: one on the openbravo server (for the test files) and one on the remote computer (for the Selenium files).

In order to execute Selenium in a different computer, we need to use the hub.sh and nodes.sh scripts.

On the remote computer (the one holding Selenium), open the nodes.json configuration file and change the host and hubHost values to 127.0.0.1.

Launch hub.sh

Launch nodes.sh

Write down the IP of the selenium-holding computer to configure the test repo.

On the local computer (the one with the openbravo server), open the OpenbravoERPTest.properties file and change the selenium.server property to the IP of the remote computer.

Open the ConfigurationProperties.java file, find the following line:

 return openbravoBaseURLTemp.replace(localhostReferenceToChange, ip.getHostAddress());

And change it to

 return openbravoBaseURLTemp.replace(localhostReferenceToChange, "local_ip_as_seen_on_the_lan");

local_ip_as_seen_on_the_lan is the IP of the computer with the openbravo server.
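
To summarize the remote setup with hypothetical addresses (say the Selenium machine is 192.168.1.50 and the openbravo server is 192.168.1.10), the properties change would end up as:

```properties
# config/OpenbravoERPTest.properties on the openbravo server machine
# (the address below is an example)
selenium.server=192.168.1.50
```

and the hard-coded string in ConfigurationProperties.java would be "192.168.1.10". Both addresses here are illustrative; use the real LAN addresses of your two machines.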

Now when a test is executed the browser will run on the remote machine. In order to run a test, simply right click on it and select Debug As > JUnit Test.

After finishing the tests, remember to terminate the Selenium server (either Ctrl+C the process, or run ant selenium.stop if ant was used to start the server).

Local Offline Execution

Note:   This option is included starting from 3.0PR16Q3

First of all, you need to install and configure different components. You will need to install Apache:

 sudo apt-get install apache2 libapache2-mod-jk

Then you have to edit /etc/apache2/conf-available/openbravo-jk-mount.conf adding this content:

 JkMount /* ajp13_worker
 jkMountCopy all

This will redirect all requests from Apache to Tomcat. To enable this configuration:

 sudo a2enconf openbravo-jk-mount.conf

Finally, to be able to start/stop Apache from the tests, you need to execute these commands without typing the root password. To make this possible, edit the sudoers configuration to grant the access. It can be done as follows:

 sudo su
 echo "openbravo ALL = NOPASSWD: /etc/init.d/apache2" > /etc/sudoers.d/97-ob-apache

Now you can start/stop Apache manually through:

 sudo /etc/init.d/apache2 {start | stop}

After completing all these steps, you only have to edit the OpenbravoERPTest.properties and remove the port from openbravo.url:

 # Openbravo server properties
 openbravo.url=http://localhost/

All the tests using offline features need to be included in the AllowedErrorsHelper class, inside both the getTestsAllowedToHaveJavascriptErrors and getTestsAllowedToHaveTomcatErrors methods. To be able to start & stop Apache from the test, you have to import the class:

 org.openbravo.test.mobile.core.utils.OBOfflineUtils

This class allows you to interact with Apache, making it possible to start it, stop it and check its status. In addition, the test class should extend from:

 org.openbravo.test.mobile.retail.pack.selenium.terminals.WebPOSOfflineTerminalHelper
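
A minimal sketch of how such a test class could be laid out, using the two classes named above. The class name, test method, and the OBOfflineUtils method names used here are hypothetical, so check the actual utility class for the real API:

```java
import org.junit.Test;
import org.openbravo.test.mobile.core.utils.OBOfflineUtils;
import org.openbravo.test.mobile.retail.pack.selenium.terminals.WebPOSOfflineTerminalHelper;

public class MyOfflineFlowTest extends WebPOSOfflineTerminalHelper {
  @Test
  public void testOfflineSale() {
    // stop Apache to simulate losing the connection (hypothetical method name)
    OBOfflineUtils.stopApache();
    // ... actions to be verified while offline go here ...
    // restore the connection before the test finishes
    OBOfflineUtils.startApache();
  }
}
```

Remember that such a test must also be listed in AllowedErrorsHelper as described above, or the expected offline errors will make it fail.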

Test Folder Structure

The tests are divided into two main subgroups. Those tests belonging to the package org.openbravo.test.mobile.retail.pack (“pack” tests) are the tests being run on the ret-*-test-* jobs, while the tests belonging to the org.openbravo.test.mobile.retail.extmodules package are the ones executed on ret-*-modules-* jobs.

Adding new tests to the “pack” tests

In order to add a new test to the pack tests, pick the most fitting area (returns, cashup, sales,...) and create a new java file there. Find a test of the same area and copy&paste it for a quick start. As long as the new test is inside an existing category, there is a mechanism that will automatically add that test to the respective suite when the suite is executed.

Adding a new test to the "modules" tests

The modules tests are organized by module, and each module has its own suite. Unlike the pack tests, there isn't a mechanism to add all the tests of a module to the suite automatically, so every new test must be added to the corresponding suite manually.

Creating a new test

As already explained, the basic flow to create a test is copy&paste an existing similar test, and, using that as a base, add our test actions. This is done to ensure that the correct test class is extended: pack tests extend WebPOSTerminalHelper, while modules tests extend WebPOSExtModulesTerminalHelper (if they are POS tests) or other different terminals (procurement, mobile warehouse).

The TestId.java holds a mapping between the POS elements (buttons, labels, popups, messages, ...) and a unique name to be used within the tests. Each POS element is identified by a unique idtest[*]. This idtest is obtained on the browser console, using the TestRegistry javascript object. If we open the browser console and execute TestRegistry.appendIdTestToDOM() each element currently existing on the POS will be assigned an idtest property (if applicable). This idtest can be used on the TestId java class to identify the element.

[*]Note that due to some TestRegistry limitations, some elements can’t be assigned a unique idtest and therefore cannot be directly used from the tests.

Using a POS element in our test

Firstly we go to the POS, open the browser console and recover the idtest property of the element we want to use.

Then we go to the TestId class and look for our idtest. If it exists, we can already use TestId.OUR_ELEMENT_NAME. If not, we create it.

Always try to create new TestIds near related TestIds. The syntax to create new TestId elements is: OUR_ELEMENT_NAME("idtest", [EnyoKind]).

EnyoKind is a helper class used to identify special elements (buttons, product rows,...). Go to the EnyoKind class to check all the available options.

Actions that can be performed on TestId elements

Currently tap, write, verify and get operations are supported.
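
Putting the previous points together, the body of a test typically chains these operations on TestId entries. A sketch follows; BUTTON_BROWSE and LABEL_TOTALAMOUNT are hypothetical element names used only for illustration, while the helper methods come from the terminal helper base class:

```java
// sketch only: element names are illustrative, not taken from the real TestId class
tap(TestId.BUTTON_BROWSE);
write(TestId.FIELD_SEARCH_TEXT, "backpack");
tap(TestId.BUTTON_SEARCH_EXECUTE);
verify(TestId.LABEL_TOTALAMOUNT, "0.00");
```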

Finally, the TestRegistry can also be used to get the enyoObject of an element on the browser console: TestRegistry.registry('idtest').enyoObject

High volume automated test

Starting from RR15Q4, to support high volumes of products, customers and tickets, Openbravo Commerce allows you to work remotely with master data from the WebPOS client. The high-volume master data is then not loaded into the WebPOS client database, but accessed on demand from a server. The server can be a local in-store server or a server available in the cloud.

To run a test in remote (high volume) mode, it is necessary to add the following annotation to the test's Java class:

 @TestClassAnnotations(isHighVolumeCompatible = true)
 public class HighVolumeTest extends WebPOSTerminalHelper {

When a test is running in high volume mode, the verifyHgvolTime method can be used to check that a specific action takes no more than a given time; the test fails if it takes longer.

For example, search for a product and check that this action takes no more than a given time:

 tap(TestId.BUTTON_SEARCH);
 write(TestId.FIELD_SEARCH_TEXT,  TestId.BUTTON_SEARCHPRODUCT_ADHESIVEBODYWARNMERS.getRowName());
 tap(TestId.BUTTON_SEARCH_EXECUTE);
 verifyHgvolTime(TestId.BUTTON_SEARCH_EXECUTE, timeMilliseconds);

Other important topics to consider

The Constants.java file defines several values used all around the test project, such as timeouts or retry values.

Sometimes a test will fail because a certain element hasn't reached the required state. In these cases, increasing the WAIT_DEFAULT_RETRIES value might work. Other interesting values to tweak can also be found in Constants.java.

Although the tests are prepared to be run with the application in development status (there is a test verifying that) they can be executed with the application in production (no modules in development).

If the tests fail because of a JavaScript error regarding the Synchronization Helper and the dropTables process, it means that the dropTables process is taking more than 8 seconds, which is not necessarily wrong. Open the ob-synchronization.js file (in the org.openbravo.mobile.core module) and find the following lines:

 OB.UTIL.Debug.execute(function () {
   // the timeout is forced active while in development to catch unbalanced calls and/or adjust the timeoutThreshold
   timeoutThreshold = 8000;
   isTimeoutThresholdActivated = true;
 });

and increase the timeoutThreshold value. Increasing this value will reduce the amount of "false failures" and will only slow the application down in case of a real error (which should never happen anyway).

Authentication Managers and the tests

The authentication manager feature is not compatible with the tests. Firstly, it might try to log in with wrong credentials (e.g. a wrong POS organization). Besides, the first step of the tests is to automate the login, so they will fail if the login has already been done.

Retrieved from "http://wiki.openbravo.com/wiki/Retail:Developers_Guide/How-to/How_to_Install,_Setup_and_Run_Retail_Automated_Tests"

This page was last modified on 11 November 2016, at 08:26. Content is available under Creative Commons Attribution-ShareAlike 2.5 Spain License.