
Retail:Developers Guide/How-to/How to Install, Setup and Run Retail Automated Tests OBSOLETE






Bulbgraph.png   This is an OBSOLETE guide, maintained for Enyo POS-specific topics of the Selenium tests.

The updated guide is: http://wiki.openbravo.com/wiki/Retail:Developers_Guide/How-to/How_to_Install,_Setup_and_Run_Retail_Automated_Tests








Manual guide

1. Create a new directory in your computer

2. Navigate to that directory

3. Download Eclipse IDE (https://eclipse.org/downloads/) and copy it into that directory

4. Download Tomcat (http://tomcat.apache.org/) and copy it into that directory

5. Clone pi-mobile

 FIXME: hg clone https://code.openbravo.com/tools/automation/pi-mobile

6. Clone the ERP

 FIXME: hg clone https://code.openbravo.com/erp/devel/pi openbravo

7. Navigate to the openbravo/modules directory and clone the retail repositories (from the ret-test-* repository list that can be found below):

 cd openbravo/modules
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.mobile.core
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.config
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.discounts
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.pack
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.poshwmanager
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.posterminal
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.returns
 FIXME: hg clone https://code.openbravo.com/erp/pmods/org.openbravo.retail.sampledata

8. Execute ant setup and configure the ERP

 cd openbravo
 ant setup

9. Execute setup.py to configure pi-mobile

 cd pi-mobile
 python3 setup.py ../openbravo

10. Run install.source

 cd openbravo
 ant install.source

11. Start the Eclipse IDE that you downloaded into the directory

12. Start the ERP/tomcat. Verify that you can navigate to a terminal

13. Start selenium from the command line. Open a terminal and execute:

 cd pi-mobile/seleniumTools
 sh standalone.sh

14. Run a test

Repositories and continuous integration jobs

Currently there are 2 test repositories:

The pi-mobile repository is the main repository for the tests and is used for the main jobs, the ones included in the continuous integration process for Retail. These jobs include:

The same jobs can be found in the try server:

The pi-mobile-sandbox repository, on the other hand, is used for the ret-sandbox-* jobs. The sandbox repository works as a test bed for modifications to the tests, meaning that it is possible to push changes to the tests without affecting the continuous integration cycle. For developers, pi-mobile-sandbox also works as a time-saving strategy to execute the tests after pushing changes to retail modules; that is, sandbox jobs can be executed before the code passes continuous integration.

Bulbgraph.png   ret-test-* and ret-modules-* will not reflect code changes until previous integration steps have finished successfully.

Repositories used in the continuous integration servers

ret-test-*

ret-modules-*

Planned:

About the sampledata used:

The Retail Sampledata repository used varies between the test jobs and the modules jobs. Ret-*-test-* jobs use the standard retail sampledata repository (https://code.openbravo.com/erp/pmods/org.openbravo.retail.sampledata), while ret-*-modules-* jobs use a different branch of the repo (https://code.openbravo.com/erp/pmods-branches/org.openbravo.test.mobile.sampledata) together with (https://code.openbravo.com/erp/pmods/org.openbravo.retail.testsampledata).


Bulbgraph.png   org.openbravo.retail.sampledata and org.openbravo.test.mobile.sampledata are branches of the same module (they have the same AD_MODULE_ID) so they MUST NOT be used together.

Setting Up the Automated Test Environment

Assuming we will be working with the pi-mobile repository to execute the tests, the following steps must be followed:

Clone this repository into your projects directory: https://gitlab.com/openbravo/ci/mobile-test. Do not place it inside an openbravo directory; the automation makes use of a running ERP, not its sources.

From this point there are 2 ways of configuring the automation context. The script does not yet work with Oracle, so in that case you should execute it and then manually fix the database references.

1a. The scripted way:

- Navigate to the root of the local pi-mobile

- Execute the setup.py file, pointing to the openbravo directory of the context that will be tested

e.g:

 [openbravo@ManjaroPC pi-mobile]$ python3 setup.py /home/openbravo/clones/tip/openbravo
 
 *** tool to configure Retail automation context v0.3 ***
 
 copying .classpath.template » .classpath
 copying config/OpenbravoERPTest.properties.template » config/OpenbravoERPTest.properties
 copying config/log4j.properties.template » config/log4j.properties
 database = obtip
 new tomcat log path: /tmp/tip/tomcat.log
 last results path: /tmp/tip/
 
 *** end ***

1b. The Manual way:

- Rename/copy the .classpath.template file to .classpath

- Rename/copy config/OpenbravoERPTest.properties.template to config/OpenbravoERPTest.properties

- Rename/copy config/log4j.properties.template to config/log4j.properties

- Verify that config/OpenbravoERPTest.properties has the correct database credentials (bbdd.sid, ...). These credentials must match the values of the Openbravo server database.

- In order to run the tests on a Java 7 machine:

Open the .classpath file and find the following line:

 <classpathentry exported="true" kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER/org.eclipse.jdt.internal.debug.ui.launcher.StandardVMType/JavaSE-1.7"/>

Replace the line with:

 <classpathentry exported="true" kind="con" path="org.eclipse.jdt.launching.JRE_CONTAINER"/>

2. Add auto-discovery of the Hamcrest imports. This will add content-assist entries when writing assertThat code.

- Add these favorite imports in Window » Preferences » Java » Editor » Content Assist » Favorites
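
The exact list of favorites is not preserved on this page; as an assumption, the usual entries for assertThat code are the Hamcrest matcher classes, for example:

 org.hamcrest.Matchers.*
 org.hamcrest.CoreMatchers.*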

Running Selenium

There are several ways of starting a Selenium server. It can be used locally (within the same machine as the Openbravo server) or on a different computer (remotely).

Local execution of Selenium

Selenium can be run locally in two different ways. The easiest way is to change to the seleniumTools/ directory and run one of the standalone scripts. If the server is running but an exception is thrown every time a test is run, check whether the exception mentions a missing chromedriver. If that is the case, open the lib/test/ folder, copy the name of one of the chromedrivers there, edit the script used to start Selenium and set the correct webdriver.chrome.driver parameter.

A second option to start Selenium locally is to run ant selenium.start from the test repo root. The chromedriver and seleniumdriver versions used will be taken from the OpenbravoERPTest.properties file.


Bulbgraph.png   In a modules environment it is required to remove the following file

modules/org.openbravo.certification.france/src/org/openbravo/certification/france/ticket/TicketInitialValidation.java

Remote execution of Selenium

Bulbgraph.png   We need 2 test repos for this execution type: one on the Openbravo server (for the test files) and one on the remote computer (for the Selenium files).

In order to execute Selenium on a different computer, we need to use the hub.sh and nodes.sh scripts.

On the remote computer (the one holding Selenium), open the nodes.json configuration file and change the host and hubHost values to 127.0.0.1.

Launch hub.sh

Launch nodes.sh

Write down the IP of the selenium-holding computer to configure the test repo.

On the local computer (the one with the openbravo server), open the OpenbravoERPTest.properties file and change the selenium.server property to the IP of the remote computer.

Open the ConfigurationProperties.java file, find the following line:

 return openbravoBaseURLTemp.replace(localhostReferenceToChange, ip.getHostAddress());

And change it to

 return openbravoBaseURLTemp.replace(localhostReferenceToChange, "local_ip_as_seen_on_the_lan");

local_ip_as_seen_on_the_lan is the IP of the computer with the openbravo server.

Now when a test is executed the browser will run on the remote machine. In order to run a test, simply right click on it and select Debug As > JUnit Test.

After finishing the tests, remember to terminate the Selenium server (either Ctrl+C the process, or run ant selenium.stop if ant was used to start the server).

Local Offline Execution

Setup offline execution

Bulbgraph.png   This option is included starting from 3.0PR16Q3

First of all, you need to install and configure several components. Start by installing Apache:

 sudo apt-get install apache2 libapache2-mod-jk

Then you have to edit or create /etc/apache2/conf-available/openbravo-jk-mount.conf and add this content:

 JkMount /* ajp13_worker
 jkMountCopy all

This will redirect all requests from Apache to Tomcat. To enable this configuration:

 sudo a2enconf openbravo-jk-mount.conf

Finally, to be able to start/stop Apache from the tests, you need to execute these commands without typing the root password. To make this possible, you have to edit the sudoers configuration in order to grant the access. It can be done by using:

 sudo su -
 echo "openbravo ALL = NOPASSWD: /etc/init.d/apache2" > /etc/sudoers.d/97-ob-apache
 #Note: this command works for the openbravo user; if you have a different user, replace it in the previous line.

Now you can start/stop Apache manually through:

 sudo /etc/init.d/apache2 {start | stop}

Note: it should not ask for a password.

Configuration and how to use offline

Note: if you are using the scripts, this part is not needed.

After completing all these steps, you only have to edit the OpenbravoERPTest.properties and remove the port from openbravo.url:

 # Openbravo server properties
 openbravo.url=http://localhost/

All the tests using offline features need to be included in the AllowedErrorsHelper class, both in the getTestsAllowedToHaveJavascriptErrors method and in the getTestsAllowedToHaveTomcatErrors method.

In the tests you can use these methods to go online and offline:

 goOffline();
 goOnline();

Note: these methods are defined in WebPOSTerminalHelper, so they should be available in all terminals.
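
As an illustrative sketch (the class name, test name and the commented-out steps are hypothetical; goOffline(), goOnline(), WebPOSTerminalHelper and the AllowedErrorsHelper registration are the pieces described in this guide), an offline-aware test could look like this:

 import org.junit.Test;
 
 // Remember to also register this class in AllowedErrorsHelper, in both
 // getTestsAllowedToHaveJavascriptErrors and getTestsAllowedToHaveTomcatErrors.
 public class SampleOfflineSaleTest extends WebPOSTerminalHelper {
 
   @Test
   public void saleShouldSurviveGoingOffline() {
     goOffline(); // stops Apache, so the POS loses the server connection (see the setup above)
     // ... perform the offline part of the flow here (taps, verifications) ...
     goOnline(); // restarts Apache and lets the POS synchronize again
   }
 }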

Trick to test quick updates

Prerequisite: the offline execution with Apache must already be configured.

This is useful to move quickly between two workspaces, for example when testing the update from one version to another.

A trick is to create aliases that change the port mod-jk uses to connect Apache to Tomcat:

 alias s1='sudo sed -i -e "s/8109/8009/" /etc/libapache2-mod-jk/workers.properties && sudo /etc/init.d/apache2 restart'
 alias s2='sudo sed -i -e "s/8009/8109/" /etc/libapache2-mod-jk/workers.properties && sudo /etc/init.d/apache2 restart'

With these aliases you can switch between workspaces using:

 s1  # configure Apache to use the first Tomcat
 s2  # configure Apache to use the second Tomcat

The application will always be available at http://localhost/openbravo (no need to specify a port; it will use the default port 80, on which Apache is listening).

Test Folder Structure

The tests are divided into two main subgroups. Those tests belonging to the package org.openbravo.test.mobile.retail.pack (“pack” tests) are the tests being run on the ret-*-test-* jobs, while the tests belonging to the org.openbravo.test.mobile.retail.extmodules package are the ones executed on ret-*-modules-* jobs.

Adding new tests to the “pack” tests

In order to add a new test to the pack tests, pick the most fitting area (returns, cashup, sales,...) and create a new java file there. Find a test of the same area and copy&paste it for a quick start. As long as the new test is inside an existing category, there is a mechanism that will automatically add that test to the respective suite when the suite is executed.
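
For illustration (the class name and the "sales" sub-package below are hypothetical; the base package org.openbravo.test.mobile.retail.pack and the WebPOSTerminalHelper parent class are the ones described in this guide), a new pack test skeleton could look like this:

 // Placed under whichever area fits best; the "sales" sub-package here is hypothetical.
 package org.openbravo.test.mobile.retail.pack.sales;
 
 import org.junit.Test;
 
 public class SampleNewSalesTest extends WebPOSTerminalHelper {
 
   @Test
   public void shouldAddAProductToTheReceipt() {
     // Copy the steps of a similar existing sales test and adapt them here.
   }
 }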

Adding a new test to the "modules" tests

The modules tests are organized by module, and each module has its own suite. Unlike the pack tests, there isn't a mechanism to automatically add all the tests of a module into the suite; every new test must be added to the corresponding suite manually.
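
As a sketch, assuming the module suites are plain JUnit 4 suites (the suite and test class names below are hypothetical), registering a new test means adding it to the @SuiteClasses list:

 import org.junit.runner.RunWith;
 import org.junit.runners.Suite;
 
 @RunWith(Suite.class)
 @Suite.SuiteClasses({
     ExistingModuleTest.class, // already in the suite
     SampleNewModuleTest.class // the newly created test has to be listed here manually
 })
 public class SampleModuleSuite {
 }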

Creating a new test

As already explained, the basic flow to create a test is to copy&paste an existing similar test and, using it as a base, add our test actions. This is done to ensure that the correct test class is extended: pack tests extend WebPOSTerminalHelper, while modules tests extend WebPOSExtModulesTerminalHelper (if they are POS tests) or the helpers for other terminals (procurement, mobile warehouse).

The TestId.java holds a mapping between the POS elements (buttons, labels, popups, messages, ...) and a unique name to be used within the tests. Each POS element is identified by a unique idtest[*]. This idtest is obtained on the browser console, using the TestRegistry javascript object. If we open the browser console and execute TestRegistry.appendIdTestToDOM() each element currently existing on the POS will be assigned an idtest property (if applicable). This idtest can be used on the TestId java class to identify the element.

[*]Note that due to some TestRegistry limitations, some elements can’t be assigned a unique idtest and therefore cannot be directly used from the tests.

Using a POS element in our test

Firstly we go to the POS, open the browser console and recover the idtest property of the element we want to use.

Then we go to the TestId class and look for our idtest. If it exists, we can already use TestId.OUR_ELEMENT_NAME. If not, we create it.

Always try to create new TestIds near related TestIds. The syntax to create new TestId elements is: OUR_ELEMENT_NAME(“idtest”, [EnyoKind]).

EnyoKind is a helper class used to identify special elements (buttons, product rows,...). Go to the EnyoKind class to check all the available options.
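
For illustration only (the constant name, idtest value and EnyoKind option below are hypothetical; the syntax is the one described above), a new entry in TestId.java could look like this:

 // Hypothetical entry in TestId.java; keep it next to other related TestIds.
 RECEIPT_PAYMENT_EXACT_BUTTON("exactButton", EnyoKind.BUTTON),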

Fixing an Unstable test

There are some tests that fail randomly; they are marked as "unstable" in the Test Status sheet (Google Spreadsheet). Follow these steps to fix an unstable test:

For any suggestion or help, go directly to the Retail team.

Actions that can be performed on TestId elements

Currently tap, write, verify and get operations are supported.
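
As a hedged sketch of how these operations read inside a test: the helper names and TestId constants below are assumptions, not a confirmed API; only the operation names tap, write, verify and get come from this guide, so check an existing test for the real calls.

 // Illustrative only: check an existing test for the real helper names and signatures.
 tap(TestId.RECEIPT_PAYMENT_EXACT_BUTTON);        // tap a button
 write(TestId.SEARCH_PRODUCT_FIELD, "Avalanche"); // write text into a field
 verify(TestId.TOTAL_AMOUNT_LABEL, "150.00");     // verify the displayed value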

Finally, the TestRegistry can also be used to get the enyoObject of an element on the browser console: TestRegistry.registry('idtest').enyoObject

High volume automated test

Starting from RR15Q4, to support high volumes of products, customers and tickets, Openbravo Commerce allows you to work remotely with master data from the WebPOS client. The high volume master data is then not loaded into the WebPOS client database but accessed when needed on a server. The server can be a local in-store server or a server available in the cloud.

To run a test in remote or high volume mode it is necessary to add the following annotation to the Java test class:

 @TestClassAnnotations(isHighVolumeCompatible = true)
 public class HighVolumeTest extends WebPOSTerminalHelper {

When a test is running in high volume mode, a method called "verifyHgvolTime" checks that a specific action takes no more than a given time, and the test fails if it takes too long. The "verifyHgvolTime" function is called inside each "tap" action.

Hybrid remote automated test

Starting from RR18Q4, tests can activate some of the high volume preferences to automate scenarios which are not exactly "pure" high volume tests.

To run a test with the hybrid remote configuration it is necessary to implement the following in each Java test class:

The preferences defined for the test are set after the first login and are unset at the end of the test. If the test class annotation is present but the activateRemoteModels method is not overridden, or the array of remote models is null or empty, the test will fail and will not be executed.
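
A rough sketch under stated assumptions: the annotation attribute name, the return type of activateRemoteModels and the model names below are all assumptions; only @TestClassAnnotations, the activateRemoteModels override and the "array of remote models" requirement come from this guide.

 // Sketch only: the annotation attribute name, return type and model names are assumptions.
 @TestClassAnnotations(isHybridRemoteCompatible = true)
 public class SampleHybridRemoteTest extends WebPOSTerminalHelper {
 
   @Override
   protected String[] activateRemoteModels() {
     // Must not return null or an empty array, otherwise the test fails and is not executed.
     return new String[] { "Product", "BusinessPartner" };
   }
 }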

Other important topics to consider

The Constants.java file defines several values used all around the test project, such as timeouts or retry values.

Sometimes a test will fail because a certain element hasn't reached the required state. In these cases, increasing WAIT_DEFAULT_RETRIES might work. Other interesting values to tweak might be:

Although the tests are prepared to be run with the application in development status (there is a test verifying that) they can be executed with the application in production (no modules in development).

If the tests fail because of a JavaScript error regarding the Synchronization Helper and the dropTables process, it means that the dropTables process is taking more than 8 seconds, which is not necessarily wrong. Open the ob-synchronization.js file (in the org.openbravo.mobile.core module) and find the following lines:

 OB.UTIL.Debug.execute(function () {
   // the timeout is forced active while in development to catch unbalanced calls and/or adjust the timeoutThreshold
   timeoutThreshold = 8000;
   isTimeoutThresholdActivated = true;
 });

and increase the timeoutThreshold value. Increasing this value will reduce the amount of “false failures” and will only slow the application in case of a real error (which should never happen anyway).

Authentication Managers and the tests

The authentication manager feature is not compatible with the tests. Firstly, it might try to log in with wrong credentials (e.g. a wrong POS organization). Besides, the first step of the tests is to automate the login, so they will fail if the login has already been done.

Throttle and the tests

After the throttle changes, random tests failed in try-retail. Since we were not able to fix all of them, we opted to disable the throttle in all CI servers.

This is just a workaround; the plan is to enable it again as soon as possible.

For the moment, to make a try-retail run with the throttle enabled, you need to use the extra parameter ACTIVATE_THROTTLE=true when pushing to try-retail.

For more details, see the issue: https://issues.openbravo.com/view.php?id=37416
