Local and remote EJB performance comparison – PART II

Collecting and analyzing results

In this post, EJB performance tests are executed on the technical infrastructure designed in PART I. The aim of these tests is to perform local and remote calls between EJB components and collect information about the time spent on these calls. First, a brief introduction to the business interface is given, then a test scenario is elaborated. The last part presents the results and attempts to draw final conclusions.

Introduction

It is common sense that remote calls are more (or at best equally) time consuming than local calls. Even so, we will look at the EJB invocation results to observe how significant the overhead of a remote invocation is. The main expected overhead factor is serialization. Network overhead is eliminated because the application server instances are running on the same host. Many terms and assumptions regarding the technical architecture and the Java EE application on which the tests are executed are mentioned throughout this text. To get familiar with the execution environment, see [1] and [2].

1. Performance service definitions

Three main test types are prepared, each responsible for a different behavior. They are defined in the BusinessPerformanceService interface (Figure 1). The implementation of the nopService method should be extremely simple; as the name suggests, it should do nothing (NOP – no operation). It is interesting to observe the difference between local and remote invocations of such a simple method. The next one, simpleService, should perform a simple operation, which does not necessarily mean it will be done quickly. The code defining a complex operation is expected in the complexService implementation.

1.1. Business service interface

/**
 * Business performance-measurement service interface definition
 */
public interface BusinessPerformanceService {

    /**
     * Provide nopService
     */
    void nopService();

    /**
     * Provide simpleService
     *
     * @param request
     * @return SimpleResponse
     */
    SimpleResponse simpleService(SimpleRequest request);

    /**
     * Provide complexService
     *
     * @param request
     * @return ComplexResponse
     */
    ComplexResponse complexService(ComplexRequest request);
}

Figure 1: BusinessPerformanceService

Default implementation

A default implementation of BusinessPerformanceService is provided in the com.softexploration.lab.ejb.performance.service.impl.BusinessPerformanceServiceBean class.

An overview of the BusinessPerformanceServiceBean implementation, method by method:

  1. nopService – the body of the method is empty.
  2. simpleService – only a new SimpleResponse object is returned. However, there is a small surprise inside the SimpleResponse implementation: the class defines writeObject(ObjectOutputStream out) and readObject(ObjectInputStream in) methods which both invoke Thread.sleep for 1/4 second (see the sketch below).
  3. complexService – the method expects an array of bytes in the request, increments each element of the array by one and then passes the array into the response.
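
A minimal sketch of the SimpleResponse serialization trick described in point 2, assuming the standard Java custom-serialization hooks (the exact class in the sources [1] may differ in detail):

import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SimpleResponse implements Serializable {

    private static final long serialVersionUID = 1L;

    // Invoked whenever the object is serialized for a remote call
    private void writeObject(ObjectOutputStream out) throws IOException {
        sleepQuarterSecond();
        out.defaultWriteObject();
    }

    // Invoked whenever the object is deserialized on the receiving side
    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        sleepQuarterSecond();
        in.defaultReadObject();
    }

    private static void sleepQuarterSecond() {
        try {
            Thread.sleep(250); // 1/4 second, as described above
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}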

1.2. End-user interface

The formal end-user interfaces are PerformanceServiceExecutorRemoteMXBean and PerformanceServiceExecutorLocalMXBean, for remote and local invocations respectively. These two interfaces do not define any methods themselves; they both extend the common interface PerformanceServiceExecutor. That interface declares three methods mirroring the methods specified in BusinessPerformanceService. The implementation of each PerformanceServiceExecutor method has to invoke the appropriate BusinessPerformanceService method requestCount times.

/**
 * Monitoring load-test end-user interface definition
 * to execute performance services
 */
public interface PerformanceServiceExecutor {

    /**
     * execute NopService LoadTest
     *
     * @param requestCount
     *            - number of business service executions
     */
    void executeNopServiceLoadTest(long requestCount);

    /**
     * execute SimpleService LoadTest
     *
     * @param requestCount
     *            - number of business service executions
     */
    void executeSimpleServiceLoadTest(long requestCount);

    /**
     * execute ComplexService LoadTest
     *
     * @param requestCount
     *            - number of business service executions
     */
    void executeComplexServiceLoadTest(long requestCount);
}

Figure 2: PerformanceServiceExecutor

Default implementation

The interfaces PerformanceServiceExecutorRemoteMXBean and PerformanceServiceExecutorLocalMXBean are implemented in the classes PerformanceServiceExecutorRemote and PerformanceServiceExecutorLocal respectively. Both classes internally hold a reference to an instance of SequentialPerformanceServiceExecutor. The SequentialPerformanceServiceExecutor class provides the actual implementation, executing the business service in a loop. There is an additional client layer between the business interface and end-user interface layers. More details regarding the implementation are available in the source code [1].
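
A minimal sketch of that loop, assuming the real SequentialPerformanceServiceExecutor times each invocation with System.nanoTime() and writes the measurement to a log (names and the logging call are illustrative, not the exact project code):

public class SequentialPerformanceServiceExecutor {

    private final BusinessPerformanceService businessService;

    public SequentialPerformanceServiceExecutor(BusinessPerformanceService businessService) {
        this.businessService = businessService;
    }

    // Invokes nopService requestCount times, timing every call in nanoseconds
    public void executeNopServiceLoadTest(long requestCount) {
        for (long i = 0; i < requestCount; i++) {
            long start = System.nanoTime();
            businessService.nopService();
            long elapsedNanos = System.nanoTime() - start;
            // the real implementation appends each measurement to a results log
            System.out.println("nopService took " + elapsedNanos + " ns");
        }
    }
}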

2. Tests execution

The tests are executed by invoking methods on the manageable beans; a hedged programmatic alternative is sketched below. A guideline on using JConsole for this purpose is given in the Connecting to application suites section in PART I.
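
The same MXBean operations can also be invoked over JMX without JConsole. In the sketch below, the JMX service URL and the ObjectName are assumptions made for illustration; the real names are defined by the application described in PART I:

import javax.management.JMX;
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class LoadTestClient {

    public static void main(String[] args) throws Exception {
        // Assumed JMX endpoint of the server hosting the manageable beans
        JMXServiceURL url = new JMXServiceURL(
                "service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi");
        try (JMXConnector connector = JMXConnectorFactory.connect(url)) {
            MBeanServerConnection connection = connector.getMBeanServerConnection();
            // Assumed ObjectName under which the local executor is registered
            ObjectName name = new ObjectName(
                    "com.softexploration.lab:type=PerformanceServiceExecutorLocal");
            PerformanceServiceExecutorLocalMXBean executor = JMX.newMXBeanProxy(
                    connection, name, PerformanceServiceExecutorLocalMXBean.class);
            // Run 1000 local nopService invocations, exactly as from JConsole
            executor.executeNopServiceLoadTest(1000L);
        }
    }
}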

Assumptions

  1. There are three test methods: executeNopServiceLoadTest, executeSimpleServiceLoadTest and executeComplexServiceLoadTest for each (i.e. local and remote) manageable bean.
  2. Each method has the same parameter, requestCount, which determines the number of business method invocations in one test.
  3. Business logic implementation is the same for local and remote calls.
  4. Methods listed in point 1 are executed the same number of times with the same parameter values for local and remote invocations.
  5. Test results are stored in log files.

Log files

There are three log files:

  1. com.softexploration.lab.ejb.performance.monitoring.local-results.log – collects local calls results
  2. com.softexploration.lab.ejb.performance.monitoring.remote-results.log – collects remote calls results
  3. com.softexploration.lab.ejb.performance.monitoring.log – collects other data

They are created based on the definition in com.softexploration.lab.ejb.performance.monitoring/src/main/resources/log4j.xml (an illustrative excerpt follows) and are placed by default in the WebLogic domain home directory.
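
An illustrative excerpt of such a configuration, assuming a log4j 1.x style log4j.xml; the appender name and pattern are assumptions, not the project's actual file:

<appender name="LOCAL_RESULTS" class="org.apache.log4j.FileAppender">
    <!-- one file per result category, named after the logger -->
    <param name="File"
           value="com.softexploration.lab.ejb.performance.monitoring.local-results.log"/>
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%d{ISO8601} %m%n"/>
    </layout>
</appender>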

3. Result analysis

executeNopServiceLoadTest

Figure 3: executeNopServiceLoadTest execution result

executeSimpleServiceLoadTest

Figure 4: executeSimpleServiceLoadTest execution result

executeComplexServiceLoadTest

Figure 5: executeComplexServiceLoadTest execution result

                                     | MIN [ns]     | MAX [ns]      | MEDIAN [ns]   | AVG [ns]      | VARIANCE [ns^2]    | STDDEV [ns]
executeNopServiceLoadTest local      | 19633,00     | 2045484,00    | 21418,00      | 26008,52      | 4483558541,83      | 66959,38
executeNopServiceLoadTest remote     | 456040,00    | 2811201,00    | 504678,00     | 545826,37     | 16736037351,24     | 129367,84
growth [ns]                          | 436407,00    | 765717,00     | 483260,00     | 519817,85     | 12252478809,41     | 62408,46
growth [%]                           | 2222,82      | 37,43         | 2256,33       | 1998,64       | 273,28             | 93,20
executeSimpleServiceLoadTest local   | 20080,00     | 2792013,00    | 22311,00      | 29153,96      | 7859759103,37      | 88655,28
executeSimpleServiceLoadTest remote  | 999672919,00 | 1068158686,00 | 1001832412,50 | 1002076850,65 | 7854252633632,29   | 2802543,96
growth [ns]                          | 999652839,00 | 1065366673,00 | 1001810101,50 | 1002047696,69 | 7846392874528,92   | 2713888,68
growth [%]                           | 4978350,79   | 38157,65      | 4490207,08    | 3437089,97    | 99829,94           | 3061,17
executeComplexServiceLoadTest local  | 2323480,00   | 30622012,00   | 2482558,00    | 2548309,59    | 848073505809,33    | 920909,06
executeComplexServiceLoadTest remote | 107278557,00 | 479478456,00  | 116458913,00  | 122163071,97  | 471830603201020,00 | 21721662,07
growth [ns]                          | 104955077,00 | 448856444,00  | 113976355,00  | 119614762,38  | 470982529695211,00 | 20800753,01
growth [%]                           | 4517,15      | 1465,80       | 4591,09       | 4693,89       | 55535,58           | 2258,72

Table 1: basic analysis of the results

Table 1 presents the processed test results. The complete raw results are attached in [2].

The default unit is time expressed in nanoseconds [ns]. In the figures, the default color represents a processed value. Turquoise marks the values expressing the growth between local and remote calls. In each column, the minimal growth in % is marked green and the maximal red.

Growth is defined as the difference between the remote and the local value:
growth [ns] = remote call value − local call value

Growth in % is computed in the following manner:
growth [%] = (remote call value / local call value − 1) × 100
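
As a worked example, taking the AVG column of executeNopServiceLoadTest from Table 1: (545826,37 / 26008,52 − 1) × 100 ≈ 1998,64%, which matches the growth [%] row.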

For each test the differences between remote and local calls are clearly visible in the corresponding figures. For executeNopServiceLoadTest the approximate growth is 500000 ns, for executeSimpleServiceLoadTest about 1000000000 ns and for executeComplexServiceLoadTest about 120000000 ns. The exact values are given in Table 1. Based on these values we can conclude that a service doing nothing – an empty body in the business implementation (executeNopServiceLoadTest) – takes about 500000 ns more in a remote call than in a local one. On the other hand, a service which serializes a 2 MB array of bytes requires about 120000000 ns when called remotely.

The executeSimpleServiceLoadTest is a special case in which serialization cost is forced by dummy implementations of the readObject and writeObject methods, each of which puts the thread to sleep for 1/4 s. A remote invocation should therefore take slightly more than 1 s (2 reads and 2 writes in the serialization process, each 1/4 s). The final AVG value is 1002076850,65 ns, that is an additional 2076850,65 ns on top of 1 s. After subtracting the local average of 29153,96 ns, roughly 2047696,69 ns of remote-call overhead remains, noticeably higher than the 519817,85 ns overhead measured for executeNopServiceLoadTest.
The green and red values show that executeNopServiceLoadTest is the most stable test, whereas executeSimpleServiceLoadTest has the widest spread of values. This is indicated mostly by the standard deviation.
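
As a side note, the statistics in Table 1 can be reproduced from the raw per-call measurements in the result logs. The sketch below is a hedged illustration, not the project's actual analysis code; in particular it assumes the population variance formula, which the original text does not specify:

import java.util.Arrays;

public class ResultStatistics {

    // Prints MIN, MAX, MEDIAN, AVG, VARIANCE and STDDEV of the samples [ns]
    public static void analyze(double[] nanos) {
        double[] sorted = nanos.clone();
        Arrays.sort(sorted);
        double min = sorted[0];
        double max = sorted[sorted.length - 1];
        // median: middle element, or the mean of the two middle elements
        int mid = sorted.length / 2;
        double median = sorted.length % 2 == 1
                ? sorted[mid]
                : (sorted[mid - 1] + sorted[mid]) / 2.0;
        double avg = Arrays.stream(sorted).average().orElse(0.0);
        // population variance: mean of squared deviations from the average
        double variance = Arrays.stream(sorted)
                .map(v -> (v - avg) * (v - avg))
                .average().orElse(0.0);
        double stddev = Math.sqrt(variance);
        System.out.printf("MIN=%.2f MAX=%.2f MEDIAN=%.2f AVG=%.2f VARIANCE=%.2f STDDEV=%.2f%n",
                min, max, median, avg, variance, stddev);
    }
}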

Summary

The received results are generally as expected: local calls are more efficient than remote ones. We also observed that remote calls whose business service invocation takes a long time by definition (the forced 1 s in executeSimpleServiceLoadTest) take relatively more time to handle. The execution times for time-consuming serializations are also less stable, which is especially visible in the high value of the standard deviation. It is worth bearing in mind that the results can vary between application servers and their configurations. Remote calls can be avoided under certain circumstances: even when an implementation formally requires remote calls (the @Remote annotation or a definition in ejb-jar.xml), they can still be avoided by vendor-specific configuration which forces local EJB calls when the components are within the same JVM or, even better, in the same EAR.
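
For example, on Oracle WebLogic such an optimization is configured per bean in weblogic-ejb-jar.xml. The fragment below is an illustrative sketch (the bean name is taken from this article's example, and the descriptor details should be checked against the server version in use):

<weblogic-ejb-jar xmlns="http://xmlns.oracle.com/weblogic/weblogic-ejb-jar">
    <weblogic-enterprise-bean>
        <ejb-name>BusinessPerformanceServiceBean</ejb-name>
        <!-- pass arguments by reference (skipping serialization)
             for callers packaged in the same application -->
        <enable-call-by-reference>true</enable-call-by-reference>
    </weblogic-enterprise-bean>
</weblogic-ejb-jar>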

References

[1] Local and remote EJB performance comparison – PART I

[2] Oracle WebLogic Managed Servers Configuration

Resources

[1] Java EE application sources

[2] Test result logs
