A.1 Test Strategy

The test setup represents a medium-sized business with heavy traffic, which helps predict performance for both smaller and larger implementations. The performance, reliability, and scalability tests cover the critical areas that you need to understand when designing your system.

A sizing guide is included to help determine the number of users that can be supported by a given number of servers and a specific configuration.

The tests cover the following major functional areas of public access, authentication, and authorization:

  • The public requests test focuses on Access Gateway as a caching reverse proxy that increases the speed of your web servers by eliminating authentication and authorization policy overhead.

  • The authentication requests test focuses on the distributed architecture that provides a secure login to Access Manager.

  • The authorization requests test focuses on the policy evaluation that occurs after the login has been completed and before the page is accessed.

The test environment includes a cluster of four Identity Servers and four Access Gateways. The number of users and the amount of traffic determine the size of the cluster.

A.1.1 Performance, Reliability, Scalability, and Failover Testing for Access Gateway

The performance testing includes the following scenarios:

  • HTTP traffic through a public resource

  • HTTPS traffic through a public resource

  • HTTPS traffic through a protected resource

  • HTTPS traffic through a protected resource with Form Fill

  • HTTPS traffic through a protected resource with Identity Injection

  • HTTPS traffic through a protected resource with policies that contain roles

  • HTTPS traffic through a protected resource with 10 additional page requests

The reliability testing runs HTTPS traffic for two weeks in a stress test scenario.

The scalability (clustering) testing includes the following scenarios:

  • 2 x 4 x 4 (2 Administration Console servers, 4 Identity Server servers, and 4 Linux Access Gateway servers)

  • 2 x 4 x 4 (2 Administration Console servers, 4 Identity Server servers, and 4 Access Gateway Appliance servers)

The failover testing verifies that HTTP/HTTPS traffic continues after a component fails over, as sketched below.
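
The following minimal Python sketch illustrates such a continuity check; the cluster URL is a hypothetical example, and the sketch illustrates the approach rather than the harness used in these tests. Run it while failing over a component and watch for dropped requests.

    import time
    import urllib.request

    # Hypothetical published URL of a clustered public resource (assumption).
    URL = "http://am.example.com/public/index.html"

    failures = 0
    for i in range(600):  # poll once per second for about 10 minutes
        try:
            with urllib.request.urlopen(URL, timeout=5) as resp:
                ok = (resp.status == 200)
        except OSError:  # covers URLError, HTTPError, and socket timeouts
            ok = False
        if not ok:
            failures += 1
            print(time.strftime("%H:%M:%S"), "request", i, "failed")
        time.sleep(1)

    print(failures, "failed requests during the failover window")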

A.1.2 Test Setup

Server Hardware for Access Gateway Appliance

The Access Gateway clustered tests are run in a virtualized environment that contains the following servers:

  • Dell PowerEdge R730xd running ESXi 5.5

  • Dell PowerEdge R720xd running ESXi 5.5

  • Dell PowerEdge R710 running ESXi 5.5

  • Dell PowerEdge R710 running ESXi 3.5

The virtual machines are configured as follows:

Server Components                         Operating System   Hardware
Administration Console (2 nodes)          SLES 11 SP3        CPU: 2 x 3 GHz, Memory: 4 GB
Identity Servers (4 nodes)                SLES 11 SP3        CPU: 4 x 3 GHz, Memory: 16 GB
Access Gateway Appliance (4 nodes)        SLES 11 SP3        CPU: 4 x 2.6 GHz, Memory: 16 GB
External eDirectory user store (3 nodes)  SLES 11 SP1        CPU: 2 x 3 GHz, Memory: 4 GB
Apache2 Web Server (3 nodes)              SLES 11 SP1        CPU: 2 x 3 GHz, Memory: 4 GB

Server Hardware for Access Gateway Service on SLES 12

The Access Gateway clustered tests are run in a virtualized environment that contains the following servers:

  • Dell PowerEdge R730xd running ESXi 5.5

  • Dell PowerEdge R720xd running ESXi 5.5

  • Dell PowerEdge R710 running ESXi 5.5

  • Dell PowerEdge R710 running ESXi 3.5

The virtual machines are configured as follows:

Server Components                         Operating System   Hardware
Administration Console (2 nodes)          SLES 12            CPU: 2 x 3 GHz, Memory: 4 GB
Identity Servers (4 nodes)                SLES 12            CPU: 4 x 3 GHz, Memory: 16 GB
Access Gateway Service (4 nodes)          SLES 12            CPU: 4 x 2.6 GHz, Memory: 16 GB
External eDirectory user store (3 nodes)  SLES 11 SP1        CPU: 2 x 3 GHz, Memory: 4 GB
Apache2 Web Server (3 nodes)              SLES 11 SP1        CPU: 2 x 3 GHz, Memory: 4 GB

NOTE: In this performance testing, Access Gateway is installed on SLES 12 servers with Btrfs as the file system. Identity Server is installed on SLES 12 (upgraded from SLES 11 SP3) with EXT3 as the file system.

Server Hardware for Access Manager Appliance

The tests are run in a virtualized environment that contains the following servers:

  • Dell PowerEdge R730xd running ESXi 5.5

  • Dell PowerEdge R720xd running ESXi 5.5

  • Dell PowerEdge R710 running ESXi 5.5

  • Dell PowerEdge R710 running ESXi 3.5

The virtual machines are configured as follows:

Server Components                         Operating System   Hardware
Access Manager Appliance (4 nodes)        SLES 11 SP3        CPU: 8 x 3 GHz, Memory: 32 GB
External eDirectory user store (3 nodes)  SLES 11 SP1        CPU: 2 x 3 GHz, Memory: 4 GB
Apache2 Web Server (3 nodes)              SLES 11 SP1        CPU: 2 x 3 GHz, Memory: 4 GB

Load Balancers

The following L4 switches are used as load balancers for the testing:

  • Zeus ZXTM LB (software L4 switch)

  • Brocade ServerIron ADX 1000 (hardware L4 switch)

  • Alteon 3408 (hardware L4 switch)

Configuration Details

  • All public page tests use HTML pages of approximately 50 KB with 50 small embedded images.

  • A small HTML page of 200 B with one hyperlink is used for the authentication, authorization, Identity Injection, and Form Fill performance tests. These tests do not cover page rendering performance.

  • The Access Manager user store configuration uses 20 threads with 100,000 users in a single container. Tests with multiple containers achieved the same performance; however, these tests were conducted on optimized, fast hardware. If you do not optimize and speed up your hardware, performance decreases. The primary user store used in the tests is eDirectory 8.8.6. A sketch for generating a test population of this kind follows this list.
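
As a rough illustration of how such a population might be generated (not the exact tooling used in these tests), the following Python sketch writes 100,000 inetOrgPerson entries into a single container as LDIF for import into the user store. The container DN and password are hypothetical examples.

    # Write an LDIF file with 100,000 test users in a single container.
    # The container DN (ou=users,o=test) and the password are assumptions.
    with open("testusers.ldif", "w") as f:
        for i in range(1, 100001):
            uid = f"user{i:06d}"
            f.write(f"dn: cn={uid},ou=users,o=test\n")
            f.write("objectClass: inetOrgPerson\n")
            f.write(f"cn: {uid}\n")
            f.write("sn: Test\n")
            f.write(f"uid: {uid}\n")
            f.write("userPassword: test1234\n\n")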

Performance, Reliability, and Stress Tools

The HP Mercury LoadRunner tool is used for Identity Server and Access Gateway testing. This tool correctly replicates large IP ranges across multiple clients in a clustered environment, which allowed the tests to simulate real-world environments more closely, including real browser interaction with Internet Explorer and Firefox.

The following are the specifications of the LoadRunner tests:

  • The virtual users run as 500 threads spread among 17 clients. This is the maximum number of threads before the system starts to see excessive login times.

  • HTML-based scripts that describe user actions are used. This option is listed under the recording level as the HTML advanced option. This type of script clears the cached data inside the script and downloads all data linked to the page.

If you do not have a sufficient IP address setup for LoadRunner, you must use solid load balancing on the Layer 4 switch. You must also parameterize the user credentials so that the same user is not used for every connection; a minimal sketch of this parameterization approach follows.
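
LoadRunner scripts are not reproduced here, but the following Python sketch illustrates the same parameterization idea: each worker logs in as a distinct user instead of reusing one account. The login URL and form field names are assumptions for illustration, and verify=False is appropriate only for test certificates.

    import concurrent.futures
    import requests

    LOGIN_URL = "https://am.example.com/nidp/app/login"  # hypothetical URL

    def login(i):
        # Pick a distinct user per iteration, matching the generated store.
        user = f"user{i:06d}"
        session = requests.Session()
        resp = session.post(
            LOGIN_URL,
            data={"Ecom_User_ID": user,       # assumed form field names
                  "Ecom_Password": "test1234"},
            verify=False,                     # test certificates only
            timeout=10,
        )
        return resp.status_code

    # 50 workers for illustration; the tests above used 500 threads
    # spread across 17 client machines.
    with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
        codes = list(pool.map(login, range(1, 501)))
    print(sum(c == 200 for c in codes), "successful logins out of", len(codes))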

A.1.3 Other Factors Influencing Performance Information

In addition to the hardware and the test configuration described in the previous sections, the following factors in a network also affect the overall performance:

  • Customized Login Pages: Login and landing pages play an important role in the user experience while accessing the resources protected by Access Manager. Consider the performance aspect of the page loading and rendering while developing the custom login JSP pages.

  • L4 Switches: A slow or incorrectly configured switch can severely affect performance. It is recommended to plug clustered Access Manager components directly into the switch or to segment the network accordingly. Enabling sticky bit/persistence on the L4 switch is also important. When sticky bit/persistence is not enabled, the product still handles the traffic correctly, but it can become up to 50% slower.

  • Network Bandwidth: Gigabit copper networking is used throughout the testing process. This is a requirement for the product to reproduce the testing results. If you are running at 100 Mbps or have a slow Internet connection, the product cannot compensate for this performance bottleneck.

  • Web Servers: The application servers are a major cause of slowness because they process most of the information. The tests used static and dynamic pages with more than 50 images, based on real-world traffic, to give a general idea of response times of less than one second. The public requests can vary widely based on the size of the page, caching settings, and content.

  • LDAP User Stores: This component can cause slowness depending on the configuration, hardware, and layout of the directory. The user store is the most common source of performance problems, so testing must be done with the LDAP user stores that are used in the environment. Expect adjustments if you are attempting to get the maximum speed out of the cluster for different LDAP user stores. eDirectory is used throughout the testing to provide a baseline for the product.

  • Timeout: If you run a performance test, you must factor in the sessions that are stored on the server. The tests use a 5-minute timeout so that they do not exceed the system total of 100,000 active sessions on the cluster; see the back-of-the-envelope calculation after this list. You must consider this when planning capacity testing on a cluster. Configuring the session timeout for a resource depends on the security requirement. If security is not a concern, the following recommendations help fine-tune the session timeout configuration for the best performance:

    • If users access a protected resource for a short duration and leave the session idle after accessing a few pages, configuring a short session timeout for such resources is recommended. This enables the system to remove idle sessions faster.

    • If users access a protected resource for a long duration, configuring a long session timeout is recommended. It reduces the internal traffic to update the user access and improves the overall performance of the system.

  • Users: Ensure that you have enough users on the system to run the performance test. If you run 50 threads of logins against Access Manager with each one using the same user to authenticate, Access Manager matches each login to that user and handles all 50 sessions as the sessions of one user. This is not a valid user scenario; it skews the test goals and invalidates the results.
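
As a back-of-the-envelope aid for the timeout point above (an illustration, not part of the test suite): in steady state, the number of active sessions is approximately the login rate multiplied by the average session lifetime, so the 100,000-session ceiling and the 5-minute timeout bound the sustainable login rate.

    # Steady state (Little's Law): active_sessions = login_rate * session_lifetime
    max_active_sessions = 100_000   # cluster ceiling used in these tests
    session_lifetime_s = 5 * 60     # 5-minute timeout

    max_login_rate = max_active_sessions / session_lifetime_s
    print(f"Sustainable login rate: about {max_login_rate:.0f} logins/second")
    # Roughly 333 logins/second before active sessions exceed the ceiling.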