About Major U.S. Bank
In the highly competitive banking industry, applications are a key differentiator. But to keep abreast of consumer expectations, banks need to dramatically improve their application testing capabilities. One top-four bank has done just that by implementing Service Virtualization (SV), a Micro Focus solution that allows the bank to simulate complex and constrained application environments cost-effectively – and catch potential performance issues it would otherwise miss.
Automated Testing and QA Solutions Vital – But Challenges Remained
SV is not the first Micro Focus tool the bank has embraced within its QA group. It uses Micro Focus Application Lifecycle Management (ALM) software to manage test cases, requirements, and defect logs, and Micro Focus Performance Center (PC) as its enterprise performance testing platform. Micro Focus Diagnostics and Micro Focus SiteScope software are also deployed within the bank’s testing environments as monitoring tools, helping QA further validate application performance and scalability.
The bank has used many of these tools for years – in some cases, over 16 years – and today they are the foundation of the bank’s enterprise test center, enabling its application development and QA processes to achieve a high degree of sophistication and discipline. This, in turn, allows development and QA to be more responsive to business requests. In recent months, for example, the bank has transitioned much of its development process from waterfall to agile, something that wouldn’t be possible without centralized visibility into requirements and test status.
Despite these advances, however, the bank was still challenged to keep up with testing demand – particularly for applications that integrate with services from other companies.
Some of the bank’s software, for example, exchanges data with another financial institution’s mainframes. Testing those applications has been an ongoing challenge and fraught with delays. “They give us access to their mainframe for testing,” the Manager explains, “but only 6 to 10 hours a week, and all during non-business hours.”
These limitations made it difficult for the Manager’s team to complete its performance testing in a timely fashion – and the only alternative, standing up a duplicate environment, was cost-prohibitive. “It would probably cost us as much as $3 million per month,” the Manager recalls.
So historically, the performance testing team and engineering teams improvised. Some teams built simulated web services using open-source software. Others deployed off-the-shelf software, but there was no standard enterprise solution.
These stop-gap measures helped, but they were time-consuming and resource-intensive. Testers often spent up to 50 hours building a simulator, an effort that typically added two weeks to a testing cycle. And the simulators were often limited in functionality, leaving the bank vulnerable to releasing software with unknown performance issues.
Required Functionality, Ease of Use, Lower Cost
Then Micro Focus introduced the bank to Service Virtualization software. The Manager saw its value and decided it was time to implement a robust, enterprise-grade service virtualization solution.
Before committing to SV, however, the bank did its due diligence, comparing the Micro Focus solution to other available products – particularly CA LISA, because some of the performance testers had experience with that tool. The bank ultimately concluded that SV was the best available service virtualization solution. First, SV supports the service virtualization functionality the bank’s performance testers needed: virtualized services can be executed as part of performance test scenarios and can emulate performance degradation as well as functionality. A second critical consideration was the solution’s ease of use. “The skill set required to use CA LISA is greater than for the SV software,” the Manager notes. SV can be supported by the performance testing team’s existing skill sets, and it integrates with ALM and Performance Center software.
And finally, SV also costs less than CA LISA. “We realized that SV delivers comparable functionality at a lower price,” says the Manager. “That made our business case.”
Micro Focus Professional Services Ensures Proper Configuration, Sizing
After the bank committed to SV software as its service virtualization solution, it engaged Micro Focus Professional Services to help with the implementation. “We consider them to be some of the best talent in the industry,” the Manager says. And while the Manager’s team is itself a highly competent testing organization, it made sense to leverage Micro Focus’s expertise to design and build the SV platform.
Micro Focus delivered on that trust: the platform it built – 25 VMware virtualized servers running on HPE ProLiant BL620c and HPE ProLiant BL460c Gen8 Server Blades – has since proven an excellent fit for the bank’s virtualized performance testing workload. “Within a short time of implementing the platform, we were simulating well over 100 services on it. But even at that level of use, we had bandwidth left over for growth, without having to invest in additional infrastructure. It helped that throughout the evaluation process, Micro Focus continued to improve the performance of SV, so the infrastructure we specified can now support twice the volume we initially anticipated. That was an added bonus, since interest in this platform as a service for other areas has exploded.”
The Manager estimates that his team of 53 testers will handle around 350 test engagements in the next year, representing around 100,000 to 150,000 hours’ worth of performance testing.
Flexible Solution Supports Multiple Use Cases
Because SV doesn’t have a steep learning curve, the bank’s performance test team adopted it quickly and now leverages it in several different ways.
In some cases, the testers work with the bank’s vendor partners to simulate their services’ functionality. For this category of use case, the testers create virtual replicas of third-party services within SV and then test how the bank’s applications perform when they interact with those services.
In other cases, it’s impractical to replicate a third-party’s applications environment. In these instances, the bank takes a black box approach: it uses SV to create simulators that perform as stipulated by the bank’s contracts with its vendors. This approach effectively minimizes the bank’s exposure to risk. It allows the bank to validate, to its own satisfaction, that its applications meet its performance requirements. Then, when the applications go into production, the bank’s vendors are responsible for ensuring their services meet their respective contract terms and service level agreements.
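The black-box approach can be illustrated with a minimal stub service. This is a hedged sketch only – the endpoint, payload, and 300 ms contractual latency are hypothetical stand-ins, and a real SV virtual service would be configured in the tool rather than hand-coded:

```python
import json
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Hypothetical contract terms for the third-party service being virtualized.
CONTRACT_LATENCY_S = 0.3   # assumed vendor SLA: respond within 300 ms
CANNED_RESPONSE = {"status": "APPROVED", "reference": "SIM-0001"}

class BlackBoxStub(BaseHTTPRequestHandler):
    """Responds exactly as the (hypothetical) vendor contract stipulates:
    fixed latency, well-formed payload. Vendor internals are not modeled."""

    def do_GET(self):
        time.sleep(CONTRACT_LATENCY_S)          # emulate contractual response time
        body = json.dumps(CANNED_RESPONSE).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):               # keep output quiet
        pass

def start_stub(port=0):
    """Start the stub on an ephemeral port in a background thread."""
    server = HTTPServer(("127.0.0.1", port), BlackBoxStub)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    server = start_stub()
    url = f"http://127.0.0.1:{server.server_port}/authorize"
    started = time.perf_counter()
    payload = json.loads(urlopen(url).read())
    print(payload["status"], time.perf_counter() - started >= CONTRACT_LATENCY_S)
    server.shutdown()
```

Because the stub honors only the contract-visible behavior (latency and payload shape), tests against it validate the bank’s side of the integration without modeling the vendor’s internals – which is the point of the black-box approach.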
Another way the bank now relies heavily on SV is for component level performance testing. This is particularly important for testing real-time messaging (RTM) web services. “We use Micro Focus LoadRunner within our SV simulators to validate the performance of applications at RTM levels,” the Manager explains. This supports the bank’s agile development methodology because the performance tests can be conducted before upstream applications that leverage the RTM components have been fully coded.
A fourth use case addresses applications that pull from large, shared databases. “With SV, we can simulate large data sets, or environments that draw from multiple databases, more easily and quickly than if we had to replicate the data sets themselves,” the Manager says.
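The idea behind that fourth use case is that realistic data volume can be generated rather than copied. The sketch below illustrates it in Python under invented assumptions – the field names and value ranges are hypothetical, and SV itself would handle this through its own data modeling rather than hand-written code:

```python
import random

def synthetic_accounts(n, seed=42):
    """Yield n deterministic fake account records, so performance tests can
    exercise realistic data volumes without copying production databases."""
    rng = random.Random(seed)   # fixed seed keeps test runs repeatable
    for i in range(n):
        yield {
            "account_id": f"ACCT-{i:08d}",        # hypothetical schema
            "balance_cents": rng.randrange(0, 10_000_000),
            "region": rng.choice(["NE", "SE", "MW", "SW", "W"]),
        }

if __name__ == "__main__":
    records = list(synthetic_accounts(1000))
    print(len(records), records[0]["account_id"])  # → 1000 ACCT-00000000
```

The fixed seed matters: two test runs against the same generated data set are comparable, which is harder to guarantee when tests draw from shared, changing databases.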
Reduced Testing Times – Better Testing Outcomes
Since leveraging SV, the bank has achieved a number of significant improvements in its performance testing capabilities and outcomes.
Instead of the 50 hours once required to build a simulator, testers using SV software can build one in around four hours – an efficiency gain of more than 80%. This, in turn, allows the performance testing team to shave weeks from the time required to validate application performance and frees testers for other tasks, including increasing test coverage and quality.
SV also enables the bank to optimize how it allocates its testing resources. Previously, it needed performance engineers to build its simulators. SV, however, is easy enough to use that performance testers can build simulators themselves, freeing the engineers to focus on analytics and other high-value tasks. It also lowers the bank’s performance testing costs by around 60%, because the engineers’ more expensive time no longer has to be allocated to testing tasks.
The performance testing team is also more responsive thanks to SV. “Because we can create simulators more easily and quickly, we can respond more quickly when developers send us code,” the Manager says. “It’s a better fit for agile development.”
Perhaps most important, however, is how SV has improved the efficacy of the bank’s performance testing – without incurring the costs associated with standing up physical copies of critical services platforms. “We’re catching performance issues that we couldn’t identify with the simulation tools we used before SV,” the Manager states.
In one notable example, the Manager’s team re-tested an application using SV and caught an issue it had previously missed. The team had first tested the application before implementing SV, and the application seemed to run exactly as intended using an open-source simulation tool. Then, shortly after the application launched, the bank decided to add new functionality – and when the performance testing team ran the new version in an SV simulator, it discovered an issue. Under certain conditions, the application’s response time slowed from an acceptable three seconds to around 25 seconds, a problem the previous tool had masked.
“We were lucky because the application was still new. Its user base wasn’t very large,” the Manager explains. “But eventually it would have been noticeable. Application response times would have slowed to levels that our customers would have found frustrating. This is unacceptable in the banking market where customers can move their business at will.”
Instead, the bank’s developers were able to re-code the software to fix the issue – without any impact to the bank’s customers. “We wouldn’t have caught this with the simulators we were using before we switched to SV,” the Manager says.
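At bottom, a regression like the one described above is caught by a simple pass/fail comparison against a response-time threshold. A minimal sketch of that check, assuming the three-second SLA from the anecdote and using `time.sleep` stand-ins for real service calls:

```python
import time

RESPONSE_TIME_SLA_S = 3.0   # acceptable response time, per the anecdote

def check_response_time(call, sla_s=RESPONSE_TIME_SLA_S):
    """Time one service call and return (elapsed_seconds, met_sla)."""
    started = time.perf_counter()
    call()                                   # the service call under test
    elapsed = time.perf_counter() - started
    return elapsed, elapsed <= sla_s

if __name__ == "__main__":
    # Scaled-down stand-ins: a healthy code path and a degraded one.
    healthy = lambda: time.sleep(0.01)
    degraded = lambda: time.sleep(0.2)
    print(check_response_time(healthy, sla_s=0.1)[1])    # → True
    print(check_response_time(degraded, sla_s=0.1)[1])   # → False
```

In practice the bank’s tooling (LoadRunner driving SV simulators) performs this kind of measurement at scale; the sketch only shows the threshold logic that turns a slow response into a test failure instead of a customer complaint.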
Experiences like that demonstrate clearly that the bank made the right decision when it adopted SV. “Our applications today are more complex than ever,” the Manager concludes.