QualiTest Validates Media Company’s Entry into Service Oriented Architecture

The client decided to implement a Service Oriented Architecture (SOA) model that identified over 100 web services and an Enterprise Service Bus (ESB) allowing 17 new and existing systems to communicate with each other.  The testing would cover the calls to the ESB, the business rules that determine which service(s) go to the downstream systems, and the business rules that translate the responses from the downstream systems into the responses returned to the originating system.

Client Overview

The client is a leading media/entertainment company with a variety of subscription-based products.  Our relationship with this client started ten years ago and spans new and ongoing projects across a range of technologies.
Business Needs and Objectives

During the design phase of a new product line, the client’s architecture group decided to implement a Service Oriented Architecture (SOA) model. The model identified over 100 web services and an Enterprise Service Bus (ESB) which would allow 17 new and existing systems to communicate with each other. As a trusted partner, the client contacted us for a test team that could handle all of the testing of these services across the ESB. The service implementations varied from SOAP to REST to JMS queues, with some web services having a REST interface that would be converted to a more secure SOAP service. The testing would cover the calls to the ESB, the business rules that determine which service(s) go to the downstream systems, and the business rules that translate the responses from the downstream systems into the responses returned to the originating system. Additionally, because some of the downstream systems were provided by a third party, there were very tight deadlines that could not be missed.
The QualiTest Solution

In looking at all of the challenges with the testing effort of this new project, we knew that we would have to focus on three areas: creating automated tests as soon as possible, standing up mock services, and managing the state of the data in all 17 systems.  We knew that the automation would be needed from day one because every time a new service was added to the ESB, we needed to regression test all of the other services. The mock services were needed as stubs for the downstream systems when those systems weren’t working or available when needed. We also set up the mock services to return all of the possible error messages from the downstream systems, which was the most efficient and effective way to test all of the error conditions.  An ESB and web services typically handle stateless transactions, so we also knew that once we connected to the downstream systems, we needed strict control over the data in each of the 17 systems in order to ensure valid tests.
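
The data-state control described above can be sketched as a snapshot-and-restore guard around each test. This is a minimal, illustrative stand-in: the dict-based store, the `StateGuard` name, and the billing record are all invented for the example; real systems would need database- or API-level equivalents.

```python
import copy

class StateGuard:
    """Snapshot each system's relevant records before a test and restore
    them afterwards, so every stateless ESB call starts from known data.
    (Illustrative sketch only; the in-memory stores stand in for real systems.)"""

    def __init__(self, stores):
        self.stores = stores          # system name -> mutable data store
        self.snapshots = {}

    def __enter__(self):
        self.snapshots = {name: copy.deepcopy(s) for name, s in self.stores.items()}
        return self

    def __exit__(self, *exc):
        # Restore every store to its pre-test snapshot, even if the test failed.
        for name, snap in self.snapshots.items():
            self.stores[name].clear()
            self.stores[name].update(snap)
        return False

# Hypothetical downstream "billing" system data.
billing = {"A-1001": {"balance": 0}}
with StateGuard({"billing": billing}):
    billing["A-1001"]["balance"] = 99   # a test mutates downstream data
print(billing)                          # state restored after the test
```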

Our process began with developing automation accelerators that could parse SOAP WSDLs, map all input fields (mandatory and optional), and then automatically build a repository of possible test cases, both positive and negative.  We then added more business-logic test cases to the automation suite by simply adding rows to the test case spreadsheets, including the data required for each request, the expected response and the validation points.  The automation accelerators for REST requests were built by parsing sample requests.
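
The core of such an accelerator can be sketched in a few lines: parse the schema inside a WSDL, map mandatory and optional fields, and emit one positive case plus one negative case per mandatory field. The schema fragment and element names below are invented for illustration, not the client's actual services.

```python
import xml.etree.ElementTree as ET

# Hypothetical fragment of a WSDL <types> section; element names are invented.
WSDL_SNIPPET = """
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <xsd:element name="CreateSubscriptionRequest">
    <xsd:complexType>
      <xsd:sequence>
        <xsd:element name="accountId" type="xsd:string" minOccurs="1"/>
        <xsd:element name="planCode"  type="xsd:string" minOccurs="1"/>
        <xsd:element name="promoCode" type="xsd:string" minOccurs="0"/>
      </xsd:sequence>
    </xsd:complexType>
  </xsd:element>
</xsd:schema>
"""

XSD = "{http://www.w3.org/2001/XMLSchema}"

def generate_cases(wsdl_xml):
    """Parse the schema, then emit one positive case plus one negative
    case per mandatory field (that field omitted from the request)."""
    root = ET.fromstring(wsdl_xml)
    fields = [
        (el.get("name"), el.get("minOccurs", "1") != "0")
        for el in root.iter(f"{XSD}element")
        if el.get("type")  # leaf fields only; skip the wrapper element
    ]
    all_names = [name for name, _ in fields]
    cases = [{"case": "positive_all_fields", "fields": all_names}]
    for name, mandatory in fields:
        if mandatory:
            cases.append({
                "case": f"negative_missing_{name}",
                "fields": [n for n in all_names if n != name],
            })
    return cases

for case in generate_cases(WSDL_SNIPPET):
    print(case["case"], case["fields"])
```

A real accelerator would also read type constraints (lengths, enumerations, patterns) to generate boundary values, but the enumeration of mandatory/optional fields is the backbone of the approach.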

Testing initially used randomly generated test data and existing test data.  Once the product went live, we started including samples of redacted production data. The production samples allowed us to provide more complete test coverage, including states of data that we cannot always replicate in QA, and also allowed us to test more edge cases.
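
Redacting a production sample before it reaches QA might look like the following sketch. The field names and masking rules are illustrative assumptions, not the client's actual policy: direct identifiers are replaced with a stable one-way hash (so relationships between records survive), and free-text fields are dropped outright rather than scrubbed.

```python
import hashlib

def redact(record):
    """Redact a sampled production record for use in QA.
    (Sketch only; field names and rules are invented for the example.)"""
    out = dict(record)
    # Replace direct identifiers with a stable one-way hash so that
    # the same source value always maps to the same redacted value.
    for field in ("email", "account_id"):
        if field in out:
            out[field] = hashlib.sha256(out[field].encode()).hexdigest()[:12]
    # Drop free-text fields entirely rather than trying to scrub them.
    out.pop("notes", None)
    return out

sample = {"account_id": "A-1001", "email": "user@example.com",
          "plan": "premium", "notes": "called support twice"}
print(redact(sample))
```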

We also set up mock services as soon as we got a WSDL or REST request definition from development. This allowed us to immediately create and test our automation scripts in parallel with the development effort.  As soon as the developers deployed a new service, we could run our automation and provide immediate feedback. Mock services were also set up for the downstream systems, which gave us flexibility in testing positive and negative scenarios without extensive data setup across the various systems.
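
A downstream-system mock of this kind can be as simple as an HTTP stub that returns canned responses keyed on a field in the request, so negative paths can be exercised without any real data setup. The endpoint, field names, and error codes below are illustrative, not the client's actual API.

```python
import json
import threading
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Canned responses keyed on the (hypothetical) accountId in the request.
CANNED = {
    "VALID": (200, {"status": "OK"}),
    "UNKNOWN_ACCOUNT": (404, {"status": "ERROR", "code": "ACCT-404"}),
    "SYSTEM_DOWN": (503, {"status": "ERROR", "code": "SYS-503"}),
}

class MockHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = json.loads(self.rfile.read(int(self.headers["Content-Length"])))
        status, payload = CANNED.get(body.get("accountId"), CANNED["VALID"])
        data = json.dumps(payload).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):  # keep test output quiet
        pass

def call_mock(port, account_id):
    """Send a request to the mock and return (status, parsed body)."""
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/service",
        data=json.dumps({"accountId": account_id}).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status, json.loads(resp.read())
    except urllib.error.HTTPError as e:
        return e.code, json.loads(e.read())

if __name__ == "__main__":
    server = HTTPServer(("127.0.0.1", 0), MockHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    print(call_mock(port, "VALID"))
    print(call_mock(port, "SYSTEM_DOWN"))
    server.shutdown()
```

Because the error paths are driven purely by the request, a single mock can stand in for a downstream system that is unavailable, slow to provision, or hard to push into a failure state.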

As a result of our automation and testing process, a new web service (we currently test over 150 web services) can be thoroughly tested, automated and deployed to production within a two-week sprint.  Our approach also covers testing in multiple environments, enabling us to test compatible versions of the web services together and confirm that those component versions can be successfully deployed together.
Key Benefits

  • Guarantee successful SOA implementations and reduce quality issues by creating and automating tests as soon as each service is defined
  • Increase test coverage earlier in the process by testing on a component level and using mock services
  • Reduce test planning and automation time by having our test accelerators auto-generate test scripts and test data
  • Cover more data states and edge cases by using sample production data along with our generated test data