J2EE Journal: Article

SOA Testing Framework

A framework to test your services in service-oriented development

Service-oriented architecture (SOA) has become today's technology buzz, and it is rapidly becoming a mainstream approach to enterprise systems design. Beyond the buzz, organizations face several challenges as they attempt to make the paradigm shift toward SOA real. One critical challenge is: How can we assure the quality of the business services that we build? Can the services we build withstand the test of rapid organizational change? One way to address this challenge is through effective testing methodologies and tools for the services deployed within an organization's SOA fabric.

A new development methodology - service-oriented development of applications (SODA) - is emerging to accompany the paradigm shift toward SOA. Today's agile software teams need effective tools to smooth the transition. This article analyzes, designs, and demonstrates a tool that can prove invaluable to organizations implementing SODA: an automated services testing framework.

This article demonstrates such an automated SOA testing framework. Built upon J2EE, XML, and SOAP, as well as open source frameworks including Ant, JUnit, and XMLUnit, this tool can prove invaluable to agile organizations implementing SOA. Using it, organizations can unit test fine-grained, "atomic" services as well as integration test the loosely coupled, coarse-grained business services that orchestrate them. Since change is constant, development teams can automatically execute the services' original set of test cases to ensure quality immediately as peripheral changes are made. The framework has the following features:

  • Completely XML configurable
  • Advanced test input configuration (XML input, database state management)
  • Sophisticated output verification capabilities (for both success cases and errors)
  • Complete integration with Ant builds
  • Automated JUnit code generation and execution
  • Easy-to-read HTML report generated as output
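The article does not publish the framework's configuration schema, but a test-case definition along the following lines illustrates how these features combine; every element name, endpoint, error code, and file path here is hypothetical:

```xml
<testSuite name="OrderServiceTests">
  <!-- A success case: set up database state, send an input document,
       verify the response while ignoring variable nodes -->
  <testCase name="createOrder_success">
    <service endpoint="http://localhost:7001/services/OrderService" operation="createOrder"/>
    <dbSetup script="sql/reset_orders.sql"/>
    <input file="input/createOrder_valid.xml"/>
    <expectedOutput file="expected/createOrder_valid_response.xml">
      <ignore element="orderId"/>
      <ignore element="timestamp"/>
    </expectedOutput>
  </testCase>
  <!-- An error case: verify the fault against an expected error code -->
  <testCase name="createOrder_invalidCustomer">
    <input file="input/createOrder_badCustomer.xml"/>
    <expectedError code="ORD-1002"/>
  </testCase>
</testSuite>
```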

The Need for Automated Services Testing
In any software development project, testing requires significant time, effort, and discipline. The following common steps should be followed to assure proper and efficient testing:
  • Define a test plan that outlines the testing process and exit criteria
  • Derive test cases from use cases or business requirements
  • Generate test data and/or scripts for each test case
  • Outline the expected results for each test case
  • Execute the test cases
  • Verify whether the results of the test cases match the expected results
  • Generate reports to measure the software's quality against the test cases
  • Fix/resolve remaining defects in the software
  • Continue executing/verifying/fixing until all test cases succeed and defects are resolved

With the advent of extreme programming and agile development methodologies, organizations are advancing their testing efforts dramatically. Rather than employing error-prone, time-consuming manual tests at the end of development, many teams are now automating testing during - and often before - development. Code is integrated nightly and run against suites of automated tests to continuously verify quality.
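In an Ant-based build, this nightly test-and-report cycle is typically wired with the standard `junit` and `junitreport` tasks. A minimal target might look like the following sketch (the property names and directory layout are illustrative):

```xml
<target name="test" depends="compile-tests">
  <!-- Run every test case and record the results as XML -->
  <junit printsummary="on" haltonfailure="no" failureproperty="tests.failed">
    <classpath refid="test.classpath"/>
    <formatter type="xml"/>
    <batchtest todir="${reports.dir}">
      <fileset dir="${test.classes.dir}" includes="**/*Test.class"/>
    </batchtest>
  </junit>
  <!-- Aggregate the XML results into an easy-to-read HTML report -->
  <junitreport todir="${reports.dir}">
    <fileset dir="${reports.dir}" includes="TEST-*.xml"/>
    <report format="frames" todir="${reports.dir}/html"/>
  </junitreport>
</target>
```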

Testing services in a SODA environment is not very different from testing ordinary software components; the main difference is that services are generally reusable implementations of business processes that orchestrate the functionality within lower-level components and other finer-grained services.

Services are highly amenable to testing: they have well-defined interfaces, and hence well-defined inputs and outputs. Their behavior is therefore predictable enough to apply commonly used black-box testing methods, which in turn lend themselves well to automation.
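Because the interface is standards-based, a black-box test harness only needs to post a SOAP envelope over HTTP and examine the XML that comes back. The following sketch shows the idea with plain JDK classes rather than any particular client stack; the endpoint and payload in a real test would come from the test-case configuration:

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

/** Minimal sketch of a standards-based (SOAP 1.1 over HTTP) service invocation. */
public class SoapInvoker {

    /** Wraps an XML payload in a SOAP 1.1 envelope. */
    public static String buildEnvelope(String bodyXml) {
        return "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
             + "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             + "<soap:Body>" + bodyXml + "</soap:Body>"
             + "</soap:Envelope>";
    }

    /** Posts the envelope to the service endpoint and returns the raw XML response. */
    public static String invoke(String endpoint, String soapAction, String bodyXml) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", soapAction);
        conn.setDoOutput(true);
        try (OutputStream out = conn.getOutputStream()) {
            out.write(buildEnvelope(bodyXml).getBytes(StandardCharsets.UTF_8));
        }
        try (java.io.InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }
}
```

The returned XML can then be fed straight into whatever output-verification step the test case defines.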

Automated testing of services will become crucial to organizations undertaking SOA initiatives. Enterprise environments are constantly changing. For example, an organization may use JD Edwards as its ERP for managing orders; then a merger happens, and suddenly the underlying ERP is standardized to the parent company's Oracle ERP. In an SOA world, the underlying implementations of the order management services will have to be swapped out, but the service interfaces should remain the same. (More commonly, OS patches are installed, service packs are applied to application servers, small fixes and enhancements are developed, and so on.)

Because change is constant, development teams should be able to automatically execute the services' original set of test cases to ensure quality immediately as changes are made. This will save untold time and effort, and will remove the error-prone elements of the process.

Typically, individual service use cases are created from the broad business requirements; these lead to service design and implementation. In a test-driven development (TDD) approach, test cases are created from the use cases or the service design well before the services are constructed. Test cases should be generated to exercise decision points, validation rules, varying system states, business rules, and other expected conditions. Figure 2 shows the transformation from service design to test cases that accounts for all of these parameters. Hundreds of test cases could typically be created to test the system thoroughly, and a change to any of these parameters means running all of the tests again. This clearly warrants automated testing.
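Running hundreds of such test cases repeatably comes down to a simple table-driven loop: each case pairs an input document with its expected output, and a runner applies the service to every input and reports mismatches. The sketch below illustrates the shape of that loop; the test-case names, documents, and error code are hypothetical:

```java
import java.util.List;
import java.util.function.Function;

/** Sketch: test cases derived from use cases, executed automatically in one pass. */
public class TestCaseRunner {

    /** A single test case: an input document and the output we expect for it. */
    public record TestCase(String name, String input, String expected) {}

    /** Applies the service (here abstracted as a function from request XML to
        response XML) to every case and returns the number of failures. */
    public static int run(List<TestCase> cases, Function<String, String> service) {
        int failures = 0;
        for (TestCase tc : cases) {
            String actual = service.apply(tc.input());
            if (!tc.expected().equals(actual)) {
                System.out.println("FAIL " + tc.name()
                        + ": expected " + tc.expected() + " but got " + actual);
                failures++;
            }
        }
        return failures;
    }
}
```

In the real framework this loop is realized by generated JUnit test methods rather than a hand-written runner, but the execute/verify/report cycle is the same.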

Requirements for a Testing Framework
A framework that can effectively meet the needs of software teams developing and testing services in a SODA environment should provide the following capabilities, allowing you to:

  • Test services at a unit and integration test level (a unit test of a coarse-grained "composite" service naturally serves as an integration test for fine-grained services that are orchestrated by the composite service)
  • Invoke target services using a standards-based approach (SOAP)
  • Be very flexible and configurable to perform only the necessary steps in testing
  • Allow test cases to be configured - rather than coded - through a user-friendly XML configuration file
  • Leverage proven open source build/test technologies and frameworks (e.g., Ant, JUnit, XMLUnit)
  • Auto-generate test case classes/code based upon XML configuration
  • Provide options to adjust the underlying state of the service ecosystem as an alternate and corollary "input" to a test case (e.g., to test for different service behavior based on variances of data in the database)
  • Verify XML output of a service against the expected XML output defined for a test case
  • Ignore specific XML nodes/elements during output verification (for variable values like sequence numbers or dates/times)
  • Handle error conditions gracefully and return the error message to the caller in a standards-based way
  • Provide the ability to set the state automatically for every test case run
  • Verify error conditions against expected error code(s)
  • Execute all necessary testing actions from a single command line
  • Support running a batch of tests or one specific test at a time
  • Generate well-formatted, human-readable reports about the test results
  • Have the ability to integrate with automated build and continuous integration frameworks like CruiseControl
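The framework itself builds its output verification on XMLUnit. As a stand-alone illustration of the "ignore specific nodes" requirement, the following simplified sketch compares two documents with plain JAXP DOM, treating any element whose name appears in an ignore set as a wildcard (the element names in the usage below are hypothetical):

```java
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.Set;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import org.xml.sax.InputSource;

/** Sketch: verify actual service output against the expected XML, ignoring
    designated variable elements (e.g., sequence numbers, timestamps). */
public class XmlVerifier {

    public static boolean matches(String expectedXml, String actualXml, Set<String> ignoredElements) {
        try {
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            Document expected = factory.newDocumentBuilder()
                    .parse(new InputSource(new StringReader(expectedXml)));
            Document actual = factory.newDocumentBuilder()
                    .parse(new InputSource(new StringReader(actualXml)));
            return sameTree(expected.getDocumentElement(), actual.getDocumentElement(), ignoredElements);
        } catch (Exception e) {
            throw new RuntimeException("Could not parse XML", e);
        }
    }

    private static boolean sameTree(Element expected, Element actual, Set<String> ignored) {
        if (!expected.getTagName().equals(actual.getTagName())) return false;
        // Variable node: the name must match, but the content may differ
        if (ignored.contains(expected.getTagName())) return true;
        List<Element> expectedKids = childElements(expected);
        List<Element> actualKids = childElements(actual);
        if (expectedKids.isEmpty() && actualKids.isEmpty()) {
            return expected.getTextContent().trim().equals(actual.getTextContent().trim());
        }
        if (expectedKids.size() != actualKids.size()) return false;
        for (int i = 0; i < expectedKids.size(); i++) {
            if (!sameTree(expectedKids.get(i), actualKids.get(i), ignored)) return false;
        }
        return true;
    }

    private static List<Element> childElements(Element parent) {
        List<Element> result = new ArrayList<>();
        NodeList kids = parent.getChildNodes();
        for (int i = 0; i < kids.getLength(); i++) {
            if (kids.item(i) instanceof Element child) result.add(child);
        }
        return result;
    }
}
```

XMLUnit provides the same capability more robustly (attribute comparison, namespace handling, detailed difference reporting), which is why the framework leans on it rather than hand-rolled comparison.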

More Stories By Anbarasu Krishnaswamy

Anbarasu Krishnaswamy has over 15 years of IT industry experience, nine of which were with BEA. In his current role as the Enterprise Architect Lead, he leads the enterprise architecture and SOA practices for the central region professional services at BEA. As a SOA practitioner, he has helped several customers with SOA transformation and implementation. His experience also includes design and development of Java/J2EE applications, client/server computing, Web development, and enterprise application integration (EAI). Anbarasu holds an MBA from NIU and an MS in computer science and engineering.

More Stories By Ravi Nagubadi

Ravi Nagubadi is a principal consultant at Sofbang LLC. He brings eight years of rich technology experience with a strong focus on solutions to critical business issues. He has consulted in a diverse set of industries such as healthcare, government, education, insurance, and financial services. His expertise includes portals, commerce, content management, business integration, and enterprise architecture.

More Stories By Rajeev Mahajan

Rajeev Mahajan is a practice manager with BEA and brings 15 years of deep IS technology experience to his skill of linking business drivers to technology. He has broad industry experience in healthcare, high tech, financial services, manufacturing, and retail. He also has implementation experience with portals, content management, CRM, e-Business, and enterprise application integration.
