By Venkata Ramana Lanka
Cloud as a technology is no longer new. Many organizations have already adopted cloud-based services or are in the process of migrating existing services to the cloud. One key consideration that development teams must carefully weigh during this process is how they will test the success of a cloud migration.
When selecting a cloud testing strategy, teams must first decide which software development model they want to use. The model chosen can significantly influence an organization's entire approach to cloud testing.
The three most popular software development models are: using the cloud only to host the production application; developing outside of the cloud environment while testing and deploying on the cloud; and performing development, testing, and deployment entirely on the cloud.
Benefits and Limitations
Here, we discuss the benefits, limitations, and strategies for each software development model for testing cloud migration.
The first model we'll discuss uses the cloud only to host the production application. In this model, the amount of testing is restricted, since any test will directly impact production. Organizations following this model tend to run all of their tests outside the production environment before deploying to the cloud for customer access.
From a testing perspective, this model does not provide any significant benefits.
In terms of limitations, testing on production is mostly restricted to key regression tests that identify a broken user interface. Tests requiring interaction with the system's database, including login validation, data entry, and data modification, are not allowed, as any changes can impact production data. Additionally, key features of the cloud (on-demand self-service, broad network access, elasticity, and measured service) are difficult to test without risking disruption to production.
Though risky and not advisable, in certain situations test logins can be created in production and shared with a restricted audience to run end-to-end (E2E) functional tests. However, performance testing should be avoided to prevent production failures. Teams should explore deploying the application on a cloud instance similar to the production environment to get a sense of how it will behave. Production deployment can happen once the test team has run its tests on this environment, including integration, performance/stress, and scalability testing.
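In this first model, the production-safe portion of the test suite boils down to read-only checks. Below is a minimal sketch of such a check runner in Python; the paths and expected page text in `READ_ONLY_CHECKS`, and the `http_fetch` helper, are illustrative assumptions, not part of any real application.

```python
from urllib.request import urlopen

# Illustrative read-only checks: (path, text expected in the response body).
# These paths and strings are hypothetical placeholders.
READ_ONLY_CHECKS = [
    ("/", "Welcome"),
    ("/status", "OK"),
]

def run_smoke_checks(fetch, checks=READ_ONLY_CHECKS):
    """Run read-only checks; `fetch(path)` returns a page body as a string.
    Returns the list of paths whose expected text was missing."""
    failures = []
    for path, expected in checks:
        if expected not in fetch(path):
            failures.append(path)
    return failures

def http_fetch(base_url):
    """Build a fetch callable that issues GET requests only,
    so no production data is created or modified."""
    def fetch(path):
        with urlopen(base_url + path) as resp:
            return resp.read().decode("utf-8", errors="replace")
    return fetch
```

Because `fetch` is injected, the same checks can run against production via `http_fetch(...)` or against a stub in a unit test, keeping the production run strictly read-only.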
In the second model, development happens outside of the cloud environment, while testing and deployment happen on the cloud. Code is built in house and deployed on a test server hosted on the cloud, and fully tested code moves from the test environment to the production environment. This is the most commonly used development model.
Since deployment and testing happen on the cloud, test teams are able to test all functional features of the application. Additionally, cloud-specific features such as scalability, elasticity, measured (pay-as-you-use) services, and security can also be adequately tested.
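Elasticity and measured-service behavior only show up under concurrent load, which a cloud-hosted test environment makes safe to generate. A minimal sketch of a concurrency-based latency probe in Python follows; the injected `request` callable, request counts, and percentile choices are illustrative assumptions rather than a prescribed tool.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def measure_latencies(request, n_requests=50, concurrency=10):
    """Fire `n_requests` calls to the zero-argument `request` callable
    using `concurrency` worker threads; return per-call latencies in seconds."""
    def timed_call(_):
        start = time.perf_counter()
        request()  # e.g., an HTTP GET against the cloud test environment
        return time.perf_counter() - start
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        return list(pool.map(timed_call, range(n_requests)))

def percentile(latencies, pct):
    """Simple nearest-rank percentile over the measured latencies."""
    ordered = sorted(latencies)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]
```

Comparing, say, the 95th-percentile latency at increasing concurrency levels gives a rough view of whether the environment scales elastically before formal performance testing begins.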
One of the biggest limitations of this model is that defects may be detected late in the lifecycle: defects specific to cloud infrastructure, such as those related to elasticity and measured services, are difficult or impossible to reproduce in a non-cloud development environment.
To overcome these limitations, avoid deploying code on the test server directly from the development environment. Instead, the versioned code should first be deployed on a cloud instance that is identical to the test environment but accessible to the development team. Code should be moved to the test environment only after the deployment team runs a sanity test and ensures smoke tests are successful. This makes defects easier to reproduce.
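The promotion step described above can be sketched as a simple gate that refuses to move code forward unless every sanity and smoke check passes. The check names and callables below are hypothetical placeholders for whatever real deployment checks a team uses.

```python
def promotion_gate(checks):
    """Run named sanity/smoke checks; return (promote, report).
    `checks` maps a check name to a zero-argument callable that returns
    truthy on success. Any exception counts as a failure, so a crashing
    check can never promote a build."""
    report = {}
    for name, check in checks.items():
        try:
            report[name] = bool(check())
        except Exception:
            report[name] = False
    return all(report.values()), report
```

A deployment script would call this with its own checks (for example, a health-endpoint probe and a build-version comparison) and promote to the test environment only when the first return value is true.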
The final model involves development, testing, and deployment all performed on the cloud. While this model is not yet widely embraced by software organizations, it has been successful where adopted. Concerns such as data security, privacy, and complete dependence on the cloud provider hinder its adoption.
From a testing perspective, this is the best model, as teams do not need to validate across different environments. Full-fledged continuous integration runs automated tests, so performance testing can be conducted seamlessly. What's more, development and testing teams work on identical environments, which eliminates potential issues with environment mismatch. From a testing standpoint, this model has no significant limitations.
Since this is a fundamental shift from current application development models, training for all team members helps bridge skill gaps and allows for greater success.
When it comes to cloud testing, remember that the cloud does not change the traditional testing process adopted by different organizations. The fundamental test process, consisting of test planning and control; test analysis and design; test implementation and execution; evaluating exit criteria and reporting; and test closure activities, is equally applicable in the cloud context. The business process testing approach will not differ; however, non-functional aspects of testing, such as application performance testing, security testing, interface testing, compatibility testing, and test environment management (environment provisioning, availability, and release), are impacted when switching to the cloud.
Most leading cloud service providers offer the necessary tools for running functional automation, performance, security, and other types of testing. These tools are offered on a subscription-based model and can be purchased on an as-needed basis. In addition to the tools offered by cloud providers, testing teams should also consider using other commercial and open source tools that are cloud enabled.
Cloud testing does not require an entirely different approach to testing; rather, it requires a basic understanding of the technology in order to design and run tests that differ from typical web-based application testing. Focused training on the fundamentals of cloud and its deployment models is enough to start equipping your test teams with the skills to effectively test cloud-based applications. SW
Venkata Ramana Lanka (LRV), director of QA at Virtusa, is an accomplished software test management professional with an extensive background in software testing, test automation framework design, building domain-specific testing solution accelerators, leading large software testing teams, and supporting presales initiatives. LRV is a hands-on manager with a proven ability to direct and improve quality initiatives, reduce defects, and improve overall efficiency and productivity. He has in-depth, proven expertise in developing and implementing test strategies and operational procedures, and extensive experience working with multiple commercial and open source test tools. As a QA thought leader, LRV has written multiple white papers and articles, some of which have been published in various media. He has also spoken and presented papers at various software testing conferences.
December 2015, Software Magazine