Is validation of GxP applications compatible with continuous delivery (CD) of enhancements and patches? Can users reap the benefits of lower time to market? Yes, but only if we apply the techniques of CD to break the acceptance testing bottleneck.
“I am sorry, but productivity enhancements don’t justify the cost of a new release.”
“Sure, our two-year-old version is out of support. It’s cheaper to pay the vendor for custom support than it is to validate a new version, and to re-validate all the downstream applications.”
“The security risks of the old version are manageable for a few months. The business is tied up with other priorities at the moment, and we can’t spare anyone for acceptance testing the new version.”
Sound familiar? We who build, install, and test applications regulated by Good Manufacturing/Laboratory/Clinical Practice (GxP) guidelines contend with higher costs of deployment than our friends in other industries. Before we deploy a new release, we must perform acceptance testing to verify that it consistently meets our documented requirements. We must often repeat this testing for configurations at multiple sites. We must plan, execute, document, and show evidence of this testing to comply with our documented procedures and with applicable regulations. This acceptance testing is ultimately the responsibility of the customer, not the vendor.
User acceptance testing, by definition, requires the participation of, well, users. For GxP applications, it pulls manufacturing specialists, statisticians, study coordinators, researchers, or other high-value resources away from their day jobs for weeks or months to write and execute manual acceptance tests, and to document the results. The business pays huge opportunity costs in terms of lost productivity in their normal revenue-generating activities. The benefits of the new release had better be worth the cost.
Even with slow traditional release cycles that introduced software updates every 3, 6, or 12 months, skipping or delaying deployment because of these costs meant deferring any business value those updates might bring. Figure 1 charts the hypothetical business value of a system (the y axis) over time (the x axis). The potential business value of the system grows, more or less, with each subsequent published release (in red). However, customers only realize that business value to the extent they deploy the published releases to production (in blue). The area between these two lines represents the unrealized business value the customer leaves on the table – presumably because, in their calculation, the cost of verifying and deploying each release outweighs its expected value to the business.
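The gap in Figure 1 is easy to make concrete. A minimal sketch, with invented numbers purely for illustration: model the red line as cumulative potential value per release period, the blue line as the value actually deployed, and sum the vertical gaps to approximate the area between the curves.

```python
# Hypothetical illustration of Figure 1. All numbers are invented.

# Cumulative potential value after each of six release periods (red line).
published = [10, 25, 40, 60, 85, 110]

# Value realized in production: this customer skipped several releases
# and only deployed in periods 1 and 5 (blue line).
deployed = [10, 10, 10, 10, 85, 85]

# The unrealized value in each period is the vertical gap between the
# lines; summing the gaps approximates the area between the curves.
unrealized = sum(p - d for p, d in zip(published, deployed))
print(unrealized)  # → 120 units of value left on the table
```

Note that the gap only closes in the period where a deployment actually happens; every skipped release adds to the running total.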
And yet the state of the art of software engineering trends in the opposite direction, toward continuous delivery. Continuous delivery (CD) aims to release to production incremental improvements to complex software in short cycles, without sacrificing quality. With CD, every change to the code base triggers a pipeline of automated processes that build and package the software, deploy it to a production-like environment, and test it. Every change that successfully clears this gauntlet becomes a release candidate. Further automated deployment processes release the candidate to a production environment, itself controlled by automated configuration management processes.
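The pipeline described above is, at its core, just a sequence of gated stages where any failure stops the change from advancing. A minimal sketch in shell – the stage names and commands below are placeholders for whatever build, packaging, and test tooling a team actually uses:

```shell
#!/bin/sh
# A hypothetical CD pipeline sketch: every change runs these stages in
# order, and only a change that clears every gate becomes a candidate.
set -e  # stop at the first failing stage

build()             { echo "build: compile and run unit tests"; }
package()           { echo "package: produce a versioned, immutable artifact"; }
deploy_to_staging() { echo "deploy: install artifact in a production-like environment"; }
run_tests()         { echo "test: automated integration and acceptance checks"; }

build && package && deploy_to_staging && run_tests \
  && echo "release candidate ready"
```

In a real pipeline these stages run automatically on every commit, so a failed stage gives feedback in minutes rather than at the end of a release cycle.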
In their seminal book Continuous Delivery, Jez Humble and David Farley write:
Software delivers no value until it is in the hands of its users. This is obvious, but in most organizations the release of software into production is a manually-intensive, error-prone, and risky process. While a cycle time measured in months is common, many companies do much worse than this: Release cycles of more than a year are not unknown.
CD, by contrast, reduces time to market for each incremental enhancement or bug fix. Practitioners embracing CD are deploying tens (Flickr), hundreds, or even thousands (Amazon, Netflix) of releases per day. But releases deliver zero value while deployment constraints keep them out of the production environment. Revisiting our graph in Figure 2, below, CD smooths the top line by shrinking release cycles and delivering ever-increasing potential business value. If the bottom line stays where it was, held back by the same deployment constraints, the unrealized business value left on the table (both shades of red) grows. From a cost-benefit standpoint, skipping releases becomes harder and harder to justify.
Can businesses deploying GxP applications reap the benefits of CD? Obviously, much depends on whether software vendors embrace CD. If they are releasing high-quality software in short release cycles, and making those releases available to customers, then those customers must ensure that their deployment processes can keep pace. This requires vendors to deliver software amenable to automated deployment and automated installation qualification – that is, to build an automated deployment pipeline. A colleague of mine at KSM was engaged to support the integration of GxP software that required – by the vendor’s own admission – 21 days of manual procedures to install, assuming that everything went as planned (it didn’t). That vendor has since moved their solution to the cloud. Virtualization, micro-containers, and/or qualified cloud infrastructure can be of enormous value in automating repeatable deployment and qualification, but those are topics for another day.
What of acceptance testing? As Figure 1 illustrates, even without CD, we leave a lot of unrealized business value on the table when deployment constraints like acceptance testing are allowed to block the pipeline. With CD, it gets worse; the faster CD delivers release candidates into your deployment pipeline, the faster they will pile up behind the bottleneck of acceptance testing. The solution, as with all things CD, is to leverage automation to relieve the acceptance testing bottleneck.
And yet automating user acceptance testing entails unique challenges. Automated unit, integration, and system tests can be written and executed by developers translating functional requirements into code. The users writing acceptance tests – those same researchers, study coordinators, or specialists pulled from their day jobs at great cost – are most likely not programmers. They can work with developers to code their tests, but that increases cost and reduces efficiency. And if they cannot read the code the developers write, they cannot sign off on the final product.
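Toffee's own command syntax is not shown here, but plain-language test formats such as Gherkin illustrate the general style of acceptance test that a non-programmer can read, review, and sign off on. The feature, study identifiers, and values below are invented for illustration:

```gherkin
Feature: Record a stability sample result
  Scenario: A study coordinator enters an assay result within specification
    Given I am logged in as a study coordinator
    And study "STB-2024-001" has an open sampling point
    When I enter an assay result of "99.2%" for sample "S-014"
    Then the result is saved with my electronic signature
    And the audit trail records the entry with a timestamp
```

The point is that every step reads as a sentence: the domain expert who wrote the requirement can verify that the automated test actually checks it.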
We created Toffee to allow users without coding experience to automate acceptance tests using simple, intuitive commands. For those steps that cannot easily be automated, manual commands – injected right into the middle of an otherwise-automated test – let the tester take over as much or as little as needed. Testers enjoy the benefits of automation without relinquishing control, or the assurance of correct operation that control brings. Toffee stores tests, results, and screenshots online, and maintains a full audit trail of all changes, greatly reducing the documentation burden associated with acceptance testing. By replacing manual acceptance tests with automation, Toffee mitigates one of the largest deployment constraints for GxP applications, and thus helps to enable CD.
Create your free Toffee Composer account today, and find out how easy it is to automate user acceptance tests, and capture the benefits of continuous delivery for your GxP web applications.