Wednesday, December 19, 2007

Business Case for Testing Tools & Automation

This post had been sitting in my drafts for a long time. For some reason, I decided not to publish it. But it's still relevant... except that I have since submitted the business case, and it's now sitting on somebody's desk (or in an email inbox) in Finance, awaiting review and approval.


I'm currently working on a business case to invest in testing tools and functional test automation. The case will be sent to upper management and Finance for approval, and once they sign off, we'll start building an automation framework. I foresee a lot of hurdles and am not keeping my hopes high, given the tightening budget and how the business case is shaping up.

One of the things I have to include in the business case is a financial analysis, which includes an estimate of the return on investment. It has been a very "enlightening" process. Based on the current estimates of capital and labor costs, it will take us 2.9 years to recoup the investment. And of course, that figure rests on a lot of assumptions, and I'm somewhat skeptical about claiming that number as the ROI.
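For the curious, the payback arithmetic itself is trivial; it's the inputs that are shaky. A minimal sketch with made-up numbers (the real capital, labor, and savings figures come from our internal estimates and aren't shown here):

```python
# Payback-period sketch. All three inputs below are hypothetical
# illustrations, not the actual figures from the business case.
tool_licenses = 100_000   # capital: tool licenses and hardware (hypothetical)
framework_labor = 45_000  # labor: building the automation framework (hypothetical)
annual_savings = 50_000   # manual-testing effort saved per year (hypothetical)

investment = tool_licenses + framework_labor
payback_years = investment / annual_savings
print(f"Payback: {payback_years:.1f} years")  # 145,000 / 50,000 = 2.9
```

Every one of those inputs is an estimate, and the payback period moves linearly with each of them, which is why I'm hedging on the 2.9-year number.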

You see, one thing that is never in short supply at my workplace is the list of outstanding tasks. With that comes extreme work pressure and very little time to explore new technologies or come up with new ideas. I'm quite sure that designing the automation framework will be a time-consuming learning process, and I will have to devote at least half my time (if not more) to the effort. I'm also quite sure that once we actually start, the labor estimates I put in the business case will start to look like a best-case scenario.

The brighter side is that I now have the experience of creating a business case under my belt. And once (if?) it's approved, we'll undertake the effort of functionally automating the test cases for our applications. We're also planning to expand our QA service by offering the testing tools and/or automation to other teams.

My colleague and I have been talking about the design of the framework since we first conceptualized the idea of automation. We had a whole picture in our minds of how to make it reusable and maximize customizability with the least rework. He has experience building these kinds of frameworks, and I had a brief encounter with one at my last job. I recently looked online and found that this approach is now an established practice, one of the buzzwords surrounding test automation: keyword-driven test automation frameworks. It must have been a long time since either of us was in the automation business, because what I read matched exactly what we had been planning to do. A lot of people have written about their experiences building automation frameworks around this idea, so it seems we'll be able to put some of that knowledge to good use.
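To make the idea concrete, here's a minimal sketch of what a keyword-driven framework boils down to (the keywords and steps are hypothetical examples, not our actual design): test cases are data rows of (keyword, arguments), and the framework dispatches each keyword to a reusable implementation function.

```python
# Minimal keyword-driven framework sketch. All keyword names and test
# steps here are hypothetical, for illustration only.
log = []  # record executed actions (a real framework would drive a UI tool)

def open_page(url):
    log.append(f"open {url}")

def enter_text(field, value):
    log.append(f"type '{value}' into {field}")

def click(element):
    log.append(f"click {element}")

# The keyword vocabulary: the only place implementation code is referenced.
KEYWORDS = {"open_page": open_page, "enter_text": enter_text, "click": click}

def run_test(steps):
    """Execute a test case expressed as (keyword, *args) data rows."""
    for keyword, *args in steps:
        KEYWORDS[keyword](*args)

# A test case is pure data, so it could live in a spreadsheet or CSV
# and be written by testers who don't program.
login_test = [
    ("open_page", "https://example.com/login"),
    ("enter_text", "username", "qa_user"),
    ("click", "submit"),
]
run_test(login_test)
```

The appeal is that new test cases require no new code as long as the keyword vocabulary covers them, which is exactly the reuse-with-least-rework property we had in mind.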

Coming up...

A few weeks ago, I spent some time going over JMeter and building some advanced web test plans. But then I got distracted and buried under some intense projects, including a one-week training on a product that I didn't know much about but have to be an advanced user of within a few weeks. (BTW... if you want to know why attending a one-week advanced training on a product without a solid background is a bad idea, let me know.)

Now I have some time again, but it's right before my vacation and I just don't feel like getting immersed in it again. When I come back, though, I plan to post some of my notes on building an advanced web test plan in JMeter. I plan to use a scenario more advanced than basic GET requests, probably something similar to my previous post about using an MD5 library to create a LoadRunner script. If I can't find anything else, I'll use that very example. I also plan to explore some web services and pure-XML-over-HTTP kinds of scripts. Stay tuned...

Oh... and I should also mention my reasons for looking into JMeter. It has been increasingly frustrating to borrow time on the LoadRunner Controller from an external team, partly because I have to justify the additional chargeback to the project team. I can't believe how many emails I have to write to justify a few hundred dollars. But anyway, turning misery into a learning experience, I wanted to explore whether JMeter provides enough capabilities, and is robust enough, to be used for production performance testing.