Archive for the ‘Test Automation’ Category

Agile Defect Prevention | Part 2

June 24, 2012

Continuation of https://davidjellison.wordpress.com/2011/09/23/agile-defect-prevention/

So you have your escaping defects under control and your team is looking to optimize further to become an elite agile team…what more can you do towards defect prevention? Just as WIP Defects (‘Work In Process’ Defects) are an antidote to Defects, TDD (Test Driven Development) is an antidote to WIP Defects. WIP Defects, although a great way to keep faults out of customers’ hands, are still an anti-pattern to elite agile teams. These teams are test-infected, write more test code than application code, and catch most problems while authoring the application code.

Fine Craftsmanship

In an elite agile team, everyone writes and executes tests. Product Owners carefully craft use cases with well-thought-through acceptance criteria, and regularly validate both that the working application behaves as intended and that it is a delightful experience. Software Engineers write unit tests to assure solid code craftsmanship, run performance tests in the sandbox to assure efficient basic application behavior, and manually inspect their user experience like a fine cabinetmaker inspecting the glide action of the drawers in a cabinet he is building. Quality Engineers work closely with the Product Owner to understand the intent of the features, and with the Software Engineer to understand the design and share test approaches, authoring and running tests along the way. Quality Engineers peer-inspect tests with Software Engineers so that both are intimately familiar with the tests. Both Software and Quality Engineers regularly run unit and regression tests, and validate performance and user experience in each context the application runs in.

Consider writing failing test cases instead of WIP Defects as a start down this path. I have found that Quality Engineers who are not used to developing tests early and conducting peer inspections of tests with Software Engineers are initially uncomfortable with the idea of not writing defects. You do need to document the defect somehow; the best documentation is a regression test, not a defect report. This fosters correcting the problem promptly as a failing test instead of a WIP Defect hand-off or scheduling an escaping Defect fix. Get into the habit of running regression tests for the feature being worked on often throughout the day in the sandbox environment, with visibility to the whole team, as part of the CI (Continuous Integration) cycle.
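
To make this concrete, here is a minimal sketch in Ruby’s Test::Unit (the framework used in the Rails posts later in this archive) of documenting a fault as a failing regression test rather than a WIP Defect. The Order class and its bug are hypothetical:

    require 'test/unit'

    # Hypothetical production class containing the fault under discussion:
    # total ignores a zero quantity and still charges for one unit.
    class Order
      def initialize(unit_price, quantity)
        @unit_price, @quantity = unit_price, quantity
      end

      def total
        @unit_price * [@quantity, 1].max # the defect: quantity floored at 1
      end
    end

    # Instead of a WIP Defect report, the fault is documented as a failing
    # regression test that turns green once the code is corrected.
    class PricingRegressionTest < Test::Unit::TestCase
      def test_zero_quantity_order_has_zero_total
        assert_equal 0.0, Order.new(10.0, 0).total
      end
    end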

Measure cycle time and efficiency in defect prevention rather than defect counts. You may still need to track defect counts, fix/find rates, etc. (especially in a larger organization); however, in an elite agile team these defect counts are so small that everyone on the team is aware of them. Defect prevention and correction are part of the cycle time to deliver the changes. Take the time to be clear on acceptance criteria, compatibility with standards and architecture, and stability of the code changes. Collaborate within the team such that progress on code design and development, and on the tests that validate them, is well known. Elite agile teams write and execute, early and often, the tests that assure working software remains working.

Even in elite agile teams there are Defects that are found late in the iteration, or that are a larger problem than can be fixed at the time of discovery, before declaring work done. This warrants creating an escaping Defect. The elite agile team carefully scrutinizes each escaping Defect for the business’s defect tolerance and the impact to the customer before opting to let the escaping defect into the field. The intent of the agile life cycle is to add business value often, in iterations small enough to foster continuous feedback, so it might be more important to deliver the change with the defect than to delay delivery. An escaping Defect may survive past an iteration and still be held back from release into the field until fixed in a later iteration.

Elite agile teams continually test, with constant feedback of pass/fail results. No failure is left unknown or unattended; the problem is understood as it is introduced. Problems are corrected promptly and not let into the field for exposure to customers. The whole team is aware of test status through continuous integration practices. There are very few, if any, known defects in the field.

Quality Engineering Intern/CoOp Position

October 3, 2011

As a Computer Science major (undergraduate or graduate degree program), why should I choose Quality Engineering for my internship or co-operative work experience? I am studying computer science to become a software developer and expect to have a developer position upon graduation. This post is a pitch to college students who think this way: consider at least one Quality Engineering internship/co-op job so that you become a more well-rounded developer. You should find the job career enriching, and it offers you another career choice.

Quality Engineering (QE) is the practice of defect prevention (finding defects as the code is developed) as opposed to quality assurance (verification and validation after the code is developed), striving towards zero defects in the field. Quality Engineering means early acceptance test planning in agile product delivery teams, collaborating with developers on both exploratory and automated testing, and centers around continuous integration. Quality Engineering embraces traditional testing techniques of defect discovery and test coverage in parallel with developing features.

Quality Engineering has come a long way over the years. Testing was once considered a downstream activity in a separate test group that pushed defects into a large backlog for developers to fix in subsequent phases. These groups were usually made up of minimally technical testers with experience and domain knowledge of the customer’s needs. Test automation in the past was a small group of highly skilled test developers who automated tests written by others, requiring stable code to make automation worthwhile. In contrast, the modern software tester has a technical degree (usually computer science), is embedded in a software development project team, is highly collaborative with the developer counterparts in the team, is skilled in traditional test approaches, is skilled in exploratory and automated testing (creating regression tests along the way), and is the voice of the customer throughout the feature development life cycle. Today both the developer and the tester are engineers working together to deliver working software in small, testable iterations.

Quality Engineering is a role in Agile product delivery organizations in which engineers:

  • develop and maintain test automation infrastructure and processes
  • develop automated tests (functional, integration, and load/performance) at both the GUI (graphical user interface) and API (application programming interface) levels
  • contribute to continuous integration practices
  • exercise continuous deployment and both regression and exhaustive testing
  • apply customer domain expertise for comprehensive test coverage, and exploratory and fuzz testing
  • validate, against acceptance criteria, that developed and re-factored code meets customer and stakeholder expectations
  • develop and work with test fixtures (software containers to test components in isolation) and mocking (simulated responses from third-party and internal integration interfaces); a minimal mock is sketched below
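
For illustration, here is a minimal hand-rolled mock in Ruby (a sketch with hypothetical class names): the fake gateway simulates a third-party integration interface so the component under test runs in isolation.

    require 'test/unit'

    # Hypothetical component under test: reports the gateway's answer.
    class Checkout
      def initialize(gateway)
        @gateway = gateway
      end

      def submit(amount)
        @gateway.charge(amount) == :approved ? 'Thank you' : 'Card declined'
      end
    end

    # Hand-rolled mock: returns a canned response instead of calling the
    # real third-party payment service.
    class FakeGateway
      def initialize(response)
        @response = response
      end

      def charge(_amount)
        @response
      end
    end

    class CheckoutTest < Test::Unit::TestCase
      def test_declined_charge_is_reported_to_the_user
        checkout = Checkout.new(FakeGateway.new(:declined))
        assert_equal 'Card declined', checkout.submit(25.00)
      end
    end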

The Quality Engineering role has become increasingly developer oriented, participating in:

  • code design and test planning reviews
  • early tests of newly developed code in developer sandboxes
  • developing, reviewing, and using deployment automation scripting (including up-time deployment)
  • developing, reviewing, and using monitoring and up-time reporting tools
  • assessing load distribution for high availability, redundancy, and fault tolerance in hosted systems
  • testing integration with third-party services, and their mocks

Quality Engineering is an integral part of scaling development. Agile development includes designing for re-use, routine re-factoring, unit tests (including mocking and stubbing for negative tests), and building integrated code frequently. Work is broken down into the smallest testable work items, such that development and test are a continuous practice. A combination of unit tests (authored by the Developer) and regression tests (authored by the Quality Engineer) provides a library of automated tests that run often and identify problems that break running software. This library of automated regression tests allows the planned iteration time (to visually test and develop additional automated tests) to remain constant and predictable.
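
As a small illustration of the unit-test half of that library, including a negative test for exception handling (the DateParser helper is hypothetical):

    require 'test/unit'
    require 'date'

    # Hypothetical helper a Developer would cover with unit tests.
    module DateParser
      def self.parse(text)
        raise ArgumentError, "bad date: #{text}" unless text =~ /\A\d{4}-\d{2}-\d{2}\z/
        Date.parse(text)
      end
    end

    class DateParserTest < Test::Unit::TestCase
      def test_parses_iso_date # happy path
        assert_equal Date.new(2011, 10, 3), DateParser.parse('2011-10-03')
      end

      def test_rejects_malformed_input # negative test
        assert_raise(ArgumentError) { DateParser.parse('not-a-date') }
      end
    end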

Test Accumulation Graph

Regression Time in this graph is the time it would take to manually exercise regression tests as the code base increases over time.

Without well-written regression tests, the amount of time it takes to test grows rapidly as new working code is added (see the Test Accumulation graph). While adding features to a product with each development cycle iteration, the time to test manually increases (usually linearly). Automated regression tests absorb that time, enabling the available test time to be focused on testing changes and writing new automated regression tests. The library of automated tests can be run over and over again, frequently, to promptly catch regressions (code or configuration changes that break another part of the system). Therefore, agile product delivery teams rely on the accumulation of tests as part of delivering working software to the customer.

Having experience working as a Quality Engineer will give you a deep understanding of core test techniques, a test perspective on continuous integration (CI) and continuous delivery (CD) approaches, a sense of what acceptance criteria mean for considering development tasks done, and an appreciation for designing software applications for testability. CI and CD drive defect prevention, assuring that working software remains working. Quality Engineering is a different kind of development, and knowing this role well will help you work collaboratively with quality engineers when working as a development engineer in the future. You may also find that Quality Engineering is a career path that interests you more than Development Engineering.

Constant Contact Quality Engineering is organized into the roles of QE Architecture, a specialized test team as a center of excellence in test automation, and embedded quality engineers on each agile product delivery team. There is typically a 3:1 ratio of developers to quality engineers, who operate collaboratively with developers to deliver new and re-factored infrastructure and customer-facing features. Most of our Quality Engineers have a degree in computer science (or a related major), many have an MSCS, and we even have a PhD on the team. Consider joining our Constant Contact QE organization for one or more of your internship or co-op work sessions. The experience will contribute to a well-rounded developer education and give you a chance to learn the other side of delivering software.

http://ConstantContact.com/Careers/

Functional Continuous Integration

Driving agile practices over the last 4 years in 3 SaaS companies, it has become quite apparent to me that continuous integration (CI) requires both unit test build failure verification and regular deployed functional regression test verification to be truly agile. Yes, you need all the SDL (software development lifecycle) practices to manage building the right product and completing work items, but quality of work cannot be compromised in the name of speed. Agile is all about completing small amounts of working (deliverable) software and iterating on continuous feedback. It also includes confidence that you are delivering tested software without regression defects (not breaking what already worked) and confidence that future work will not break what was just delivered. Any tests not completed within the scope of work items are technical debt, and this postponed work results in missed defects.

Quality confidence is achieved by routinely running automated tests at both the code and system levels. Regardless of the agile practices used, design, development, and test are interwoven and require collaboration of development and test resources in the delivery team. I believe this is the secret sauce that differentiates a waterfall-ish team from an agile-ish team.

  • The waterfall-ish team has the mind-set of develop application code first and develop automation test code later, frequently not including testing in the work item scope.
  • The agile-ish team has the mind-set of developing both unit test and functional test code along with application code, either prior to application code (Test Driven Development) or just after application code, but within the scope of the work item. This includes meeting work item (e.g. user story) acceptance criteria.

Functional Continuous Integration (FCI) is continuously creating and updating automated regression tests. It must be the expectation for POs (Product Owners) when planning work commitments, for Executives when assessing progress reporting, for Developers when including collaboration with QE in their estimates, and for Quality Engineers when planning and completing test work. Infrastructure for FCI needs to include integration of automated tests with a test management and reporting database, and needs to be capable of running unattended. I’ve used Selenium RC with both CruiseControl with Rails test scripts and Hudson with Java test scripts to run build-time deploys and unattended test runs. These CI applications can run with multiple client machines as slaves, which allows CI jobs to run each test suite on a different client machine simultaneously to shorten the test duration.
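
As one sketch of what an unattended Selenium RC run can look like from Ruby (using the selenium-client gem; the host name, port, application URL, and element locators are assumptions for illustration):

    require 'rubygems'
    require 'test/unit'
    require 'selenium/client' # the selenium-client gem (Selenium RC)

    class LoginSmokeTest < Test::Unit::TestCase
      def setup
        # Points at a selenium-server.jar running on a dedicated CI
        # client machine; host, port, and URL are placeholders.
        @browser = Selenium::Client::Driver.new(
          :host => 'ci-client-01',
          :port => 4444,
          :browser => '*firefox',
          :url => 'http://localhost:3000',
          :timeout_in_second => 60)
        @browser.start_new_browser_session
      end

      def teardown
        @browser.close_current_browser_session
      end

      def test_user_can_log_in
        @browser.open '/login'
        @browser.type 'id=login', 'qa_user'
        @browser.type 'id=password', 'secret'
        @browser.click 'id=submit', :wait_for => :page
        assert @browser.text?('Welcome')
      end
    end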

The result is that test failures due to problems that break existing code, introduced with changes or new application code, are caught very early and corrected. Further, if these tests are run in the Developer’s sandbox and failures are corrected prior to check-in, no defect is ever created, which significantly reduces defect counts for the agile team.

Continuous Integration: Selenium RC vs. xUnit tests

December 26, 2008

Since April 2008 I’ve been a Consultant/QE Architect at Sermo in Cambridge, MA, USA on Ruby on Rails agile teams. The first team started as an experiment to prove that rapid development of Rails applications, composited with the JBoss-based Java core community, could work seamlessly. We have continued to successfully add several more Rails applications with this approach. We are now undergoing a major rewrite of the core community and its applications entirely in Ruby on Rails. This new design includes formal SOA interfaces. We are continuously refining our scrum lifecycle as well as our test automation approaches.

We have been focusing on continuous integration with CruiseControl and comprehensive regression testing, including Test::Unit and Selenium tests, an automated test plan generator, and ci_reporter test reports, all run with every SVN commit. I also added nightly and weekly batch runs for runtime and more time-consuming tests. The big issue we have been wrestling with is: at what level {unit, functional, integration, runtime, browser DOM, load} should acceptance/regression tests be created? This question led to some interesting and healthy debate between development and test staff.
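
For the build wiring, here is a sketch of the Rakefile glue for this kind of setup (the cruise task name is an assumption; ci_reporter’s setup task makes Test::Unit write XML reports, under test/reports by default, that CruiseControl can display):

    # Rakefile (sketch) for a CruiseControl build of a Rails project.
    require 'rubygems'
    require 'ci/reporter/rake/test_unit' # defines ci:setup:testunit

    # "rake cruise" primes ci_reporter so Test::Unit emits XML results,
    # then runs the full suite; CruiseControl invokes this per commit.
    task :cruise => ['ci:setup:testunit', 'test']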

To summarize the testing levels…

  • Rails Test::Unit test levels:
    • unit test coverage is important to verify the methods and their paths
    • functional tests validate that controllers operate as intended, including environment and database configuration (see the controller test sketch after this list)
    • integration tests validate systemic operations that cross controllers and render pages properly, irrespective of the browser, including AJAX responses for page load
  • Runtime tests are run on a deployed fleet, either headless or in a simulated browser DOM
  • Selenium tests exercise interactive AJAX and JavaScript in the page, requiring a separate client machine running the selenium-server.jar (Selenium RC for Rails) or the webrat gem (which includes selenium-server.jar and the webrat DSL)
  • JMeter is used for load tests (including performance counters)
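
To make the functional level concrete, a sketch of a controller-level test in Rails 2.x style (PostsController and its index view are hypothetical):

    require 'test_helper'

    # Functional test: exercises one controller in-process, against the
    # test database, with no browser and no JavaScript involved.
    class PostsControllerTest < ActionController::TestCase
      def test_index_renders_the_post_list
        get :index
        assert_response :success
        assert_template 'index'
        assert_not_nil assigns(:posts)
      end
    end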

We found that the most important part of a story is the list of acceptance tests. This list shapes and clearly defines the expectations of the story and what makes it complete. We often stub out tests for the sprint’s stories in test suites at the beginning of the sprint; these stubs can then be implemented by either a test or development engineer. The biggest problem we found was that adding too many Selenium tests made the automated build validation time increase dramatically (4 to 10 times that of integration tests), made it difficult for developers to run regression tests prior to source code commit, and made the tests more fragile as the GUI implementation changed. We had to re-factor many tests from Selenium to Test::Unit functional or integration tests to improve test performance and reliability.
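
Here is a sketch of the kind of integration test a slow Selenium script was re-factored into (the paths, parameters, and page content are assumptions): it drives a cross-controller workflow in-process, so it runs fast and is not coupled to GUI markup details.

    require 'test_helper'

    # Integration test: a login-to-dashboard workflow across controllers,
    # without a browser or selenium-server.
    class LoginFlowTest < ActionController::IntegrationTest
      def test_login_redirects_to_dashboard
        post '/session', :login => 'qa_user', :password => 'secret'
        follow_redirect!
        assert_response :success
        assert_select 'h1', 'Dashboard'
      end
    end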

The key to successful continuous integration is to test continuously, either with TDD practices or TIA (test immediately afterward), as part of accepting stories. This includes all code implemented for the story and any additional tests to cover the acceptance criteria. Testing at the lowest level feasible for code coverage is important for test efficiency. This may require the creation of test fixtures, mocked response expectations, and data factories.
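
For the data-factory piece, a minimal hand-rolled sketch (purpose-built factory gems provide the same idea; in a real suite the factory would build a model object, but this version returns the attribute hash so the sketch stands alone):

    # A tiny test data factory: sensible defaults with per-test overrides,
    # so each test spells out only the attribute it cares about.
    module Factory
      def self.user_attributes(overrides = {})
        { :login => 'qa_user', :email => 'qa@example.com', :admin => false }.merge(overrides)
      end
    end

    admin = Factory.user_attributes(:admin => true)
    # => {:login=>"qa_user", :email=>"qa@example.com", :admin=>true}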

Testing should include both happy-path and negative tests (exception handling). Development Engineers need to have a sense of ownership for regression tests. Quality Engineers need to have a sense of test coverage completeness. Together the scrum team needs to hold themselves and each other accountable for not leaving test coverage technical debt beyond story acceptance. Plan this test engineering time into the story. It may mean your velocity is a little lower than it would be otherwise, but the overall sustainable stride is greater, and you really do catch problems prior to (or at the time of) committing changes.

There is a place for automating at the GUI with Selenium or a comparable HTML element or application control automation tool. Limit the use of these tools to testing AJAX or behavior that requires interactive JavaScript to render the page, workflows between systems, data-driven use cases, and cross-browser testing.

What we found is that ideally…

1. Test Engineers embedded in a development scrum team should have the ability to:

  • read and exercise application code
  • author unit test cases
  • create and work with test fixtures, test mocks and test data factories
  • assess adequate test coverage for development stories

2. Test Engineers chartered with testing external to the development teams should be able to:

  • deploy to fleets (fully automated is preferred)
  • read and understand mocked interfaces (to exercise actual interfaces)
  • author and exercise run-time tests (cover GUI and API workflows across the system)
  • author and exercise performance/load tests

Continuous Integration assures a solid application code base with full test coverage. It engages all engineers in responsibility for application testing. It allows dedicated Test Engineers to focus on system-level functionality, deployment, load, and user experience.
