A look inside the design and implementation of one of the first virtual QA labs: its policies, practices, procedures and experiences.
Tuesday, April 1, 2008
LEAN Software Quality Assurance
Friday, February 29, 2008
Agile and Free to Learn Tools in the Dot Net Arena
Agile development and testing have finally arrived in the .NET programming environment, just as the open source movement has entered the commercial development of Microsoft and other top-tier software organizations. Historically, .NET has been almost solely a development tool, with very few quality tools to speak of in any real terms. Quality teams have been almost isolated, and collaboration between development and quality assurance is tenuous at best, which is quite the standard in traditional development environments. With the launch of Visual Studio 8.0, an opportunity is now emerging to move to a more agile and cooperative team environment.
Development tools for building both desktop and team-based enterprise Web applications are standard for this and many other IDEs on the market. What this release offers is the ability for testers to collaborate and fine-tune those applications in real and measurable terms. Testers have available to them a complete studio of tools in the Test Edition of Visual Studio 8.0 that are integrated into the development environment, including unit, Web, load, manual and code coverage tests. In the past, the Web Application Stress Tool was the only option in this space for configurable load and performance testing of applications. While that tool was effective, it took a fair amount of expertise in the tool itself as well as in load testing in general. This was time consuming and did not lend itself to rapid development; as such, it was relegated to the luxury bin and rarely performed well, if at all. Much of the guesswork and pain of learning a new tool is alleviated by the familiar interface and standard toolbars of this release.
The challenges facing quality assurance in the testing of Web-based applications have been addressed with the advent of the WebTesting namespace, which provides specific classes to enable Web testing. The base class for all Web tests is the WebTest class, which is available out of the box, as are the WebTestRequest and WebTestResponse classes for simulating HTTP requests and responses. Gone are the days of hand coding tests for HTTP responses and posts; this is now a point-and-click operation, with an expert view if you feel so inclined.
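The same pattern — build a request, obtain a response, assert on status and content — can be sketched outside Visual Studio as well. The sketch below is a Python analogue of that request/response pattern, not the Visual Studio API; the class names mirror the namespace's terminology and the URL and page content are illustrative assumptions.

```python
# Analogue of a coded Web test: build a request, get a response, and
# assert on status and content, mirroring the WebTest / WebTestRequest /
# WebTestResponse pattern. These classes are illustrative stand-ins,
# not the Visual Studio Test Edition API.

class WebTestRequest:
    def __init__(self, url, method="GET"):
        self.url = url
        self.method = method

class WebTestResponse:
    def __init__(self, status_code, body):
        self.status_code = status_code
        self.body = body

def validate_response(response, expected_status=200, must_contain=""):
    """Return True when the response meets the test's acceptance rules."""
    if response.status_code != expected_status:
        return False
    return must_contain in response.body

# Simulated run; in a real harness the response would come off the wire.
request = WebTestRequest("http://example.com/login")
response = WebTestResponse(200, "<html>Welcome</html>")
assert validate_response(response, must_contain="Welcome")
```

In the real tool, recording a browser session generates the request objects for you, which is what makes it a point-and-click operation.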
Building and testing high-performing desktop applications in a simple, team-based design enables swift delivery of high-quality applications and deployment of enterprise solutions. Add the ability to seamlessly communicate valid, reproducible errors to the development team, and you have the pure divinity of agile. Opportunities to learn the functionality of this tool are available at no cost on the educational portion of the developer network site called Channel 8, also known as DreamSpark to students everywhere. You can download the tools below at no cost: Channel 8, DreamSpark, Test Center 90-day Trial Download.
Thursday, December 13, 2007
Configuration Status Accounting
- application software specification was released;
- allocated baseline was established;
- application software Test Plan for the CI was released;
- application software Test Plan for the CI was approved;
- application software design document was released;
- application software design document was approved;
- application software CI testing took place;
- application software test report for the CI was approved;
- application software Product Baseline was established.
- the date of the change request;
- the title of the change request;
- the status of the change request;
- the approval date of the change request; and
- the planned and actual dates of change implementation.
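A status accounting entry for a change request can be as simple as a structured record carrying the five data elements listed above. A minimal sketch follows; the field names and sample values are assumptions for illustration, not from the original list.

```python
# Minimal sketch of a configuration status accounting record for a
# change request, carrying the five data elements listed above.
# Field names and sample values are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChangeRequestStatus:
    request_date: str                          # date of the change request
    title: str                                 # title of the change request
    status: str                                # e.g. "open", "approved", "implemented"
    approval_date: Optional[str] = None        # approval date, once granted
    planned_implementation: Optional[str] = None
    actual_implementation: Optional[str] = None

cr = ChangeRequestStatus(
    request_date="2007-12-01",
    title="Update login window validation",
    status="approved",
    approval_date="2007-12-10",
    planned_implementation="2007-12-20",
)
assert cr.status == "approved" and cr.actual_implementation is None
```

Keeping planned and actual implementation dates as separate fields is what lets status accounting report schedule slippage per change.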
Software Configuration Management: Configuration Identification of Software Configuration Items
- Business Need/Requirements Statement
- Software Requirements Specification
- Software Design Description
- Source Code/Executable Code
- Test Plan Procedure
- Software User Manual and Run Book
- Training Plan
- Software Product Specification
3. Each application software specification will be baselined (as part of the Allocated Baseline) and subject to formal change control.
4. Application software specifications should be traceable to the system-level requirements specification.
5. The content of the application software specification establishes the acceptance criteria for that software configuration item.
6. Each application software specification will be numbered with the configuration item's identifier as part of the document number; for example, the specification for application software configuration item XYZ Project will be XYZ Project-SPEC.
7. The naming convention applicable to application software configuration items is that the first characters (up to six) are the CI number; for example, XYZ Project-OPEN-WINDOW links the open window module to the XYZ Project CI.
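The convention above can be enforced mechanically: every artifact name should begin with its CI identifier followed by a hyphen. A small sketch, where the helper name is an assumption:

```python
# Illustrative check for the CI naming convention described above:
# an artifact name should begin with its configuration item's
# identifier followed by a hyphen. The helper name is an assumption.

def belongs_to_ci(artifact_name, ci_identifier):
    """Return True when the artifact name is prefixed by the CI identifier."""
    return artifact_name.startswith(ci_identifier + "-")

assert belongs_to_ci("XYZ Project-SPEC", "XYZ Project")
assert belongs_to_ci("XYZ Project-OPEN-WINDOW", "XYZ Project")
assert not belongs_to_ci("ABC-OPEN-WINDOW", "XYZ Project")
```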
Agile Testing Types and Process
Step 1: Unit testing
Performed by the developer. Unit test case design begins after a technical review approves the high-level design. The unit test cases shall be designed to verify the program's correctness; in other words, each decision statement in the program shall take on both a true and a false value.
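Designing unit tests so that each decision is exercised on both its true and false outcomes can be sketched as follows; the function under test and its values are illustrative, not from the post.

```python
# Illustrative unit tests exercising both outcomes of a decision.
# The function under test is a made-up example for demonstration.

def is_eligible(age, has_consent):
    # Decision under test: both conditions must hold.
    if age >= 18 and has_consent:
        return True
    return False

# Decision taken on its true outcome:
assert is_eligible(21, True) is True
# ...and on its false outcome, via each condition in turn:
assert is_eligible(16, True) is False
assert is_eligible(21, False) is False
```

Covering both outcomes of every decision (decision coverage) is a stronger criterion than merely executing every statement once.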
Step 2: Test Planning
Performed by QA. Review requirement specifications, architectural designs and use cases to develop test plans, test steps and test scenarios. Identify any gaps or issues that need to be addressed.
Step 3: Requirements Traceability / Testability
Performed by QA. Traceability is the ability to show how requirements are derived from higher level (or "parent") requirements. Conversely, traceability helps to identify all downward requirements derived from parent requirements.
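Both directions of tracing described above reduce to one mapping: each requirement records its parent, and downward tracing inverts that mapping. A minimal sketch, with illustrative requirement IDs:

```python
# Sketch of a requirements traceability lookup: each requirement
# records its parent, and downward tracing inverts that mapping.
# Requirement identifiers are illustrative assumptions.

parent_of = {
    "SRS-1.1": "BRD-1",   # software requirement derived from a business need
    "SRS-1.2": "BRD-1",
    "SRS-2.1": "BRD-2",
}

def children_of(parent):
    """Trace downward: all requirements derived from a parent requirement."""
    return sorted(r for r, p in parent_of.items() if p == parent)

# Upward trace: where did SRS-2.1 come from?
assert parent_of["SRS-2.1"] == "BRD-2"
# Downward trace: everything derived from BRD-1.
assert children_of("BRD-1") == ["SRS-1.1", "SRS-1.2"]
```

A parent with no children in this mapping flags a requirement that nothing implements, which is exactly the kind of gap Step 2 asks QA to identify.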
Step 4: Integration Testing
Performed by QA. Integration testing proves that all areas of the system interface with each other correctly and that there are no gaps in the data flow. The final integration test proves that the system works as an integrated unit when all the fixes are complete.
Step 5: Build Verification / Smoke Test
Performed by QA. When a build has met completion criteria and is ready to be tested, the QA team runs an initial battery of basic tests to verify the build. If the build is not testable at all, then the QA rejects the build. If portions of the build are testable those are tested and the results documented.
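The gating logic above — run a battery of basic checks, reject the build if nothing is testable, otherwise record what passed — can be sketched as follows; the check names and acceptance rule are assumptions for illustration.

```python
# Sketch of a build-verification (smoke test) gate: run a battery of
# basic checks, reject the build if nothing is testable, and record
# results either way. Check names and logic are illustrative.

def verify_build(checks):
    """Run named smoke checks; return (accepted, results)."""
    results = {name: check() for name, check in checks.items()}
    accepted = any(results.values())   # reject only if nothing passes
    return accepted, results

checks = {
    "application starts": lambda: True,
    "login page loads": lambda: True,
    "reports module": lambda: False,   # failing areas are still recorded
}
accepted, results = verify_build(checks)
assert accepted and results["reports module"] is False
```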
Step 6: Functional Testing
Performed by QA. Functional testing assures that each element of the application meets the functional requirements of the business as outlined in the requirements document/functional brief, system design specification, and other functional documents produced.
Step 7: Performance, Load and Stress Testing
Performed by QA. Non-functional testing proves that the documented performance standards or requirements are met. Examples of testable standards include response time and compatibility with specified browsers and operating systems.
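A documented response-time standard translates directly into an automated check: time the operation and assert it stays under the limit. A minimal sketch, where the 2.0-second threshold and the timed operation are assumptions:

```python
# Sketch of a response-time check against a documented standard.
# The 2.0-second threshold and the timed operation are illustrative.
import time

RESPONSE_TIME_LIMIT = 2.0  # seconds, per the documented requirement

def timed(operation):
    """Return the elapsed wall-clock time of an operation, in seconds."""
    start = time.perf_counter()
    operation()
    return time.perf_counter() - start

elapsed = timed(lambda: sum(range(100_000)))  # stand-in for a real request
assert elapsed < RESPONSE_TIME_LIMIT, "documented response time exceeded"
```

Load and stress testing repeat the same measurement under many concurrent users and at volumes beyond the expected peak, respectively.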
Step 8: Defect Fix Validation
Performed by QA. If any known defects or issues existed during development, QA tests specifically in those areas to validate the fixes implemented by the developers, who have also unit tested these fixes prior to sending them to QA for validation.
Step 9: Regression Testing
Performed by QA. Regression testing is performed after the release of each phase to ensure that there is no impact on previously released software. Regression testing cannot be conducted on the initial build, because the regression test cases are drawn from defects found in previous builds; on the initial build there is nothing to test against.
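The point about the initial build follows from how a regression suite is built: each defect fixed in an earlier build contributes a check that must keep passing. A minimal sketch, with illustrative defect IDs and checks:

```python
# Sketch of a regression suite grown from defects found in earlier
# builds: each fixed defect contributes a check that must keep passing.
# Defect identifiers and checks are illustrative assumptions.

regression_suite = {}

def register_regression(defect_id, check):
    """Record a permanent regression check for a fixed defect."""
    regression_suite[defect_id] = check

# Defects fixed in previous builds become permanent regression cases.
register_regression("DEF-101", lambda: 2 + 2 == 4)
register_regression("DEF-205", lambda: "abc".upper() == "ABC")

failures = [d for d, check in regression_suite.items() if not check()]
assert failures == []   # no previously fixed defect has resurfaced
```

On the initial build the suite is empty, which is precisely why regression testing cannot start until a second build exists.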
Step 10: Error Management
Performed by QA. During the QA testing workflow, all defects will be reported using the error management workflow. Regular meetings will take place between QA, development, product and project management to discuss defects, priority of defects, and fixes.
Step 11: QA Reporting and Readiness Review
Performed by QA. QA states the results of testing, reports outstanding defects/known issues, and makes a recommendation for release into production.
Step 12: Release Management
Performed by QA. Releasing software refers to the process of providing some named (or otherwise uniquely identified) files to others for use. The others may be your department at work, your classmates, or The World. Managing the release means you know, understand and can explain what went into it.
Step 13: User Acceptance Testing
Performed by all concerned product managers, project managers and a Customer Focus Group (if applicable) to verify that the new system, data and software changes meet customer expectations and usability requirements.
Step 14: Release into production
Performed by QA and the project team. If the project team determines that the build is acceptable for production, the configuration/version management team will migrate the build into staging for the implementation team to move into the production environment.
Step 15: Post Implementation Testing
Performed by QA. Testing performed after the software has been deployed to ensure proper implementation.