Thursday, August 31, 2006

UAT outline description

From JISC Infonet

User Acceptance Testing is a key feature of projects to implement new systems or processes. It is the formal means by which we ensure that the new system or process does actually meet the essential user requirements. Each module to be implemented will be subject to one or more User Acceptance Tests (UAT) before being ‘signed off’ as meeting user needs. The following overview answers some of the main questions that have been asked about UATs.

A User Acceptance Test is:
  • A chance to completely test business processes and software
  • A scaled-down or condensed version of a system
  • The final UAT for each system module is the last chance to perform the above in a test situation
The scope of each User Acceptance Test will vary depending on which business process is being tested. In general, however, tests will cover the following broad areas:
  • A number of defined test cases using quality data to validate end-to-end business processes
  • A comparison of actual test results against expected results
  • A meeting/discussion forum to evaluate the process and facilitate issue resolution
Objectives of the User Acceptance Test are for a group of key users to:
  • Validate system set-up for transactions and user access
  • Confirm use of system in performing business processes
  • Verify performance on business critical functions
  • Confirm integrity of converted and additional data, for example values that appear in a look-up table
  • Assess and sign off go-live readiness (a rough sketch of how these checks might be recorded follows below)
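The outline above mentions comparing actual test results against expected results and assessing go-live readiness. As a loose illustration of how such checks might be recorded (my own sketch, not part of the JISC material; the business process, fields and values are invented), here is a minimal Python example:

from dataclasses import dataclass

@dataclass
class UATTestCase:
    """One end-to-end UAT test case with its expected and actual outcome."""
    case_id: str
    business_process: str          # e.g. "Enrol a new student" (hypothetical)
    steps: list                    # scripted steps the key user follows
    expected_result: str
    actual_result: str = ""

    def passed(self) -> bool:
        # A case passes only when the recorded actual result matches the expected one.
        return self.actual_result.strip() == self.expected_result.strip()

def go_live_ready(cases) -> bool:
    """Sign-off readiness: every defined test case must have passed."""
    return all(case.passed() for case in cases)

# Invented example usage
case = UATTestCase(
    case_id="UAT-001",
    business_process="Enrol a new student",
    steps=["Open enrolment form", "Enter applicant details", "Submit"],
    expected_result="Student record created with status 'Enrolled'",
)
case.actual_result = "Student record created with status 'Enrolled'"
print(go_live_ready([case]))       # True once every case matches its expected result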
User acceptance test UAT

Sunday, August 27, 2006

Acceptance testing of new MMORPG

From MMORPG blog

"Developer Persistent Worlds was very pleased to announce that Carpe Diem, its newest fantasy MMORPG offering, has entered its Alpha Stage today.

Alpha Testing, otherwise known as Acceptance Testing, is the stage at which a pre-release product is tested internally before it is tested with outside users. Says Persistent Worlds' Director of Technology and Operations, Phil White, "The Alpha test will allow us to ensure that all servers, support and game play changes are working properly before moving to the next stage."

The Alpha Test appears to be going well so far, and they expect to go to closed beta in about two weeks. Carpe Diem is already highly anticipated and has received heaps of requests from gamers who want to take part in the game's Beta test. The positive response from the beta test survey also contributes to the optimistic outlook of the game's developer."

Alpha testing services for software applications

Sunday, July 30, 2006

User acceptance features added to Mercury software testing tools

Mercury Interactive Corp. (although perhaps we should be calling them HP) has released a new version of its Business Process Testing QA (quality assurance) tool, with additional user-acceptance features.
In this article from PC Welt they describe the new features added to the BPT part of Quality Center, which include tighter integration with the automated functional testing tool WinRunner. The part we are interested in is the certification step added to the software testing process which covers user acceptance.
It features a Web interface that lets business executives try out a new software application. This is meant to automate a step that is usually performed in an ad hoc manner.


Endurance testing

Saturday, July 22, 2006

Definition: UAT

This definition of User Acceptance Testing comes from Usability First:

"UAT; a method for determining how well users have adopted a new technology, especially in organizational settings. Users are typically interviewed to determine if and how they are using the technology and to understand what barriers to adoption may exist.

This sometimes refers to a "user testing" technique where users are asked to perform a scripted list of tasks with every step outlined and to comment on usability problems they encounter. However, this is more a quality assurance technique than user testing because scripted tasks don't reveal problems that users would encounter in actual use."


Software acceptance testing

Sunday, July 02, 2006

Acceptance tests for extreme programming developments

Here's a page which describes acceptance testing for extreme programming developments. The tests are created from user stories (which equate to use cases).

"Quality assurance (QA) is an essential part of the XP process. On some projects QA is done by a separate group, while on others QA will be an integrated into the development team itself. In either case XP requires development to have much closer relationship with QA.
Acceptance tests should be automated so they can be run often. The acceptance test score is published to the team. It is the team's responsibility to schedule time each iteration to fix any failed tests.
The name acceptance tests was changed from functional tests. This better reflects the intent, which is to guarantee that a customers requirements have been met and the system is acceptable."
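To make that concrete, here is a rough Python sketch (my own, not from the XP page) of an automated acceptance test written from an invented user story, using the standard unittest module. The story, the checkout_total function and the expected values are all hypothetical stand-ins for a real system:

import unittest

# Hypothetical system under test, standing in for the real application code.
def checkout_total(prices, discount_threshold=100.0, discount_rate=0.10):
    """User story: 'As a shopper, I get 10% off orders over 100.'"""
    total = sum(prices)
    if total > discount_threshold:
        total *= (1 - discount_rate)
    return round(total, 2)

class CheckoutAcceptanceTest(unittest.TestCase):
    """Automated acceptance test derived from the user story above."""

    def test_discount_applied_over_threshold(self):
        # Expected result agreed with the customer for this story.
        self.assertEqual(checkout_total([60.0, 50.0]), 99.0)

    def test_no_discount_at_or_below_threshold(self):
        self.assertEqual(checkout_total([40.0, 60.0]), 100.0)

if __name__ == "__main__":
    unittest.main()   # run often; publish the pass/fail score to the team

Because the test is automated, it can be run every iteration and its score published, exactly as the quoted passage suggests.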


Independent acceptance testing services for agile developments

Wednesday, June 28, 2006

User Acceptance Testing key deliverables

Here is an article which attempts to explain the process of user acceptance testing. It concludes by listing the key deliverables of UAT (a rough sketch of the third one, the test log, follows the list):

"1) The Test Plan- This outlines the Testing Strategy

2) The UAT Test cases – The Test cases help the team to effectively test the application

3) The Test Log – This is a log of all the test cases executed and the actual results.

4) User Sign Off – This indicates that the customer finds the product delivered to their satisfaction"
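As an illustration of the third deliverable, here is a small Python sketch (mine, not from the article) of a test log that records each executed case with its actual result and derives a sign-off status. The case names, results and file name are invented:

import csv
from datetime import date

# Invented log entries: (case id, description, expected result, actual result, tester)
executed_cases = [
    ("UAT-001", "Create invoice", "Invoice saved", "Invoice saved", "J. Smith"),
    ("UAT-002", "Email invoice", "Email queued", "Error: SMTP timeout", "J. Smith"),
]

def write_test_log(path, rows):
    """Write the UAT test log with a pass/fail verdict for every executed case."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Date", "Case", "Description", "Expected", "Actual", "Verdict", "Tester"])
        for case_id, desc, expected, actual, tester in rows:
            verdict = "Pass" if actual == expected else "Fail"
            writer.writerow([date.today().isoformat(), case_id, desc, expected, actual, verdict, tester])

write_test_log("uat_test_log.csv", executed_cases)

# User sign-off (the fourth deliverable) only makes sense once every logged case has passed.
ready_for_sign_off = all(actual == expected for _, _, expected, actual, _ in executed_cases)
print("Ready for user sign-off:", ready_for_sign_off)   # False while UAT-002 fails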

Tuesday, June 27, 2006

User Acceptance Testing Is Hardly Testing

Provocatively entitled blog post here. At first glance it looks like something belittling user acceptance testing.

"Quality assurance personnel use performance and reliability testing to verify service levels of the system as a whole. While each of these techniques serves a unique purpose on the project, they also share a common goal - find defects. However, this is something your user acceptance tests should never do."

However, when you read the well-written item, it is actually a plea to improve testing early in the lifecycle:

"Numerous studies have shown that fixing defects found late in a delivery cycle can carry an extremely high cost. If serious problems are found during User Acceptance Testing (or UAT), they can easily translate into significant budget overruns, slips of the schedule, or even project cancellation. Issues must be identified and addressed as early as possible. By the time UAT rolls around, there should be nothing major left to uncover. User acceptance testing should be little more than a pat on the back of your development team for a job well done."

I like the idea of UAT being a pat on the back very much.

Saturday, June 24, 2006

User acceptance test standards

A user acceptance testing standard for application development (including templates for the creation of a user acceptance test plan) can be found here.

The contents of the document are:

1. Introduction
1.1 Audience
1.2 Purpose
1.3 Assumptions
1.4 Standard

2. User acceptance test standards
2.1 Overview
2.2 New System Test
2.3 Regression Test
2.4 Limited Testing
2.4.1 Form-Based Testing
2.4.2 Business Process Testing
2.4.3 Report Testing

3. Assessing the test types for a test plan
3.1 Overview
3.2 Assessing the Test Type Required

4. Creating a user acceptance test plan
4.1 Test Instructions
4.2 Form-Based Test Scripts
4.3 User Security Matrix
4.4 Business Process Test Scripts
4.5 Report Test Scripts
4.6 Defect Tracking Form
4.7 Defect Tracking Log

5. Conclusion

And the conclusion is:

"The test scripts created for a new system, for regression testing or for limited testing should be recycled throughout the application 's life. By editing existing test scripts the User Acceptance Test Planning process can save time and money, as well as maintain the test quality, as key requirement information will be re-used."


Acceptance test planning support

Thursday, June 22, 2006

Non-recovery vs. Disaster Recovery

Interesting news story from ZDNet in Australia:

"The NSW Department of Primary Industries (NSW DPI) has conceded its technology consolidation project is still suffering delays due partly to an incident which saw much of the computer equipment in its facility in regional NSW "slow-baked" in searing temperatures.

Warwick Lill, host systems manager, NSW DPI, said a faulty fire detection system at the start of the year had contributed to the agency's data centre reference implementation (DCRI) project, an 18-month long undertaking, running several months behind schedule.

The problem occurred at NSW DPI's site at Maitland in the Hunter Valley.

"At about 10 minutes to midnight on New Years Eve there was a false alarm in the fire detection system in Maitland, which shut down the air conditioning in the computer room without shutting down the computers," Lill told attendees at Gartner's data centre conference in Sydney.

The state experienced record temperatures on New Year's Day.

"This wasn't detected until about 11 o'clock in the morning," he said.

"By that time the temperature inside the computer room was up around 70 degrees or something.

"That sort of slow-baked all of the equipment that was in the room, and we're in essence still recovering from that one. So work at the Maitland site in regards to this project has come to a stop until we manage to fix that."

The DCRI project aims to consolidate a large swathe of information technology operations inherited from the NSW DPI's predecessor agencies to sites in Maitland and Orange under moves to generate efficiencies and cut costs. The NSW government created the DPI in July 2004 through the amalgamation of Mineral Resources NSW, NSW Agriculture, NSW Fisheries and State Forests NSW.

"We had too many IT managers, we had too many systems, we had too many ways of doing things," said Lill.

The DCRI project has required extensive planning and preparation in areas such as disaster recovery and system layout.

"We wanted to setup a mutual failover between those two sites, that's the basis of our disaster recovery plan," said Lill.

"We wanted to rationalise the system landscape that we had, so we had dedicated systems for production, user acceptance testing, development, DR [disaster recovery] and so on."

NSW DPI has also acquired a range of hardware and software for the project, primarily from services partner Sun.

"What we've put in, in order of implementation, was a management cluster with dedicated storage at each of our two major sites."

This project was not without its problems, however.

"We moved a little too quickly from the planning phase into the implementation phase, and that combined with the fact that we didn't know quite as much about clustering as we might've needed to do, we didn't quite get the design for the original management cluster right.

"It took a period of time to recognise that and to agree to redo it, so that's put us behind by about three months."

Also implemented were new tape libraries at both sites, and Sun's C2MS (e-mail archiving), SAM-FS (archive management) and Management Centre (system management) software.

The choice of EMC Legato NetWorker as enterprise backup product, however, had been another contributor to delays in the project.

"We've found that the Legato backup software, the database agents for Legato backup software, are not certified for operation in Solaris 10 containers. So at the moment we can't back up our databases if we move them to the application cluster.

"So we're looking forward to a solution to that one in the reasonably near future," he said.

Lill described the work of services partner Sun as "terrific," but conceded there had been early problems in the relationship between the two.

"We bought a design I believe ... whereas Sun sold us a set of hardware and software and professional services during the implementation.

"Initially at least there was a bit of a disconnect between those two, but through communication and influencing and so on we've managed to bring things pretty well together.

"If I was doing it again I think I'd spend more time in the contract negotiation stage looking at the very fine detail of what was proposed."

Lill said he was confident the project would be completed and work well."

It is interesting that, if he were doing it again, the key thing he would concentrate on is the specification of what he was getting. Having an effective acceptance process and clear acceptance criteria is a key step in getting what you expect, when you expect it.

Acceptance criteria and testing


Tuesday, June 20, 2006

Acceptance testing article

Here is an article on acceptance testing and its role in allowing customers to ensure that the system meets their business requirements.

A couple of useful passages:

"In fact, depending on how your other tests were performed, this final test may not always be necessary. If the customers and users participated in system tests, such as requirements testing and usability testing, they may not need to perform a formal acceptance test. However, this additional test will probably be required for the customer to give final approval for the system."

Combining user acceptance testing with other testing activities saves time and money.

"The acceptance test is the last opportunity customers have to make sure that the system is what they asked for. When this final test is complete, the team expects that the customer will formally approve the system or point out any problems that still need to be resolved. Therefore, unlike all the other tests performed so far, acceptance testing is the customers' responsibility. Of course, unless the customers are very savvy in testing techniques, they will still need the participation of the IT team."

Using specialist software testing support helps ensure your testing is effective and efficient. Given that this is the final test, it needs to be effective.

Sunday, June 18, 2006

User acceptance testing term

This definition of user acceptance testing comes from SearchSMB.com:
In software development, user acceptance testing (UAT) - also called beta testing, application testing, and end user testing - is a phase of software development in which the software is tested in the "real world" by the intended audience. UAT can be done by in-house testing in which volunteers or paid test subjects use the software or, more typically for widely-distributed software, by making the test version available for downloading and free trial over the Web. The experiences of the early users are forwarded back to the developers who make final changes before releasing the software commercially.

UAT services

Saturday, June 17, 2006

User acceptance testing

This blog is about User Acceptance Testing (aka UAT, or simply acceptance testing). For this first post I will simply offer the Wikipedia definition:

User Acceptance Testing (UAT) is a process to obtain confirmation by a Subject Matter Expert (SME), preferably the owner or client of the object under test, through trial or review, that the modification or addition meets mutually agreed-upon requirements. In software development, UAT is one of the final stages of a project and will often occur before a client or customer accepts a new system.



User acceptance testing services