Sunday, December 23, 2007

CAT - Customer Acceptance Testing

Acceptance Testing is often seen as marking the change in a system’s ownership from the developers to those who will use it. It is a distinct phase of testing prior to final sign-off and delivery of a system to the customers or business users. The responsibility for acceptance testing may reside with the customer (CAT - Customer Acceptance Testing) and/or the end users themselves (UAT - User Acceptance Testing).
The primary objectives of the preceding test phases (unit, integration and system testing) are to identify defects and to ensure that each step in building the software delivers working products (verification).
The objective of Acceptance Testing is to give confidence that the software being developed, or changed, satisfies the specified requirements, meets customer/user expectations and is fit for business purpose (validation). Unlike the other test phases, an objective of acceptance testing is not to actively look for faults. The expectation should be that the preceding stages of testing have identified and resolved software faults and that the system is ready for operational use.
Clear entry criteria may need to be put in place to ensure that the expectation of a system being ‘production ready’ has been met. This is an opportunity to review the effectiveness, completeness and outcome of previous test phases, and to declare any known problems that remain, before acceptance testing commences.
As a testing phase with its own objectives, acceptance testing should not seek to duplicate other test phases. However, it may repeat previously run test scenarios to provide confidence that preceding test phases have been completed appropriately. Those responsible for acceptance testing should dictate the objectives and coverage of the phase. In doing so they are likely to consider the following questions (a sketch of one scripted business scenario follows the list):
  • Have the acceptance test conditions been demonstrated?
  • Do the test conditions model realistic business scenarios?
  • Is there appropriate coverage of each business function?
  • Is there appropriate coverage of system reports and screens?
  • Is there appropriate coverage of application menu options?
  • Are updated training materials and system documentation subject to test?
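As an illustration, business scenarios of the kind the second question asks about can be scripted so that they are repeatable at each acceptance run. Below is a minimal sketch in Python: the order-processing application, its functions and its 5% discount rule are all invented for illustration, not taken from any real system.

    # Hypothetical acceptance scenario: a customer places an order and the
    # result reflects the agreed business rules. All names, values and the
    # discount rule are invented; a real test would drive the actual system
    # under test (through its UI or a service interface).

    def place_order(customer, items):
        # Stand-in for the system under test; items are (name, qty, price).
        total = sum(qty * price for _, qty, price in items)
        discount = 0.05 * total if total > 500 else 0.0
        return {"customer": customer, "total": total, "discount": discount}

    def test_bulk_order_receives_discount():
        # Realistic business scenario: a large order attracts the agreed
        # (invented) 5% discount.
        order = place_order("ACME Ltd", [("widget", 100, 6.00)])
        assert order["total"] == 600.00
        assert order["discount"] == 30.00

    def test_small_order_receives_no_discount():
        order = place_order("ACME Ltd", [("widget", 10, 6.00)])
        assert order["discount"] == 0.0

Scenarios captured this way can be re-run cheaply, which matters later if software changes force regression testing during the acceptance phase.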
It is important to minimise the number of changes taking place to the system during the acceptance test phase. The cost of reworking software or system operation at this stage of development is high. Implemented changes may invalidate testing that has already been conducted and require greater levels of regression testing to be carried out.
Where faults are uncovered during acceptance, temporary fixes or workarounds should first be investigated before attempting to change the software. However, it may be necessary to agree the correction of high severity/priority problems, and to defer and timetable the correction of others, before acceptance can occur.
In order to minimise the risk of software faults and system changes, acceptance should not begin only at software delivery. It needs to be a function of the earliest design and development stages. Early involvement of users in the requirements and design stages can minimise the risk of gaps and misinterpretations in end-system functionality. The benefits reaped at later stages can be the avoidance of rework and a reduced cost of system delivery.

Saturday, December 01, 2007

Acceptance testing definitions

From Search CIO

Wikipedia describes acceptance testing as:

"... is black-box testing performed on a system (e.g. software, lots of manufactured mechanical parts, or batches of chemical products) prior to its delivery...

"In most environments, acceptance testing by the system provider is distinguished from acceptance testing by the customer (the user or client) prior to accepting transfer of ownership...

"A principal purpose of acceptance testing is that, once completed successfully, and provided certain additional (contractually agreed) acceptance criteria are met, the sponsors will then sign off on the system as satisfying the contract (previously agreed between sponsor and manufacturer), and deliver final payment."

This is consistent with the definition Cem Kaner uses throughout his books, courses, articles and talks, which collectively are some of the most highly referenced software testing material in the industry. The following definition is from his Black Box Software Testing course:

"Acceptance testing is done to check whether the customer should pay for the product. Usually acceptance testing is done as black box testing."

Another extremely well-referenced source of software testing terms is the International Software Testing Qualifications Board (ISTQB) Standard glossary of terms. Below is the definition from Version 1.3 (dated May 31, 2007):

"Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorised entity to determine whether or not to accept the system."

I've chosen those three references because I've found that if Wikipedia, Cem Kaner and the ISTQB are on the same page regarding a term or the definition of a concept, then the testing community at large will tend to use that term in a manner consistent with these resources. Acceptance testing, however, is an exception.

There are several key points on which these definitions/descriptions agree:

  1. In each case, acceptance testing is done to determine whether the application or product is acceptable to someone or some organisation and/or if the person or organisation should pay for the application or product AS IS.
  2. "Finding defects" is not so much as mentioned in any of those definitions/descriptions, but each implies that defects jeopardise whether the application or product becomes "accepted."
  3. Pre-determined, explicitly stated, mutually agreed-upon criteria (between the creator of the application or product and the person or group that is paying for or otherwise commissioned the software) are the basis for non-acceptance.

    Wikipedia refers to this agreement as a contract, and identifies non-compliance with the terms of the contract as a reason for non-payment. Kaner references the contract through the question of whether the customer should pay for the product.

    The ISTQB does not directly refer to either contract or payment but certainly implies the existence of a contract (an agreement between two or more parties for the doing or not doing of something specified).

Saturday, November 17, 2007

Customer acceptance testing

This is advice on acceptance testing from an article here.

Acceptance Testing


Seventy percent of all software projects fail outright. When considering the other 30% of "successful" projects, you find many cases where the organisations involved merely declared victory and shut everything down.

But why do projects fail? Missing, miscommunicated, or misunderstood requirements are by far the biggest reasons. Hopefully I've given you some tools to help make these problems go away.

A close second to all this? Lack of end-user acceptance. Why? Because rarely are front-line troops—the ones who will use your application—ever invited to stakeholder meetings. They never get to give any input. Their managers or outside consultants are the only ones you ever meet. Unfortunately, these people don't have the kind of visibility actual users of your system have.

Ignore the end user at your peril! Most of the time, the stakeholders you're dealing with don't know you need to talk to end users at all. You're the expert, not them! Even if they present you with reams of documents in which they've covered every conceivable request from end users, request a tour of the office or plant floor. Meet some end users. Take them out to lunch. Talk to them early, see what's bugging them.

Spend an afternoon just hanging out watching what happens. How do their processes get them through the day? Where do their processes get in the way? Where do they put in short cuts to avoid bureaucratic nightmares? Does person A from this department do small favours for person B in finance to expedite work orders?

If you ignore the end user at the beginning of the process, if you don't talk to them upstream, you certainly will be talking to them at the end of the process, when you deliver the final application for acceptance testing and/or training.

Guess what? Those stakeholders who didn't give you access to the end users early on will be asking those end users at the end of the project what they think of the new application, and you'll be judged by what they say. Fair? No. But you didn't ask, and they didn't have that kind of visibility into your expertise.

Remember, no one can verify you're on the right track better than an end user who is an expert in a particular procedure or workflow.


Saturday, November 03, 2007

Guidelines on acceptance testing

From GCN.com

The Election Assistance Commission has issued six new sets of guidelines for election officials to help them with the chores of managing local elections.

The guidelines, several of which address technical issues, are part of a series of Quick Start Management Guides being produced by the commission. They are available online and also are being distributed as brochures to election officials. The commission plans to issue two new Quick Start brochures by the end of the year on military and overseas voting, and developing an audit trail.

The guide on acceptance testing explains what an acceptance test is and covers how to conduct a test for optical-scan ballot scanners, touch-screen voting stations and election management systems. A guide for contingency and disaster planning includes information on developing a management plan to address emergency and crisis response. It also covers developing a continuity-of-operations plan for elections and contingency planning for countywide events and internal operations.

Other guidelines cover absentee voting and voting by mail, managing change in an election office, media and public relations, and selection and preparation of polling places and vote centers.

To request copies or for more information on Quick Start Management Guides, call EAC at (866) 747-1471 or e-mail HAVAinfo@eac.gov. For information on EAC's voting system certification program, including test labs, registered voting system manufacturers, voting systems that have been submitted for testing, test plans, notices of clarification and other program-related information, visit http://www.eac.gov.

Saturday, October 27, 2007

Scots election review critical of user acceptance testing

Extract from an ITPro article

The review was highly critical of the fact that "this technical problem had not occurred during the testing activities undertaken by DRS or by the Project Board".

It found there were a number of small differences in the configuration of the system for the test and for the actual count that were significant to its operational ability to scale.

For example, the most significant of these differences was that the test configuration only had information relating to the set-up at a single count centre, with 50 entries in the 'contest' table, which related to the total number of wards and constituencies to be counted at each centre. But the final configuration used at each count centre contained information relating to all count centres for resilience purposes, which resulted in about 550 entries in the 'contest' table.

The Commission also said the 32 Returning Officers individually responsible for election administration in their constituencies found themselves "swimming upstream" against the introduction of the new technology.

And it said the appointment of one contractor to provide the electronic counting system, using the same technology, equipment and software across the whole of Scotland made it difficult for regionally dispersed Returning Officers to take effective remedial action when problems occurred.

Contractor DRS welcomed the publication of the review. It said it fully supported "Gould's proposed review of legislation and political involvement to ensure that e-counting technology is properly integrated into the electoral process".

DRS also told IT PRO that, following the May elections, it had completed a full evaluation of its e-counting system's performance and of the issues experienced.

It has subsequently extended its software with new automated features, such as updating statistics, rebuilding indexes and recompiling stored procedures before each count starts. User-level monitoring and early warning of database performance issues arising during the count, with user-level capability for corrective action, have also been added.
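For readers unfamiliar with those three steps, the sketch below shows roughly what such pre-count housekeeping amounts to, wrapped in Python. It assumes a SQL Server-style database reached over ODBC; the article does not name DRS's actual platform, and the table, procedure and DSN names here are invented for illustration.

    # Illustrative pre-count database housekeeping. The commands are
    # SQL Server dialect (an assumption); all names are hypothetical.
    import pyodbc  # assumes an ODBC driver and a configured DSN

    MAINTENANCE_STATEMENTS = [
        "UPDATE STATISTICS dbo.contest",           # refresh optimiser statistics
        "ALTER INDEX ALL ON dbo.contest REBUILD",  # rebuild indexes before load
        "EXEC sp_recompile 'dbo.record_vote'",     # force fresh query plans
    ]

    def run_precount_maintenance(dsn="ElectionCountDB"):
        conn = pyodbc.connect(dsn=dsn, autocommit=True)
        cursor = conn.cursor()
        for statement in MAINTENANCE_STATEMENTS:
            cursor.execute(statement)  # run each step before the count opens
        conn.close()

The point of such steps is exactly the scale gap described above: statistics and cached query plans tuned for a 50-row 'contest' table can behave badly against 550 rows, so they are refreshed immediately before each count.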

The report also called for legislative and policy frameworks to guide electronic counting, as well as better and more comprehensive pilot phases and user acceptance testing.

Sunday, September 16, 2007

Traffic acceptance test

From PR newswire

INRIX, the leading provider of traffic flow information in the country, and Short Elliott Hendrickson Inc. (SEH(R)), a Midwest-based professional consulting services firm, today announced the system acceptance by the Wisconsin Department of Transportation (WisDOT) for INRIX to provide real time traffic flow data for nearly 250 centerline miles of US 41, I-43, and WIS 172. This milestone makes INRIX the first company to provide operational vehicle-probe based traffic data services to a government agency.
The process of procuring privately sourced GPS and/or cellular probe-based traffic data began approximately one year ago, in order for WisDOT to assess the utility and cost-effectiveness of such alternative traffic data sources in support of, or in place of, traditional ITS data collection. INRIX was able to fully meet the strict project requirements regarding data quality, precision and availability, which will allow the successful integration of this information within WisDOT's TOC.
"We are proud that INRIX and SEH, working closely with Wisconsin DOT and its team, have been able to verify satisfaction of the requirements set forth in the original RFP," said Rick Schuman, INRIX vice president, public sector. "It is a testimony to the commitment and quality of our team, as well as that of WisDOT and its support resources, to move from kick-off to approval in roughly six months, which included nearly two months of extensive acceptance testing. We are pleased to be the first company to provide operationally and under contract vehicle-probe based intercity traffic data services to a government agency, and the first to deliver a project of this scope within the original schedule."
Until now, these routes lacked traditional road sensors. With the INRIX Smart Dust Network, traffic information is now available by leveraging data from more than 250 public and private sources, including the world's largest network of GPS-enabled probe vehicles, and other data sources. INRIX's unique Traffic Fusion Engine uses advanced, proprietary statistical analysis to aggregate, blend and enhance traffic-related information, ensuring highly accurate road speeds and the broadest market and road coverage available. For the first time, commuters in Wisconsin will have access to INRIX real time traffic information that, in addition to WisDOT resources, is available on over 75 models of portable and automotive navigation devices, mobile phones and smartphones.
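INRIX's fusion engine is proprietary, so the article gives no detail. As a loose, hypothetical illustration of what "aggregating and blending" multiple probe sources can mean, the Python sketch below computes a confidence-weighted average speed for one road segment; the weighting scheme and all figures are invented.

    # Toy blend of speed reports from multiple sources for one road segment.
    # Confidence weighting is an invented stand-in for INRIX's methods.

    def fuse_speeds(observations):
        """observations: list of (speed_mph, confidence) pairs, confidence > 0."""
        total_weight = sum(conf for _, conf in observations)
        if total_weight == 0:
            return None  # no usable data for this segment
        return sum(speed * conf for speed, conf in observations) / total_weight

    # Say a GPS probe vehicle, a cellular probe and a roadside sensor report:
    reports = [(58.0, 0.9), (61.0, 0.6), (55.0, 0.3)]
    print(round(fuse_speeds(reports), 1))  # blended estimate: 58.5 mph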
Jim Hanson, PE, PTOE, principal, transportation services for SEH, said, "Providing accurate travel information helps keep our roadways and our travelers safe, and INRIX has provided the means to achieve real time traffic data to help accomplish this goal. The acceptance test results verified our strong belief from the initial proposal phase that INRIX data would satisfy the rigorous accuracy, timeliness and reliability requirements demanded by WisDOT, even outside of urban areas."

Saturday, August 11, 2007

Production flight acceptance testing

From United Press International

BAE Systems have announced details of its Block 5 Retrofit 2 upgrade program for the British Royal Air Force's Eurofighter.

"The maiden flight of the first Block 5 production aircraft has already demonstrated the benefits of continued improvements on the Typhoon development program," BAE Systems said.

“The aircraft performed exceptionally well, achieving a high level of performance. This production flight acceptance test was one of the best ever, achieving 90 percent of the planned schedule,” said Martin Topping, Typhoon final assembly operations manager.

"The Block 5 standard gives Typhoon full air-to-air and initial air-to-ground capability with full carefree handling," BAE Systems said.

Saturday, July 21, 2007

Top engineering firm to provide acceptance testing

Manufacturing computer reports that BP’s PTA plant expansion in China is to be automated using the Emerson PlantWeb digital architecture.

Emerson is to engineer and manage the automation project, using Foundation fieldbus technology for process control and data management, as well as OPC (OLE for Process Control) technology.

As main instrument vendor, Emerson will be responsible for detailed design, including engineering design and implementation.

“We are pleased at this new opportunity to combine our technology with the solutions skill of our Chemical Centre of Excellence and the engineering supervision and project management leadership in our Singapore office,” says John Berra, president, Emerson Process Management.

“We are encouraged by BP’s confidence in our ability to provide system design, configuration, staging, acceptance testing, commissioning, and start-up support. We are committed to delivering a world class digital automation system.”

Saturday, June 30, 2007

Insurance book conversion and acceptance testing

This article from Insurance Networking looks at converting a book of business to a new system.

Conversion can be viewed by some as the riskiest part of a new system implementation. However, with proper time and resources, migrations need not be feared. They are part of a system migration whose purpose is to enable the company to improve its ROI and competitiveness. Each company needs to assess the cost, time and benefits; if sufficient resources and time are allocated, success is achievable.

In a single-phase conversion, all data is converted at one time, with two options. Either sufficient history (a number of years) is converted to enable the discontinuance of the existing system, or only the latest versions of the data are converted to allow the new system to be used for all future transactions, with the existing system retained for a period of time for inquiries into past transactions.

The advantage of a single-phase conversion is that the required resources are used for the shortest time, and therefore at the lowest cost. Longer conversions run the risk of losing some resources, which then need to be replaced, leading to a lengthier task.

In a multi-phase conversion, data is segmented by line of business or territory. The initial phase will encompass the installation of the whole system but will include only a portion of the portfolio. This will include underwriting, billing, claims, reinsurance, document issuance, management reporting, financial reporting and bureau filing. Each module will be tested using a portion of the portfolio. The initial phase is comparable to a single-phase implementation as the whole system is tested.

Subsequent phases involve adding other parts of the portfolio. This mostly affects the underwriting system, to ensure proper rating, document issuance and so on. These are much smaller phases and can be achieved with fewer resources. The portfolio being added needs to be tested through all modules (billing, claims, reporting, etc.). However, as these modules have already been tested, the additional work amounts to tracing the new transactions through the system to ensure that they are processed correctly.

The multi-phase approach can be run in two ways. In the first, each phase is implemented, put through user acceptance testing and taken live as it is completed. Alternatively, each phase is completed up to user acceptance testing, and only after all parts of the portfolio have been completed is the system put into live production.
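A minimal sketch of that multi-phase sequencing, with invented lines of business and placeholder module checks (none of this comes from the article):

    # Hypothetical multi-phase conversion driver: the portfolio is segmented
    # by line of business and each segment is traced through every module.

    MODULES = ["underwriting", "billing", "claims", "reporting"]
    PHASES = [["property"], ["casualty"], ["marine", "specialty"]]  # invented

    def convert_segment(line_of_business):
        # Stand-in for extracting, transforming and loading one segment.
        print(f"converting {line_of_business} policies")

    def trace_through_modules(line_of_business):
        # Phase 1 proves the modules themselves; later phases only need the
        # new transactions traced through each already-tested module.
        for module in MODULES:
            print(f"verifying {line_of_business} transactions in {module}")

    for phase_number, segments in enumerate(PHASES, start=1):
        for lob in segments:
            convert_segment(lob)
            trace_through_modules(lob)
        print(f"phase {phase_number}: user acceptance testing")
        # Go live here (option one), or defer go-live until every phase
        # has passed user acceptance testing (option two).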

Sunday, June 03, 2007

UAT from Princeton

Some advice from Princeton on user acceptance testing.

"In this phase, the system is fully tested by the client community against the requirements defined in the analysis and design stages, corrections are made as required, and the Production system is built. "

It contains an outline of the deliverables (along with templates for some of them, such as the user acceptance sign-off template). It also contains a 10-step approach for user acceptance testing (UAT). It is part of the overall Princeton development method for constructing information technology systems and applications.

Saturday, March 31, 2007

UAT and how IT works

From Find out how “IT” works.

In this case, “IT” is not limited to information technology, but whatever “it” is – the requirements process, the project management process, the testing process, etc. The better you as a customer understand “it,” the more effective you will be in making “it” work for you.

Donna Corrigan from CoBank suggests: “When defining requirements, stay focused on value-added business requirements, not technology. Understand the user acceptance testing (UAT) schedule early on, and clear the calendars of UAT business staff during the UAT period.”

Bill Yock from The University of Washington says: "The most important thing a customer of a BI project should insist on is "World Class" training. Incorporate training on both the meaning of the data and the use of the business intelligence tools. Many BI projects invest too much in the ‘back room’ data loading and integration of warehouses and marts, and not enough in the ‘front door’ … educating and informing users on how to interpret, analyze and understand their information. One of our mission statements was ‘Free the Data.’ Unfortunately, once the data was free, nobody understood it. This led to lack of confidence in the ‘numbers’ and adoption of the BI tool hit a brick wall."

To be a good BI customer, figure out what “it” is and how you can proactively influence and invest time in requirements, data models, prioritization, governance, testing, training, etc. Don’t allow yourself to be so dependent that you passively sit back and wait for “IT” to make everything happen.

Saturday, March 24, 2007

Testing contactless cards

From Contactless News

Taco Bell has become the latest national fast food chain to enter the contactless arena. It will be testing Visa contactless cards at 100 locations later this year with an eye towards implementing the technology at all 1,300 locations in 2008. The credit card giant also announced that a movie theater, a northeast supermarket operation and an ice cream chain are now accepting Visa contactless cards.

Visa USA announced that four significant national and regional merchants are the latest to accept Visa cards with embedded contactless payment technology, allowing consumers to use the convenience of this innovative technology to buy food, groceries and movie tickets at more than 3,000 new acceptance locations.

The four companies – Braum's Ice Cream and Dairy Stores; Clearview Cinemas; Wakefern Food Corporation, the merchandising and distribution arm of ShopRite supermarkets; and Taco Bell Corp. – are emblematic of the kind of merchants who have helped to make Visa Contactless cards the most rapidly adopted new payment technology in Visa’s history.

Visa Contactless technology enables customers to make purchases by simply holding their Visa Contactless cards near a secure reader at checkout instead of swiping them. The account information is securely transmitted by a tiny radio frequency chip embedded in the card.

Visa Contactless eliminates the signature requirement for most payments under $25, providing even greater convenience to customers making small-ticket purchases. In addition, Visa Contactless maintains the level of security the industry has come to expect from Visa. A Visa card with the contactless feature never leaves the customer’s hand and can only be read in extremely close proximity to a contactless reader. In addition, the chip embedded in Visa Contactless cards generates data that is unique to each transaction, ensuring the same data cannot be used for fraudulent contactless transactions.
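The article does not explain how that per-transaction data is produced. A common pattern in contactless payment schemes generally is a cryptogram computed over an on-card transaction counter with a secret key, so a replayed value fails verification at the issuer. The Python sketch below illustrates only that generic idea; it is not Visa's actual algorithm, and the key and field sizes are invented.

    # Generic illustration of per-transaction dynamic data: an HMAC over an
    # incrementing counter. NOT Visa's scheme; key and sizes are invented.
    import hashlib
    import hmac

    CARD_KEY = b"secret-shared-with-issuer"  # hypothetical per-card key

    def card_cryptogram(counter):
        # The card signs its transaction counter, yielding data that is
        # unique to each transaction.
        return hmac.new(CARD_KEY, counter.to_bytes(4, "big"),
                        hashlib.sha256).hexdigest()[:8]

    def issuer_verify(counter, cryptogram, last_seen):
        # Reject stale counters (replays) and mismatched cryptograms.
        if counter <= last_seen:
            return False
        return hmac.compare_digest(card_cryptogram(counter), cryptogram)

    c = card_cryptogram(41)
    print(issuer_verify(41, c, last_seen=40))  # True: fresh transaction
    print(issuer_verify(41, c, last_seen=41))  # False: replayed data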


Saturday, February 24, 2007

Business Acceptance Testing

Here is a link to an article on Business Acceptance Testing (BAT) at Tech Republic.
It covers the Acceptance Test Plan and identifies the key elements as:

  • Customer responsibilities: Describe the activities the customer is responsible for and what others are responsible for. On some projects, the customer is responsible for all aspects of the test, including creating test scenarios, performing the tests and validating the results. On other projects, the entire team may assist in the various activities.
  • Acceptance criteria: Before you begin the acceptance test, the team should know what criteria will be used to decide whether the system is acceptable. Does the system have to be perfect? You'd better hope not. But the acceptance criteria should define how the decision will be made (a sketch of such criteria follows this list). The customer may accept a system with certain minor types of errors remaining, but there may be other levels of errors that will render the system unacceptable. Part of the acceptance criteria may be to revalidate some of the other system tests. For instance, the customer may want to thoroughly test security, response times, functionality, etc., even if some of these tests were done as part of system testing.
  • Acceptance Test Workplan: Here, you define the activities associated with the acceptance test, when they will begin and end, who is responsible, and so on. In other words, you define the workplan for the test. Defining all this up front will help people understand what is expected of them, and they'll be aware of the timeframe and know when things are due. You should move all this information to the main project workplan.
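Acceptance criteria of the kind described in the second bullet can be stated precisely enough to evaluate mechanically. The sketch below is hypothetical: the severity thresholds are invented examples of "certain minor types of errors" being tolerable while others block sign-off; real thresholds would be whatever the customer agreed in the test plan.

    # Hypothetical acceptance gate: open-defect limits per severity.
    # The numbers are invented; agree real ones before testing begins.
    THRESHOLDS = {"critical": 0, "high": 0, "medium": 3, "low": 10}

    def acceptable(open_defects):
        """open_defects: dict mapping severity to count of unresolved defects."""
        return all(open_defects.get(severity, 0) <= limit
                   for severity, limit in THRESHOLDS.items())

    print(acceptable({"medium": 2, "low": 7}))  # True: within agreed limits
    print(acceptable({"critical": 1}))          # False: blocks sign-off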

Wednesday, January 24, 2007

Quality movement: has it lost its way?

This post is slightly different from the usual posts on UAT in that it looks at a close relation: Quality Management. This is an extract from a report on the Quality movement by A. Kumar, which makes a case for re-evaluating it:

We can have perfect systems and procedures for quality, but if the employees are not willing to follow them, the whole system will fail to deliver. All companies in the services sector are victims of this syndrome. When the quality movement started, some shady agencies deceived their clients by promising that a quality assurance system, a quality certification and a quality assurance manager could ensure the quality of the output. A QA system is only as good as those who implement it, and no company with sub-standard employees can provide standard services, even with an ISO-certified QA system and an expert QA manager in place.

The most significant development in the rush for quality has been the unprecedented growth in the number of certification agencies. From a few international players, the number has grown into the hundreds the world over. The initial few had a bountiful harvest when the rush started, and that provided an ugly model for numerous "operators" to try their hand at a low-investment, high-return business opportunity. The simple fact that most of the new certification agencies have their principals in Europe, and that it was the European Union that triggered the quality juggernaut, makes the whole exercise suspect.

The biggest beneficiary of this erroneous concept of quality assurance has been a set of professionals masquerading as quality assurance managers (QAMs) in ISO-certified companies. They first descend on their victims (desperate for ISO certification) in the form of experts or consultants willing to assist in getting the initial certification. To their credit, they do a useful job at this stage. In most companies, there is a severe shortage of expertise for documenting the procedures followed in a systematic manner. This vacuum is easily filled by the so-called ISO experts and consultants. After marathon sessions with the employees, an acceptable and auditable quality system is set up. Then these quality consultants (sometimes in the form of a full-fledged company) spend months conducting mock audits, gathering feedback and making corrections, until finally the whole system is ready to be offered to an accredited external agency for certification. It is always up to these ISO consultants to conclude their deal by "arranging" a smooth audit and certification as per ISO norms.

The dubious and redundant aspect of quality management starts from this point. In order to maintain the certification and withstand the periodic surveillance audits of the external agency, most companies are only too willing to accommodate these consultants or experts in the form of a quality assurance manager. ISO guidelines have made it easier for these "redundant" managers to report directly to the chief executive, in order to keep "quality" a high-level and independent function in the organisation. And that makes it easier for these climbers to stay on forever. Most of the time, their work is reduced to arranging "facilities" for the external auditors and "taking care" of them during the audits. That neither of these efforts necessarily contributes to the quality of the host organisation is a simple fact that escapes the attention of chief executives. In fact, they are counterproductive and retrograde. In the long run, almost all such companies end up uneconomical, thanks to their misplaced overemphasis on quality rather than profit. Quality is meaningful only in a profit-making company, and all efforts towards it must be subordinate to the ultimate objective of making profit.