- Analytical skills: Ability to analyse and synthesize information to understand issues and identify options.
- Influencing and negotiation skills: Ability to promote ideas persuasively; shapes stakeholder opinions and negotiates skilfully to create the best outcome possible; negotiates win-win solutions.
- Business understanding: Ability to know the business and the mission-critical technical and functional skills needed to do the job; understands various types of business propositions and understands how businesses operate in general; learns new methods and technologies easily. Understands the organization’s business model and competitive position in the marketplace. Understands potential for growth and profitability.
- Managing work processes: Ability to effectively design work flow, and systems; is good at figuring out what to measure to track progress; sets up systems that can almost manage themselves; is a master at the effectiveness and efficiency of work systems; can quickly diagnose and fix a work flow problem; always looking for incremental process improvement.
- Organising & Planning: Ability to plan, organize and prioritize work, balancing resources, skills, priorities and time-scales to achieve objectives.
- Project Management: Ability to bring projects to completion eliciting the collaborations of interdepartmental team members.
- Plan for contingencies: Ability to proactively identify potential problems and create contingency plans or workarounds to implement if problems occur.
- Industry Knowledge: Ability to know what it takes to be successful in this industry; have thorough knowledge of this industry’s history, customers, and competitive environment.
- Active Listening: Ability to give full attention to what other people are saying, taking time to understand the points being made, asking questions as appropriate, and not interrupting at inappropriate times.
- High Impact Delivery: Ability to deliver clear, convincing, and well-organized presentations; project credibility and poise even in highly visible, adversarial situations.
- Communicating Effectively: Ability to write and present effectively; adjusts to fit the audience and the message; strongly gets a message across.
- Detail Oriented: Ability to be well organized and resourceful; can reduce a complex concept or task into something that is manageable and clearly understood; gets things done with less and in less time; can work on multiple tasks at once without losing track; foresees and plans around obstacles.
- Focusing on Action and Outcomes: Ability to attack everything with drive and energy with an eye on the bottom line; not afraid to initiate action before all the facts are known; drives to finish everything he/she starts.
- Keeping on Point: Ability to quickly separate the mission-critical from the nice to dos and the trivial; quickly senses what's the next most useful thing to work on; focuses on the critical few tasks that really add value and puts aside or delays the rest.
- Operates with Integrity: Ability to demonstrate honesty and behave according to ethical principles; ensures that words and actions are consistent; behaves dependably across situations.
- Time Management: Ability to manage one's own time and the time of others.
- Motivating & Delegating: Ability to manage people well; gets the most and best out of the people he/she has; sets and communicates guiding goals; measures accomplishments, holds people accountable, and gives useful feedback; delegates and develops; keeps people informed; provides coaching for today and for the future.
- Inspiring Others: Ability to get individuals, teams and an entire organization to perform at a higher level and to embrace change; negotiates skilfully to achieve a fair outcome or promote a common cause; communicates a compelling vision and is committed to what needs to be done; inspires others; builds motivated, high-performing teams; understands what motivates different people.
- Leads by Example: Ability to serve as a role model for the organization’s values; takes responsibility for delivering on commitment; gives proper credit to others; acknowledges own mistakes rather than blaming others.
- Leadership: Ability to use personal skills to guide and inspire individuals/groups towards achieving goals.
- Treats people fairly: Ability to treat all stakeholders with dignity, respect and fairness; listens to others without prejudging, objectively considers other people’s opinions.
- Judgement and Decision Making: Ability to consider the relative costs and benefits of potential actions to choose the most appropriate one.
- Team player: Ability to manage strong working relationships within the company. Able to complete tasks and work cooperatively with others.
- Adaptability: Ability to respond and adapt to changing circumstances and to manage, solve problems and provide solutions in a climate of ambiguity.
- Maximizes learning opportunities: Ability to actively participate in learning activities in a way that makes the most of the learning experience, such as taking notes, asking questions and critically analysing information; remains open to unplanned learning opportunities such as coaching from others.
- Applies new knowledge or skill: Ability to use new knowledge, understanding, or skill to practical use on the job.
- Flexibility: Ability to work in a fast paced environment.
- Computer skills: Possesses intermediate to advanced Microsoft Suite Knowledge (Word, Excel, PowerPoint) and Microsoft Project /Visio.
Friday, November 18, 2011
Wednesday, November 02, 2011
In recent times there has been a noticeable development in the body of knowledge around the concept of “Automated Acceptance Testing” as described by Extreme Programming. In particular, frameworks have begun to evolve that address the needs of automated acceptance testing. However, most if not all of these frameworks are in their infancy. The concept behind this workshop is to draw on participants' experience to identify the features of such a framework were it to be fully mature, i.e. the Ultimate Automated Acceptance Testing Framework.
Wednesday, October 26, 2011
There are 72,490 resources related to performance testing and, amazingly, 284,074 resources related to testing! Hope these are useful resources.
Wednesday, May 04, 2011
A good set, really. I would say effective acceptance testing contributes to all four of them (especially the UAT one, of course).
For requirements definition it said: "Strong requirements definition is a must, but requirements also must adapt. Business requirements are rarely the same at the end of a project as they were at the outset due to changes in the market, customers, regulations and corporate leadership."
This is true of acceptance testing requirements definition. People often forget that the business acceptance criteria change, and that what may have been acceptable at the start of a programme is no longer suitable even though it perfectly matches the acceptance criteria of the delivery project.
For the User acceptance it said "IT is about empowering people. IT tools won’t work if users don’t adopt them. It is important to involve the user community from any program’s earliest stages and to perform acceptance testing that is measurable and encourages open communication."
This is true for the user acceptance test. I have heard of user acceptance tests that involved no users; that is not a desirable approach.
Sunday, April 24, 2011
Anyway the topic looked at the long term benefits of acceptance testing (and to a degree regression testing too). They were benefits such as more cost-effective support, better positioning for business change and lower dependency on individuals.
All of these are good benefits. I think that to some they may be unexpected, but to everyone with a long-term interest in acceptance testing it is preaching to the choir. Still, I think the more people make a noise about the benefits of acceptance test activities, the more people will adopt good practices in this area.
Friday, April 22, 2011
In a migration, you should consider several elements to form the questions you will use in the acceptance test. First, you need to decide whether the migration will be done while the data is active or not. If you're planning to migrate while the data is not active, then you need to consider:
- After the migration, compare the data to be sure it transferred accurately.
- Are there pacing controls so the migration can have fewer resources dedicated to it, letting more important jobs finish?
- Can the migration be interrupted and then resumed without starting over?
- What configuration changes are required? Can the transition to the migrated device be done seamlessly?
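The first of these checks, comparing the data after the migration, is a natural candidate for automation. A minimal sketch in Python, assuming both copies of the data are SQLite databases (the table and column names in the usage example are purely illustrative):

```python
import sqlite3
import hashlib

def table_checksum(conn, table):
    """Order-insensitive fingerprint: hash each row, XOR the digests together."""
    acc = 0
    for row in conn.execute(f"SELECT * FROM {table}"):
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def verify_migration(source_db, target_db, tables):
    """Compare row counts and content fingerprints for each migrated table."""
    src, dst = sqlite3.connect(source_db), sqlite3.connect(target_db)
    for table in tables:
        s_count = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        d_count = dst.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        assert s_count == d_count, f"{table}: row count {s_count} != {d_count}"
        assert table_checksum(src, table) == table_checksum(dst, table), \
            f"{table}: content differs"
    return True
```

A real migration check would also compare schemas, indexes and constraints, but a count-plus-checksum pass like this catches the most common transfer errors cheaply.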
Wednesday, April 06, 2011
Strange thing is, it seems to be an aeroplane kind of thing. The first one I found was from Boeing, about how they have completed Site Acceptance Testing of a command and control system. The next one was about Aerojet completing acceptance testing of a monopropellant rocket engine. And then there was another about Airtel, a company that develops communications software for the aeronautical telecommunications network, which had completed acceptance testing of NAV Portugal’s Data Link Server.
I wonder why the aeronautical industry is so keen to tell us about its acceptance testing? Or maybe we should be asking why other business sectors are so reluctant to release news about acceptance testing.
Tuesday, December 01, 2009
SATs (System Acceptance Tests) are high-level tests intended to assure the completeness of a user story. User story acceptance tests are written to assure that each increment of software is delivering value, and that the project is progressing in a way that will ultimately satisfy the needs of the business owner. Generally:
- They are written in the language of the business domain (they are business facing tests from quadrant 2).
- They are developed in a conversation between the developers, testers, and product owner.
- While anyone can write tests, the product owner, as business owner/customer proxy, is the primary owner of the tests.
- They are black box tests in that they only verify that the outputs of the system meet the conditions of satisfaction without concern for how the result is achieved.
- They are carried out.
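The characteristics above can be made concrete as an executable, business-facing, black-box check. A minimal sketch in Python for a hypothetical user story (the story, the function name and the figures are invented for illustration, not taken from any real system):

```python
# Hypothetical user story: "As a customer, orders over £100 receive a 10% discount."

def order_total(subtotal):
    # Stand-in for the real system under test; the acceptance test below
    # knows nothing about this implementation, only its inputs and outputs.
    return subtotal * 0.9 if subtotal > 100 else subtotal

def acceptance_test_discount():
    # Conditions of satisfaction agreed in conversation with the product owner:
    assert order_total(100) == 100   # at the boundary: no discount
    assert order_total(200) == 180   # over £100: 10% off
    assert order_total(50) == 50     # small orders unchanged
    return "pass"
```

Note that the test is written in the language of the business domain (boundaries and discounts), not in terms of classes or database rows, which is what makes it an acceptance test rather than a unit test.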
Monday, September 14, 2009
First up are some trial exam questions for the foundation level of the ISTQB (which, incidentally, is the same as the ISEB foundation level, although the two exam boards' qualifications in software testing differ at all higher certification levels).
2.4 (K1) Which of the following is NOT part of acceptance testing?
a) Integration testing
b) Performance, load and stress testing
c) Requirements-based testing
d) Security testing

2.5 (K1) Who should have the main responsibility for acceptance testing?
a) The operations group
b) The customer
c) The programmers
d) Usage specialists

2.6 (K2) What is the main difference between system testing and acceptance testing?
a) System testing concentrates on functional testing, while acceptance testing concentrates on non-functional testing.
b) Acceptance testing is a regression test for the changes implemented during system testing.
c) System testing is done against the developers’ interpretation of the requirements, acceptance testing against customer understanding.
d) System testing is done on the development platform, while acceptance testing is done on the customer platform.
There are more software testing questions for the ISTQB foundation exam here.
There are also a couple of acceptance testing news items. One is about the AT&T Onyx, which is going into field testing and then its final technical acceptance test. The story is here. The other is about Vodafone, Verizon and the Storm 2 launch. It seems that Verizon wants additional time to do technical acceptance testing and to get as many bugs out as possible before launch. It is interesting that the acceptance test is being viewed as a quality improvement step rather than "formal testing conducted to enable a user, customer, or other authorised entity to determine whether to accept a system or component". (More on acceptance testing services.)
Both the Onyx and Storm stories are from Fudzilla.
Sunday, August 23, 2009
To set the scene it looks at an apparent conflict between quality and agility in development.
"Organizations don't have to sacrifice quality to obtain agility. They must rethink what they mean by quality in the SOA context. As ZapThink has discussed before when we explained the meta-requirement of agility and the SOA agility model, the agility requirement for SOA vastly complicates the quality challenge, because functional and performance testing aren't sufficient to ensure conformance of the SOA implementation with the business's agility requirement.
To support this agility requirement, therefore, traditional pre-deployment acceptance testing is impractical, as the acceptance testing process itself impedes the very agility the business requires. Instead, quality becomes an ongoing process, involving continual vigilance and attention. Quality itself, however, need not suffer, as long as the business understands the implications of implementing an agile architecture like SOA."
Tuesday, July 28, 2009
Well, since this supercomputer exists, you don't have to imagine this. The answer is $21.4M. It can run over 160 trillion calculations a second. That's more than 13,824,000,000,000,000,000 calculations every day (160 trillion × 86,400 seconds). Fast enough to get it into the list of the fifty fastest computers in the world.
It's a bespoke HP computer and it's undergoing acceptance testing by researchers at the EMSL. Now that is one acceptance test I'd like to be on. I wonder how you get to be a researcher who tests at EMSL.
Monday, July 13, 2009
For the full story go here.
Sunday, June 21, 2009
An independent software testing company has identified a strong demand for Software Planner in the UK despite the current economic recession. Software Planner is an award winning application lifecycle management (ALM) tool that helps organisations manage all components of the Software Development Life Cycle (SDLC). This includes managing project deliverables, requirements, defects, test cases, test execution and help desk support. It also provides collaborative tools and interactive reporting dashboards to support change programmes spread across teams, locations and organisations.
Monday, March 23, 2009
Yet these users have two handicaps: they are very rarely trained testers, and they often have a business-as-usual job to run while they do the testing as well. This means they have little time to spare for testing, and what time they have is not spent as effectively as they would like. It is a double whammy that often limits the value of acceptance testing SAP implementations.
This course on SAP acceptance testing is designed to address that very problem. It is also suitable for many of the other participants in SAP UAT, such as business analysts, SAP implementers and acceptance test managers.
Saturday, March 21, 2009
All businesses would wish to implement strategic IT projects more quickly, economically and surely. Above all, they need to eliminate the risk either of failure or of the project taking so long that the business needs no longer match the aims of the project. There are two important trends which will help: Agile methodology and ‘Software as a Service’ (SaaS).
The Agile approach to IT implementation offers companies a quicker and surer means of delivery. Agile delivers business benefits earlier and, unlike PRINCE2 or Waterfall, allows feedback from the end users without having to wait for user acceptance testing.
Prioritised requirements can be bundled into short “Sprints” of work during which analysis, design/build and software testing are executed in rapid succession. In this way Agile provides a platform for projects to be tested frequently against business aims and for new system functionality to be demonstrated to end users along the way. This enables the business and IT provider to work closely together, with stakeholders involved throughout the project.
Monday, January 26, 2009
"Actional Diagnostics can detect issues early in the software development lifecycle such as policy compliance; in addition, software developers can use the product to quickly understand whether a service fits their needs before any code is written. Once services are identified for use, developers can inspect, invoke, test, and create simulations for prototyping and problem solving.
Actional Diagnostics is currently in beta testing with customers and will soon be available as a free download. Developers wishing to be sent an alert when the software download is available can register at: http://www.progress.com/web/global/alert-actional-diagnostics/index.ssp.
Actional Diagnostics helps application developers, client developers and consumers meet the demands of aggressive development schedules by providing code-free problem resolution, deep visibility into service behavior and performance, quality and compliance validation. Existing Mindreef customers will continue to benefit from SOAPscope features including early service simulation, design-time policy validation, and unit, functional, regression, and acceptance test. "
Thursday, January 01, 2009
Laurel Hill GIS has announced the release of GeoData Sentry 3.0. GeoData Sentry is an off-the-shelf automated QA/QC software product for ArcGIS geodatabases. GeoData Sentry 3.0 features a rich set of attribute, spatial and logical connectivity tests. GeoData Sentry's easy to use configuration interface allows for rapid test development and automatic test generation. GeoData Sentry's error reports are designed for ease of use with coordinated HTML files including summary metrics and details of individual errors.
"As GIS becomes mainstream within the enterprise, and organizations serve their data on the web, GIS data quality becomes critical. Advanced analytical uses require high quality GIS data. GeoData Sentry provides a platform for rapidly assessing the overall quality of the geodatabase, whether used for data acceptance testing, data migration testing or on-going data assessment and audits. This version 3.0 release allows for more geodatabase connection options, giving a wide range of geodatabase users the ability to quickly, easily and cost effectively assess the quality of their data." states Matt McCain, Vice President of Laurel Hill GIS.
Saturday, July 12, 2008
Also new in Enterprise 6.1, which is available now, are new licensing terms and automated cell planning. The software is faster than previous versions and can be adapted to upcoming LTE networks, officials said.
Other aspects of the upgraded software include improved speed and reporting tools. Several features work faster and are more automated than in previous versions. Developers get an application programming interface through the WebWizard module.
Aircom testing software, including backhaul and capacity planning tools for WiMAX, is used in 36 projects worldwide, the company said.
Saturday, May 31, 2008
As base-station antennas become more complex and the performance metrics more critical and challenging, acquiring and evaluating these parameters with minimal cost impact is essential. Furthermore, the ability to mine these test results for the opportunity to improve yields and/or performance is invaluable.
Consider the test requirements on a dual-polarized, tri-band antenna. In this case, the manufacturer must demonstrate return loss on all 6 ports and the isolation performance between all ports. The result is 21 s-parameter measurements on one antenna: 6 return-loss measurements plus 15 port-pair isolation measurements. Additionally, the Passive Intermodulation (PIM) must also be verified for each of the 6 antenna inputs.
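The measurement count follows from simple combinatorics: one return-loss measurement per port, plus one isolation measurement for every unique pair of ports. A quick sketch in Python:

```python
from itertools import combinations

ports = 6
return_loss = ports  # one reflection measurement (S11, S22, ...) per port
isolation = len(list(combinations(range(ports), 2)))  # one per unique port pair
total_s_params = return_loss + isolation  # 6 + 15 = 21 s-parameters per antenna
# PIM is verified separately on each of the 6 inputs, on top of these.
```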
Using the Summitek Spartan software, the test procedure and acceptance criteria for all of these tests can be predefined by engineering and then served out to each client workstation at manufacturing cells anywhere in the world. This assures that all sites are testing the same way each and every day.
Sunday, February 03, 2008
The reasoning behind independently testing municipal Wi-Fi networks was covered in the first part of this tutorial series. In that tutorial, we also discussed the three main surveying and testing phases—Pre-Installation Survey, Pilot Test, and Acceptance Test. Additionally, we outlined the information and documents that need to be gathered for each phase. In this part of the tutorial series, we will discuss which tools the independent testing team needs and how to perform a Pre-Installation Survey.
Sunday, December 23, 2007
The primary objectives of preceding test phases (unit, integration and systems testing) are those of defect identification and ensuring that each step in the process of building the software delivers working products (verification).
The objective of Acceptance Testing is to give confidence that the software being developed, or changed, satisfies the specified requirements, meets customer/user expectations and is fit for business purpose (validation). Unlike the other test phases, an objective of acceptance testing is not to actively look for faults. The expectation should be that the preceding stages of testing have identified and resolved software faults and that the system is ready for operational use.
Clear entry criteria may need to be put in place to ensure that the expectation of a system being ‘production ready’ has been met. This is an opportunity to review the effectiveness, completeness and outcome of previous test phases and to declare any known problems that remain prior to acceptance test commencing.
As a testing phase with its own objectives, acceptance should not seek to duplicate other test phases. However, it may repeat previously run test scenarios to provide confidence that preceding test phases have been completed as appropriate. Those responsible for acceptance testing should dictate the objectives and coverage of the test phase. In doing so they are likely to consider the following:
- Have the acceptance test conditions been demonstrated?
- Do the test conditions model realistic business scenarios?
- Is there appropriate coverage of each business function?
- Is there appropriate coverage of system reports and screens?
- Is there appropriate coverage of application menu options?
- Are updated training materials and system documentation subject to test?
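Coverage questions like these are often answered with a simple traceability matrix mapping business functions to acceptance test conditions, with gaps flagged for follow-up. A minimal sketch in Python (the function names and test condition IDs are hypothetical):

```python
# Hypothetical traceability matrix: business function -> acceptance test conditions.
coverage = {
    "raise invoice":       ["AT-01", "AT-02"],
    "approve credit note": ["AT-03"],
    "month-end report":    [],  # gap: no test condition agreed yet
}

def uncovered(matrix):
    """Return the business functions with no acceptance test condition."""
    return sorted(f for f, tests in matrix.items() if not tests)

print(uncovered(coverage))  # prints ['month-end report']
```

Even a spreadsheet version of this matrix gives those responsible for acceptance a quick, reviewable answer to "is there appropriate coverage of each business function?".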
Where faults are uncovered during acceptance, temporary fixes should first be investigated before attempting to make changes to the software. However, it may be necessary to agree the correction of high severity/priority problems and to defer/timetable the correction of others before acceptance can occur.
In order to minimise the risk of software faults and system changes, acceptance should not just start at software delivery. It needs to be a function of the earliest design and development stages. Early involvement of users in the system requirements and design stages can minimise the risk of gaps and misinterpretations occurring in end system functionality. The benefits reaped at later stages can be the avoidance of re-work and the reduced cost of system delivery.
Saturday, December 01, 2007
Wikipedia describes acceptance testing as:
"... is black-box testing performed on a system (e.g. software, lots of manufactured mechanical parts, or batches of chemical products) prior to its delivery...
"In most environments, acceptance testing by the system provider is distinguished from acceptance testing by the customer (the user or client) prior to accepting transfer of ownership...
"A principal purpose of acceptance testing is that, once completed successfully, and provided certain additional (contractually agreed) acceptance criteria are met, the sponsors will then sign off on the system as satisfying the contract (previously agreed between sponsor and manufacturer), and deliver final payment."
This is consistent with the definition Cem Kaner uses throughout his books, courses, articles and talks, which collectively are some of the most highly referenced software testing material in the industry. The following definition is from his Black Box Software Testing course:
"Acceptance testing is done to check whether the customer should pay for the product. Usually acceptance testing is done as black box testing."
Another extremely well-referenced source of software testing terms is the International Software Testing Qualifications Board (ISTQB) Standard glossary of terms. Below is the definition from Version 1.3 (dd. May, 31, 2007):
"Formal testing with respect to user needs, requirements, and business processes conducted to determine whether or not a system satisfies the acceptance criteria and to enable the user, customers or other authorised entity to determine whether or not to accept the system."
I've chosen those three references because I've found that if Wikipedia, Cem Kaner and the ISTQB are on the same page related to a term or the definition of a concept, then the testing community at large will tend to use those terms in a manner that is consistent with these resources. Acceptance testing, however, is an exception.
There are several key points on which these definitions/descriptions agree:
- In each case, acceptance testing is done to determine whether the application or product is acceptable to someone or some organisation and/or if the person or organisation should pay for the application or product AS IS.
- "Finding defects" is not so much as mentioned in any of those definitions/descriptions, but each implies that defects jeopardise whether the application or product becomes "accepted."
- Pre-determined, explicitly stated, mutually agreed-upon criteria (between the creator of the application or product and the person or group that is paying for or otherwise commissioned the software) are the basis for non-acceptance.
Wikipedia refers to this agreement as a contract, and identifies that non-compliance with the terms of the contract as a reason for non-payment. And Kaner references the contract through the question of whether the customer should pay for the product.
The ISTQB does not directly refer to either contract or payment but certainly implies the existence of a contract (an agreement between two or more parties for the doing or not doing of something specified).
Saturday, November 17, 2007
Seventy percent of all software projects fail outright. When considering the other 30% of "successful" projects, you find many cases where the organisations involved merely declared victory and shut everything down.
But why do projects fail? Missing, miscommunicated, or misunderstood requirements are by far the biggest reasons. Hopefully I've given you some tools to help make these problems go away.
A close second to all this? Lack of end-user acceptance. Why? Because rarely are front-line troops—the ones who will use your application—ever invited to stakeholder meetings. They never get to give any input. Their managers or outside consultants are the only ones you ever meet. Unfortunately, these people don't have the kind of visibility actual users of your system have.
Ignore the end user at your own peril! Most of the time, the stakeholders you're dealing with don't know you need to talk to end users at all. You're the expert, not them! Even if they present you with reams of documents in which they've covered every conceivable request from end users, request a tour of the office or plant floor. Meet some end users. Take them out to lunch. Talk to them early, see what's bugging them.
Spend an afternoon just hanging out watching what happens. How do their processes get them through the day? Where do their processes get in the way? Where do they put in short cuts to avoid bureaucratic nightmares? Does person A from this department do small favours for person B in finance to expedite work orders?
If you ignore the end user at the beginning of the process, if you don't talk to them upstream, you certainly will be talking to them at the end of the process, when you deliver the final application for acceptance testing and/or training.
Guess what? Those stakeholders who didn't give you access to the end users early on will be asking those end users at the end of the project what they think of the new application, and you'll be judged by what they say. Fair? No. But you didn't ask, and they didn't have that kind of visibility into your expertise.
Remember, no one can better verify that you're on the right track than an end user who is an expert on a particular procedure or workflow.
Saturday, November 03, 2007
The Election Assistance Commission has issued six new sets of guidelines for election officials to help them with the chores of managing local elections.
The guidelines, several of which address technical issues, are part of a series of Quick Start Management Guides being produced by the commission. They are available online and also are being distributed as brochures to election officials. The commission plans to issue two new Quick Start brochures by the end of the year on military and overseas voting, and developing an audit trail.
The guide on acceptance testing explains what an acceptance test is and covers how to conduct a test for optical-scan ballot scanners, touch-screen voting stations and election management systems. A guide for contingency and disaster planning includes information on developing a management plan to address emergency and crisis response. It also covers developing a continuity-of-operations plan for elections and contingency planning for countywide events and internal operations.
Other guidelines cover absentee voting and voting by mail, managing change in an election office, media and public relations, and selection and preparation of polling places and vote centers.
To request copies or for more information on Quick Start Management Guides, call EAC at (866) 747-1471 or e-mail HAVAinfo@eac.gov. For information on EAC's voting system certification program, including test labs, registered voting system manufacturers, voting systems that have been submitted for testing, test plans, notices of clarification and other program-related information, visit http://www.eac.gov.
Saturday, October 27, 2007
The review was highly critical of the fact that "this technical problem had not occurred during the testing activities undertaken by DRS or by the Project Board".
It found there were a number of small differences in the configuration of the system for the test and for the actual count that were significant to its operational ability to scale.
For example, the most significant of these differences was that the test configuration only had information relating to the set-up at a single count centre, with 50 entries in the 'contest' table, which related to the total number of wards and constituencies to be counted at each centre. But the final configuration used at each count centre contained information relating to all count centres for resilience purposes, which resulted in about 550 entries in the 'contest' table.
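The general lesson here is that acceptance test results are only trustworthy if the test configuration matches production scale. A hedged sketch of such a pre-sign-off check in Python (the function, the tolerance and the row counts are illustrative, not taken from the review):

```python
def scale_gaps(test_counts, prod_counts, tolerance=0.1):
    """Return tables whose test row count falls short of production scale.

    test_counts / prod_counts map table names to row counts; a table is
    flagged when its test count is below (1 - tolerance) of production.
    """
    return sorted(
        table for table, prod in prod_counts.items()
        if test_counts.get(table, 0) < prod * (1 - tolerance)
    )

# The Scottish count's 'contest' table held ~50 rows under test but ~550
# in the live configuration, a gap a check like this would have flagged.
gaps = scale_gaps({"contest": 50}, {"contest": 550})
```

Running a comparison like this against the production configuration before acceptance sign-off turns "the test environment looks representative" from an assumption into a verified entry criterion.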
The Commission also said the 32 Returning Officers individually responsible for election administration in their constituencies found themselves "swimming upstream" against the introduction of the new technology.
And it said the appointment of one contractor to provide the electronic counting system, using the same technology, equipment and software across the whole of Scotland made it difficult for regionally dispersed Returning Officers to take effective remedial action when problems occurred.
Contractor DRS welcomed the publication of the review. It said it fully supported "Gould's proposed review of legislation and political involvement to ensure that e-counting technology is properly integrated into the electoral process".
DRS also told IT PRO that, following the May elections, it has completed a full evaluation of its e-counting system's performance and of the issue experienced.
It has subsequently extended its software with new automated features: updating statistics, rebuilding indexes and recompiling stored procedures before each count starts. User-level monitoring and early warning of database performance issues arising during the count, with user-level capability for corrective action, have also been added.
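The maintenance steps DRS lists are standard relational-database housekeeping. The sketch below shows the first two (a statistics refresh and an index rebuild) using Python's built-in sqlite3 module, chosen purely for illustration: SQLite supports ANALYZE and REINDEX but has no stored procedures, and the table layout here is a hypothetical stand-in for the 'contest' table described above.

```python
import sqlite3

# Hypothetical schema loosely modelled on the 'contest' table; SQLite is
# used purely for illustration and differs from the production database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE contest (id INTEGER PRIMARY KEY, centre TEXT, ward TEXT)")
cur.executemany(
    "INSERT INTO contest (centre, ward) VALUES (?, ?)",
    [(f"centre_{i % 11}", f"ward_{i}") for i in range(550)],  # ~550 rows, as in the live configuration
)
cur.execute("CREATE INDEX idx_contest_centre ON contest (centre)")
conn.commit()

# Automated housekeeping before each count starts:
cur.execute("ANALYZE")   # refresh the query optimizer's statistics
cur.execute("REINDEX")   # rebuild all indexes
row_count = cur.execute("SELECT COUNT(*) FROM contest").fetchone()[0]
```

The point of running these steps immediately before each count is that the optimizer then plans queries against the table at its real size, rather than at the 50-row size seen during testing.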
The report also called for legislative and policy frameworks to guide electronic counting, and for better, more comprehensive pilot phases and user acceptance testing.
Sunday, September 16, 2007
INRIX, the leading provider of traffic flow information in the country, and Short Elliott Hendrickson Inc. (SEH(R)), a Midwest-based professional consulting services firm, today announced the system acceptance by the Wisconsin Department of Transportation (WisDOT) for INRIX to provide real time traffic flow data for nearly 250 centerline miles of US 41, I-43, and WIS 172. This milestone makes INRIX the first company to provide operational vehicle-probe based traffic data services to a government agency.
The process of procuring privately sourced GPS and/or cellular probe-based traffic data began approximately one year ago in order for the WisDOT to assess the utility and cost-effectiveness of such alternative traffic data sources in support of, or in place of, traditional ITS data collection. INRIX was able to fully meet the strict project requirements regarding data quality, precision and availability, which will allow the successful integration of this information within WisDOT's TOC.
"We are proud that INRIX and SEH, working closely with Wisconsin DOT and its team, have been able to verify satisfaction of the requirements set forth in the original RFP," said Rick Schuman, INRIX vice president, public sector. "It is a testimony to the commitment and quality of our team, as well as that of WisDOT and its support resources, to move from kick-off to approval in roughly six months, which included nearly two months of extensive acceptance testing. We are pleased to be the first company to provide operationally and under contract vehicle-probe based intercity traffic data services to a government agency, and the first to deliver a project of this scope within the original schedule."
Until now, these routes lacked traditional road sensors. With the INRIX Smart Dust Network, traffic information is now available by leveraging data from more than 250 public and private sources, including the world's largest network of GPS-enabled probe vehicles, and other data sources. INRIX's unique Traffic Fusion Engine uses advanced, proprietary statistical analysis to aggregate, blend and enhance traffic-related information, ensuring highly accurate road speeds and the broadest market and road coverage available. For the first time, commuters in Wisconsin will have access to INRIX real time traffic information that, in addition to WisDOT resources, is available on over 75 models of portable and automotive navigation devices, mobile phones and smartphones.
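The fusion step described above can be pictured as a weighted blend of speed estimates from different sources. The sketch below is a plain weighted average with invented weights; INRIX's actual Fusion Engine is proprietary and certainly far more sophisticated.

```python
def fuse_speeds(observations):
    """Blend per-source speed estimates into one road speed.

    observations: (speed_mph, weight) pairs. The weights here are invented
    stand-ins for source quality, not INRIX's proprietary model.
    """
    total_weight = sum(w for _, w in observations)
    return sum(s * w for s, w in observations) / total_weight

# e.g. a GPS probe fleet, a road sensor, and a historical average
fused = fuse_speeds([(62.0, 0.6), (58.0, 0.3), (65.0, 0.1)])  # about 61 mph
```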
Jim Hanson, PE, PTOE, principal, transportation services for SEH, said, "Providing accurate travel information helps keep our roadways and our travelers safe, and INRIX has provided the means to achieve real time traffic data to help accomplish this goal. The acceptance test results verified our strong belief from the initial proposal phase that INRIX data would satisfy the rigorous accuracy, timeliness and reliability requirements demanded by WisDOT, even outside of urban areas."
Saturday, August 11, 2007
BAE Systems has announced details of its Block 5 Retrofit 2 upgrade program for the British Royal Air Force's Eurofighter.
"The maiden flight of the first Block 5 production aircraft has already demonstrated the benefits of continued improvements on the Typhoon development program," BAE Systems said.
“The aircraft performed exceptionally well, achieving a high level of performance. This production flight acceptance test was one of the best ever, achieving 90 percent of the planned schedule,” said Martin Topping, Typhoon final assembly operations manager.
"The Block 5 standard gives Typhoon full air-to-air and initial air-to-ground capability with full carefree handling," BAE Systems said.
Saturday, July 21, 2007
Emerson is to engineer and manage the automation project, using Foundation fieldbus technology for process control and data management, as well as OPC (OLE for Process Control) technology.
As main instrument vendor, Emerson will be responsible for detailed design, including engineering design and implementation.
“We are pleased at this new opportunity to combine our technology with the solutions skill of our Chemical Centre of Excellence and the engineering supervision and project management leadership in our Singapore office,” says John Berra, president, Emerson Process Management.
“We are encouraged by BP’s confidence in our ability to provide system design, configuration, staging, acceptance testing, commissioning, and start-up support. We are committed to delivering a world class digital automation system.”
Saturday, June 30, 2007
Data migration can be viewed by some as the riskiest part of a new system implementation. However, with proper time and resources, migrations need not be feared; they are simply part of enabling the company to improve its ROI and competitiveness. Each company needs to assess the cost, time and benefits, and if sufficient resources and time are allocated, success is achievable.
In a single-phase conversion, all data is converted at one time, with two options. Either sufficient history (a number of years) is converted to enable the discontinuance of the existing system, or only the latest versions of the data are converted so the new system can be used for all future transactions, with the existing system retained for a period of time for inquiries into past transactions.
The advantage of a single-phase conversion is that the required resources are engaged for the shortest time, and therefore at the lowest cost. Longer conversions run the risk of losing some resources, which then need to be replaced, lengthening the task.
In a multi-phase conversion, data is segmented by line of business or territory. The initial phase will encompass the installation of the whole system but will include only a portion of the portfolio. This will include underwriting, billing, claims, reinsurance, document issuance, management reporting, financial reporting and bureau filing. Each module will be tested using a portion of the portfolio. The initial phase is comparable to a single-phase implementation as the whole system is tested.
Subsequent phases involve adding other parts of the portfolio. This mostly affects the underwriting system to ensure proper rating, document issuance etc. These are much smaller phases and can be achieved with fewer resources. The portfolio being added needs to be tested through all modules (billing, claims, reporting, etc.). However as these modules have already been tested the additional work relates to tracing the new transactions through the system to ensure that they are being processed correctly.
The multi-phase approach can be implemented so that each phase goes live as soon as its user acceptance testing is complete. Alternatively, each phase is completed up to user acceptance testing, and the system is only put into live production after all parts of the portfolio have been completed.
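The multi-phase conversion described above can be sketched in a few lines. The portfolio records, the phase segments and the `migrate` function below are all hypothetical stand-ins for real conversion work.

```python
# Hypothetical multi-phase conversion: segment the portfolio by line of
# business and convert one segment per phase, with a per-phase acceptance
# check before the next phase begins.
portfolio = [
    {"policy": "P1", "line": "property"},
    {"policy": "P2", "line": "casualty"},
    {"policy": "P3", "line": "property"},
]
phases = [["property"], ["casualty"]]  # one line of business per phase

def migrate(records):
    # stand-in for the real conversion: rating, billing, claims, reporting
    return [dict(r, migrated=True) for r in records]

new_system = []
for lines in phases:
    batch = [r for r in portfolio if r["line"] in lines]
    converted = migrate(batch)
    assert all(r["migrated"] for r in converted)  # user acceptance gate
    new_system.extend(converted)
```

Under the first go-live strategy each phase's batch would go live immediately after its acceptance gate; under the second, `new_system` would only be cut over once the loop completes.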
Sunday, June 03, 2007
"In this phase, the system is fully tested by the client community against the requirements defined in the analysis and design stages, corrections are made as required, and the Production system is built. "
It contains an outline of the deliverables (along with templates for some of them, such as the user acceptance sign-off template). It also contains a 10-step approach for user acceptance testing (UAT). It is part of the overall Princeton development method for constructing information technology systems and applications.
Saturday, March 31, 2007
From “Find out how ‘IT’ works”.
In this case, “IT” is not limited to information technology, but whatever “it” is – the requirements process, the project management process, the testing process, etc. The better you as a customer understand “it,” the more effective you will be in making “it” work for you.
Donna Corrigan from CoBank suggests: “When defining requirements, stay focused on value-added business requirements, not technology. Understand the user acceptance testing (UAT) schedule early on, and clear the calendars of UAT business staff during the UAT period.”
Bill Yock from The University of Washington says: "The most important thing a customer of a BI project should insist on is "World Class" training. Incorporate training on both the meaning of the data and the use of the business intelligence tools. Many BI projects invest too much in the ‘back room’ data loading and integration of warehouses and marts, and not enough in the ‘front door’ … educating and informing users on how to interpret, analyze and understand their information. One of our mission statements was ‘Free the Data.’ Unfortunately, once the data was free, nobody understood it. This led to lack of confidence in the ‘numbers’ and adoption of the BI tool hit a brick wall."
To be a good BI customer, figure out what “it” is and how you can proactively influence and invest time in requirements, data models, prioritization, governance, testing, training, etc. Don’t allow yourself to be so dependent that you passively sit back and wait for “IT” to make everything happen.
Saturday, March 24, 2007
Taco Bell has become the latest national fast food chain to enter the contactless arena. It will be testing Visa contactless cards at 100 locations later this year with an eye towards implementing the technology at all 1,300 locations in 2008. The credit card giant also announced that a movie theater, a northeast supermarket operation and an ice cream chain are now accepting Visa contactless cards.
Visa USA announced that four significant national and regional merchants are the latest to accept Visa cards with embedded contactless payment technology, allowing consumers to use the convenience of this innovative technology to buy food, groceries and movie tickets at more than 3,000 new acceptance locations.
The four companies – Braum's Ice Cream and Dairy Stores; Clearview Cinemas; Wakefern Food Corporation, the merchandising and distribution arm of ShopRite supermarkets; and Taco Bell Corp. – are emblematic of the kind of merchants who have helped to make Visa Contactless cards the most rapidly adopted new payment technology in Visa’s history.
Visa Contactless technology enables customers to make purchases by simply holding their Visa Contactless cards near a secure reader at checkout instead of swiping them. The account information is securely transmitted by a tiny radio frequency chip embedded in the card.
Visa Contactless eliminates the signature requirement for most payments under $25, providing even greater convenience to customers making small-ticket purchases. In addition, Visa Contactless maintains the level of security the industry has come to expect from Visa. A Visa card with the contactless feature never leaves the customer’s hand and can only be read at extremely close proximity to a contactless reader. In addition, the chip embedded in Visa Contactless cards generates data that is unique to each transaction, ensuring the same data cannot be used for fraudulent contactless transactions.
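The "unique per transaction" property is what defeats replay of captured card data. The sketch below illustrates the idea with an HMAC over an incrementing counter; this is not Visa's actual scheme (real contactless cards use EMV-specified cryptograms), and the key and counter handling here are invented for illustration.

```python
import hmac
import hashlib

def transaction_code(card_key: bytes, counter: int) -> str:
    """Illustrative only: derive a value unique to each transaction by
    MACing an incrementing application transaction counter."""
    msg = counter.to_bytes(4, "big")
    return hmac.new(card_key, msg, hashlib.sha256).hexdigest()[:8]

key = b"\x01" * 16            # hypothetical card-resident secret
first = transaction_code(key, 1)
second = transaction_code(key, 2)
# Replaying the first code for a second purchase fails: the codes differ.
```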
User acceptance testing
Saturday, February 24, 2007
It covers the Acceptance Test Plan and identifies the key elements as:
- Customer responsibilities: Describe the activities the customer is responsible for and what others are responsible for. On some projects, the customer is responsible for all aspects of the test, including creating test scenarios, performing the tests, and validating the results. On other projects, the entire team may assist in the various activities.
- Acceptance criteria: Before you begin the acceptance test, the team should know what criteria will be used to decide whether the system is acceptable or not. Does the system have to be perfect? You better hope not. But the acceptance criteria should define how the decision will be made. The customer may accept a system with certain minor types of errors remaining. But there may be other levels of errors that will render the system unacceptable. Part of the acceptance criteria may be to revalidate some of the other system tests. For instance, the customer may want to thoroughly test security, response times, functionality, etc. even if some of these tests were done as a part of system testing.
- Acceptance Test Workplan: Here, you define the activities associated with the acceptance test, when they will begin and end, who is responsible, and so on. In other words, you define the workplan for the test. Defining all this up front will help people understand what is expected of them, and they'll be aware of the timeframe and know when things are due. You should move all this information to the main project workplan.
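Acceptance criteria of the kind described above can often be reduced to an explicit, checkable rule. A toy sketch follows, with invented severity levels and an invented threshold; real criteria would be negotiated with the customer.

```python
# Hypothetical acceptance rule: no critical defects outstanding, and at most
# an agreed number of minor ones remaining at sign-off.
def acceptable(open_defects, minor_limit=5):
    critical = [d for d in open_defects if d["severity"] == "critical"]
    minor = [d for d in open_defects if d["severity"] == "minor"]
    return not critical and len(minor) <= minor_limit

verdict_ok = acceptable([{"severity": "minor"}] * 3)      # within tolerance
verdict_fail = acceptable([{"severity": "critical"}])     # blocks acceptance
```

Writing the rule down this explicitly, even in prose, is what lets the team decide acceptance without an argument at the end of the test.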
Wednesday, January 24, 2007
We can have perfect systems and procedures for quality, but if the employees are not willing to follow them, the whole system will fail to deliver. All companies in the services sector are vulnerable to this syndrome. When the quality movement started, some shady agencies deceived their clients by promising that a quality assurance system, a quality certification and a quality assurance manager could ensure the quality of the output. A QA system is only as good as the people who implement it, and no company with sub-standard employees can provide standard services, even with an ISO-certified QA system and an expert QA manager in place.
The most significant development in the rush for quality has been the unprecedented growth in the number of certification agencies. From a few international players, the number has grown into the hundreds the world over. The initial few had a bountiful harvest when the rush started, and that provided an ugly model for numerous "operators" to try their hand at a low-investment, high-return business opportunity. The simple fact that most of the new certification agencies have their principals in Europe, and that it was the European Union that triggered the quality juggernaut, makes the whole exercise suspect.
The biggest beneficiary of this erroneous concept of quality assurance has been a set of professionals masquerading as quality assurance managers (QAM) in ISO-certified companies. They first descend on the victims (desperate for ISO certification) in the form of experts or consultants willing to assist in getting the initial certification. It goes to their credit that they do a useful job at this stage. In most companies, there will be severe shortage of expertise to document the procedures they follow in a systematic manner. This vacuum is easily filled by the so-called ISO experts and consultants. After marathon sessions with the employees, an acceptable and auditable quality system is set up. Then these quality consultants (sometimes in the form of a full-fledged company) spend months upon months conducting mock audits, feedbacks and corrections, until finally the whole system is ready for offering to an accredited external agency for certification. It is always up to these ISO consultants to conclude their deal by "arranging" a smooth audit and certification as per ISO norms.
The dubious and redundant aspect of quality management starts from this point. In order to maintain the certification and withstand the periodic surveillance audits of the external agency, most companies are all too willing to accommodate these consultants or experts in the form of a quality assurance manager. Guidelines of ISO have made it easier for these "redundant" managers to report directly to the chief executive in order to keep "quality" as a high-level and independent function in the organisation. And that makes it easier for these climbers to stay on forever. Most of the time, their work is reduced to arranging "facilities" for the external auditors and "taking care" of them during the audits. That neither of these efforts necessarily contributes to the quality of the host organization is a simple fact that escapes the attention of chief executives. In fact, they are counterproductive and retrograde. In the long run almost all such companies end up uneconomical, thanks to their misplaced overemphasis on quality rather than profit. Quality is meaningful only in a profit-making company, and all efforts towards it must be subordinate to the ultimate objective of making profit.
Saturday, December 16, 2006
From Vote trust USA
Adequately tested voting systems are a prerequisite for well-run elections and for public confidence in election results. When it is completed, the current process of testing voting systems will culminate in New York State’s four Election Commissioners deciding whether to approve, or “certify,” those systems that meet the State’s regulations. Certification testing is ongoing and is proving to be a mixture of good and bad news.
The principal contractor managing certification testing for the State Board of Elections is CIBER, Inc., one of three large testing companies with close ties to the voting machine vendors. CIBER was responsible for the certification of several voting systems that were later shown to have defective software, and that were subsequently de-certified by several states. CIBER’s performance for New York State has thus far been poor, tending to favor machine vendors’ very loose interpretation of State requirements. CIBER has submitted inadequate drafts of Master Test and Security Test plans, and seems willing to tolerate poor testing practices in order to allow machines to pass tests they otherwise would not.
User acceptance testing definition
Sunday, October 22, 2006
Corporations in Australia are radically reshaping the way they buy technology products and services, as they abandon the large-scale open tenders that have dominated purchasing. Australia's biggest computer users are instead opting to put technology suppliers through their paces in small, handpicked groups in an effort to shorten the procurement cycle and slash costs.
Government departments are getting in on the act too, with more using supplier panels.The changes could save public and private organisations hundreds of thousands, if not millions of dollars on every project they launch.
"The whole tendering process is expensive. It's quite time consuming both for companies looking to do business with a customer and for the customer itself," CSC Australia managing director Mike Shove says. "Rather than going to a large open tender, organisations are now doing their own due diligence and selecting two to five different players to engage with." The trend is evident throughout the private sector and is affecting everything from software procurement to multimillion-dollar outsourcing contracts, Shove says.
It is being used to pick winners for both single supplier and multi-sourcing deals and has been aided by a wave of consolidation that has swept through the computer and communications sector over the past decade.
While there isn't a confirmed link between the demand for independent advisers and more limited tendering, Ernst&Young technology and security risk services partner Chris Grant says the two practices can go hand in hand.
Companies may select a preferred supplier and then, with the assistance of an assurance adviser, test the proposed product or service before signing on the dotted line. "If you're looking for an accounting or finance-type solution then you can pretty much go to a shortlist of who the suppliers are, do a quick evaluation and then proceed to planning and a contract," Grant says.
Sunday, October 08, 2006
The Eurofighter ASTA team on Friday completed the Acceptance Testing of the latest software for the Aircrew Synthetic Training Aids (ASTA) with software load 1.0 standard in Manching. This new software will significantly increase the training capabilities of the Eurofighter Typhoon partner air forces especially in the field of mission simulation.
Acceptance testing service
Saturday, September 30, 2006
"When you need to test things that are not covered by the Rails testing framework (i.e. end-to-end AJAX), open source web testing toolkits are your best friend. While Selenium has merit and is particularly well-suited for cross-browser compatibility tests, this session will focus on putting Watir on Rails. Watir provides a clean API, simple setup, good integration into the Rails testing framework, and is supported in a growing number of browsers. Dave will demonstrate how to use Watir to test a Rails+AJAX application."
Sunday, September 17, 2006
The seminar will address continuing education requirements, acceptance criteria for existing installations, incident reporting, investigation procedures, periodic testing requirements, and recognition of existing municipal elevator inspection programs.
For more info visit the Guymon Daily Herald
For information on services for setting effective acceptance criteria
Thursday, September 14, 2006
This site defines the document as:
"An Acceptance Test Plan describes the acceptance testing process, such as the features to be tested, pass/fail criteria, approach to testing, roles and responsibilities, resource requirements and schedules."
The contents of the acceptance test plan itself are identified as:

Acceptance Test Plan
- Business Processes Testing
- Testing Approach
- Test Schedule
- Problem Reporting
- Resource Requirements
- Test Environment
- Post-Delivery Tests
- Test Equipment
- Software Requirements
- Hardware Requirements
- Test Identification
- Acceptance Test Report
- Corrective Action
- Summary of Results
Does anyone know any other sites where you can download acceptance test plan templates? Even better if they are free.
Thursday, August 31, 2006
User Acceptance Testing is a key feature of projects to implement new systems or processes. It is the formal means by which we ensure that the new system or process does actually meet the essential user requirements. Each module to be implemented will be subject to one or more User Acceptance Tests (UAT) before being ‘signed off’ as meeting user needs. The following overview answers some of the main questions that have been asked about UATs.
A User Acceptance Test is:
- A chance to completely test business processes and software
- A scaled-down or condensed version of a system
- The final UAT for each system module is the last chance to perform the above in a test situation
The exact content of each test will depend on the module being tested. In general, however, tests will cover the following broad areas:
- A number of defined test cases using quality data to validate end-to-end business processes
- A comparison of actual test results against expected results
- A meeting/discussion forum to evaluate the process and facilitate issue resolution
- Validate system set-up for transactions and user access
- Confirm use of system in performing business processes
- Verify performance on business critical functions
- Confirm integrity of converted and additional data, for example values that appear in a look-up table
- Assess and sign off go-live readiness
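The "comparison of actual test results against expected results" above can be made concrete as a small scripted check. The test cases and the 10% surcharge rule below are invented purely for illustration.

```python
# Hypothetical UAT cases: each defines an input and the business-approved
# expected output; the run records pass/fail per case for the results log.
cases = [
    {"id": "UAT-01", "input": 100.0, "expected": 110.0},
    {"id": "UAT-02", "input": 200.0, "expected": 220.0},
]

def system_under_test(amount):
    return amount * 1.1  # invented business rule: a 10% surcharge

results = [
    {"id": c["id"],
     "passed": abs(system_under_test(c["input"]) - c["expected"]) < 1e-6}
    for c in cases
]
```

The per-case log of expected versus actual is exactly what feeds the meeting/discussion forum bullet: failures arrive as concrete case IDs rather than vague complaints.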
Sunday, August 27, 2006
"Developer Persistent Worlds was very pleased to announce that Carpe Diem, its newest fantasy MMORPG offering, has entered its Alpha Stage today.
Alpha Testing, otherwise known as Acceptance Testing, is where a new pre-release product is tested internally before it is tested with outside users. Says Persistent Worlds' Director of Technology and Operations, Phil White, "The Alpha test will allow us to ensure that all servers, support and game play changes are working properly before moving to the next stage."
The Alpha Test appears to be going well so far, and they expect to go to closed beta in about two weeks. Already highly anticipated, Carpe Diem has received heaps of requests from gamers who want to take part in the game's Beta test. The positive response from the beta test survey also contributes to the optimistic outlook of the game's developer."
Alpha testing services for software applications
Sunday, July 30, 2006
In this article from PC Welt they describe the new features added to BPT, part of Quality Center, which include tighter integration with the automated functional testing tool WinRunner. The part we are interested in is the certification step added to the software testing process, which covers user acceptance.
It features a Web interface that allows business executives to try out a new software application. This is meant to automate a step that is usually performed in an ad hoc manner.
Saturday, July 22, 2006
"UAT: a method for determining how well users have adopted a new technology, especially in organizational settings. Users are typically interviewed to determine if and how they are using the technology and to understand what barriers to adoption may exist.
This sometimes refers to a "user testing" technique where users are asked to perform a scripted list of tasks with every step outlined and to comment on usability problems they encounter. However, this is more a quality assurance technique than user testing because scripted tasks don't reveal problems that users would encounter in actual use."
Software acceptance testing
Sunday, July 02, 2006
"Quality assurance (QA) is an essential part of the XP process. On some projects QA is done by a separate group, while on others QA will be integrated into the development team itself. In either case XP requires development to have a much closer relationship with QA.
Acceptance tests should be automated so they can be run often. The acceptance test score is published to the team. It is the team's responsibility to schedule time each iteration to fix any failed tests.
The name acceptance tests was changed from functional tests. This better reflects the intent, which is to guarantee that a customer's requirements have been met and the system is acceptable."
Independent acceptance testing services for agile developments
Wednesday, June 28, 2006
"1) The Test Plan – This outlines the testing strategy
2) The UAT Test cases – The Test cases help the team to effectively test the application
3) The Test Log – This is a log of all the test cases executed and the actual results.
4) User Sign Off – This indicates that the customer finds the product delivered to their satisfaction"
Tuesday, June 27, 2006
"Quality assurance personnel use performance and reliability testing to verify service levels of the system as a whole. While each of these techniques serves a unique purpose on the project, they also share a common goal - find defects. However, this is something your user acceptance tests should never do."
However, when you read the well written item, it is actually a plea to improve testing early in the lifecycle:
"Numerous studies have shown that fixing defects found late in a delivery cycle can carry an extremely high cost. If serious problems are found during User Acceptance Testing (or UAT), they can easily translate into significant budget overruns, slips of the schedule, or even project cancellation. Issues must be identified and addressed as early as possible. By the time UAT rolls around, there should be nothing major left to uncover. User acceptance testing should be little more than a pat on the back of your development team for a job well done."
I like the idea of UAT being a pat on the back very much.
Saturday, June 24, 2006
The contents of the document are:
2. User acceptance test standards
2.2 New System Test
2.3 Regression Test
2.4 Limited Testing
2.4.1 Form-Based Testing
2.4.2 Business Process Testing
2.4.3 Report Testing
3. Assessing the test types for a test plan
3.2 Assessing the Test Type Required
4. Creating a user acceptance test plan
4.1 Test Instructions
4.2 Form-Based Test Scripts
4.3 User Security Matrix
4.4 Business Process Test Scripts
4.5 Report Test Scripts
4.6 Defect Tracking Form
4.7 Defect Tracking Log
And the conclusion is:
"The test scripts created for a new system, for regression testing or for limited testing should be recycled throughout the application's life. By editing existing test scripts, the User Acceptance Test Planning process can save time and money, as well as maintain test quality, as key requirement information will be re-used."
Acceptance test planning support
Thursday, June 22, 2006
"The NSW Department of Primary Industries (NSW DPI) has conceded its technology consolidation project is still suffering delays due partly to an incident which saw much of the computer equipment in its facility in regional NSW "slow-baked" in searing temperatures.
Warwick Lill, host systems manager, NSW DPI, said a faulty fire detection system at the start of the year had contributed to the agency's data centre reference implementation (DCRI) project, an 18-month long undertaking, running several months behind schedule.
The problem occurred at NSW DPI's site at Maitland in the Hunter Valley.
"At about 10 minutes to midnight on New Year's Eve there was a false alarm in the fire detection system in Maitland, which shut down the air conditioning in the computer room without shutting down the computers," Lill told attendees at Gartner's data centre conference in Sydney.
The state experienced record temperatures on New Year's Day.
"This wasn't detected until about 11 o'clock in the morning," he said.
"By that time the temperature inside the computer room was up around 70 degrees or something.
"That sort of slow-baked all of the equipment that was in the room, and we're in essence still recovering from that one. So work at the Maitland site in regards to this project has come to a stop until we manage to fix that."
The DCRI project aims to consolidate a large swathe of information technology operations inherited from the NSW DPI's predecessor agencies to sites in Maitland and Orange under moves to generate efficiencies and cut costs. The NSW government created the DPI in July 2004 through the amalgamation of Mineral Resources NSW, NSW Agriculture, NSW Fisheries and State Forests NSW.
"We had too many IT managers, we had too many systems, we had too many ways of doing things," said Lill.
The DCRI project has required extensive planning and preparation in areas such as disaster recovery and system layout.
"We wanted to setup a mutual failover between those two sites, that's the basis of our disaster recovery plan," said Lill.
"We wanted to rationalise the system landscape that we had, so we had dedicated systems for production, user acceptance testing, development, DR [disaster recovery] and so on."
NSW DPI has also acquired a range of hardware and software for the project, primarily from services partner Sun.
"What we've put in, in order of implementation, was a management cluster with dedicated storage at each of our two major sites."
This project was not without its problems, however.
"We moved a little too quickly from the planning phase into the implementation phase, and that combined with the fact that we didn't know quite as much about clustering as we might've needed to do, we didn't quite get the design for the original management cluster right.
"It took a period of time to recognise that and to agree to redo it, so that's put us behind by about three months."
Also implemented were new tape libraries at both sites, and Sun's C2MS (e-mail archiving), SAM-FS (archive management) and Management Centre (system management) software.
The choice of EMC Legato NetWorker as enterprise backup product, however, had been another contributor to delays in the project.
"We've found that the Legato backup software, the database agents for Legato backup software, are not certified for operation in Solaris 10 containers. So at the moment we can't back up our databases if we move them to the application cluster.
"So we're looking forward to a solution to that one in the reasonably near future," he said.
Lill described the work of services partner Sun as "terrific," but conceded there had been early problems in the relationship between the two.
"We bought a design I believe ... whereas Sun sold us a set of hardware and software and professional services during the implementation.
"Initially at least there was a bit of a disconnect between those two, but through communication and influencing and so on we've managed to bring things pretty well together.
"If I was doing it again I think I'd spend more time in the contract negotiation stage looking at the very fine detail of what was proposed."
Lill said he was confident the project would be completed and work well.

Interesting that if he were doing it again, the key thing he'd concentrate on is the specification of what he was getting. Having an effective acceptance process and clear acceptance criteria is a key step in getting what you expect, when you expect it.
Tuesday, June 20, 2006
A couple of useful passages:
"In fact, depending on how your other tests were performed, this final test may not always be necessary. If the customers and users participated in system tests, such as requirements testing and usability testing, they may not need to perform a formal acceptance test. However, this additional test will probably be required for the customer to give final approval for the system."
Combining user acceptance testing with other testing activities saves time and money.
"The acceptance test is the last opportunity customers have to make sure that the system is what they asked for. When this final test is complete, the team expects that the customer will formally approve the system or point out any problems that still need to be resolved. Therefore, unlike all the other tests performed so far, acceptance testing is the customers' responsibility. Of course, unless the customers are very savvy in testing techniques, they will still need the participation of the IT team."
Using specialist software testing support helps ensure your testing is effective and efficient. Given that this is the final test, it needs to be effective.
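One practical way to make acceptance criteria "clear" in the sense discussed above is to record each agreed criterion as an executable check, so sign-off is a matter of running them and reviewing the results. The sketch below is a minimal illustration of that idea; all of the criteria, thresholds and function names are hypothetical examples, not drawn from any real project.

```python
# Hypothetical sketch: agreed acceptance criteria recorded as executable checks.
# The criteria and figures below are illustrative only.

def response_time_ok(measured_seconds, limit_seconds=2.0):
    """Criterion: key screens respond within the agreed time limit."""
    return measured_seconds <= limit_seconds

def report_totals_match(system_total, reference_total):
    """Criterion: system report totals reconcile with reference figures."""
    return system_total == reference_total

def run_acceptance_checks(results):
    """Split a dict of {criterion name: outcome} into passed and failed lists."""
    passed = [name for name, ok in results.items() if ok]
    failed = [name for name, ok in results.items() if not ok]
    return passed, failed

# Example sign-off run with made-up measurements.
results = {
    "response_time": response_time_ok(1.4),
    "report_totals": report_totals_match(10250, 10250),
}
passed, failed = run_acceptance_checks(results)
print("Passed:", passed)
print("Failed:", failed)
```

The point is not the tooling but the discipline: each criterion is agreed with the customer in advance and expressed precisely enough that pass or fail is unambiguous at acceptance time.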
Sunday, June 18, 2006
In software development, user acceptance testing (UAT) - also called beta testing, application testing, and end user testing - is a phase of software development in which the software is tested in the "real world" by the intended audience. UAT can be done by in-house testing, in which volunteers or paid test subjects use the software, or, more typically for widely-distributed software, by making the test version available for download and free trial over the Web. The experiences of the early users are fed back to the developers, who make final changes before releasing the software commercially.
Saturday, June 17, 2006
User Acceptance Testing (UAT) is a process to obtain confirmation by a Subject Matter Expert (SME), preferably the owner or client of the object under test, through trial or review, that the modification or addition meets mutually agreed-upon requirements. In software development, UAT is one of the final stages of a project and will often occur before a client or customer accepts a new system.