Managing Quality

Of all the aspects of running a project, probably the most difficult one to keep on track is quality: not because it is complicated but because project participants feel free to discard it when the crunch comes.

Quality in systems projects is based on reviews and walkthroughs (see "Planning for Quality"). Your role is to ensure that those activities occur and that no deliverable will be considered complete unless it is accompanied by a walk-through review worksheet (see Exhibit 4.10) indicating that a review was conducted and that all revisions agreed to in the review have been completed. Insofar as your project is concerned, this is all that is required of you to manage quality.

However, quality is long-term. It is long-term within a project because its benefits are not realized until late in the project and after the project is over. It is also long-term across projects in that the practice of quality in individual projects leads to improvements in those that follow. If you are to be successful in managing quality over the long term, you will have to sell quality to your team and to your management. You do that in two concrete ways: Gather ammunition, and compile statistics.

You gather ammunition by extrapolating from the completed review worksheets what each major uncovered error would have cost if it had not been detected when it was. For example, if a requirements review catches a previously unidentified need to record prices in fractions of a cent ("We buy cable at twelve and a half cents a foot"), prepare a set of projections indicating what that requirement would have cost had it been discovered at various later stages in the project. For example:

- At the requirements stage, it took one hour of effort to revise the requirements document.

- At systems design, it would have taken one week of effort to revise screen, report, and file layouts.

- At program design, it would have taken one month of effort to revise layouts and program specifications.

- At implementation, it would have taken six months of effort to revise the entire system and documentation.
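The escalation in the cable-pricing example can be turned into a reusable projection. The sketch below is a minimal illustration, assuming 40-hour weeks and 160-hour months; the stage names and effort figures are taken from the example above, not from any standard.

```python
# Hypothetical cost-of-defect projection based on the cable-pricing example.
# All figures are illustrative; substitute your own worksheet data.

HOURS_PER_WEEK = 40
HOURS_PER_MONTH = 160

# Effort (in hours) to fix the same defect, by the stage at which it is found.
fix_effort = {
    "requirements": 1,
    "systems design": 1 * HOURS_PER_WEEK,
    "program design": 1 * HOURS_PER_MONTH,
    "implementation": 6 * HOURS_PER_MONTH,
}

def savings(found_stage: str, later_stage: str) -> int:
    """Hours saved by catching a defect at found_stage instead of later_stage."""
    return fix_effort[later_stage] - fix_effort[found_stage]

for stage in ("systems design", "program design", "implementation"):
    print(f"Caught at requirements instead of {stage}: "
          f"{savings('requirements', stage)} hours saved")
```

A table of such savings, one row per major error caught, is the "ammunition" the text describes: concrete hours that reviews demonstrably saved.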

A few such examples should be enough to convince your most intractable opponent of the hard benefits that come from building quality into the product through reviews.

As the project progresses, keep a set of statistics that reflect the impact of your quality plan on the project. If the client organization already has a statistical base, use the measures that it contains. If not, define a few of your own. Some examples:

- Time spent on unit testing as a percentage of time spent on coding

- Number of bugs per program discovered during integration

- Number of revisions needed to files or databases after completion of systems design

- Number of revisions needed to program specifications after start of program design
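The first two measures above are simple ratios, and keeping them consistent from phase to phase is what makes the trend meaningful. A minimal sketch, with hypothetical field names and sample figures:

```python
# Illustrative tracker for two of the example quality measures above.
# Field names and sample figures are hypothetical.
from dataclasses import dataclass

@dataclass
class PhaseStats:
    coding_hours: float
    unit_test_hours: float
    programs: int
    integration_bugs: int

def unit_test_ratio(s: PhaseStats) -> float:
    """Unit-testing time as a percentage of coding time."""
    return 100.0 * s.unit_test_hours / s.coding_hours

def bugs_per_program(s: PhaseStats) -> float:
    """Bugs discovered during integration, per program."""
    return s.integration_bugs / s.programs

project = PhaseStats(coding_hours=400, unit_test_hours=120,
                     programs=25, integration_bugs=10)
print(f"Unit test ratio: {unit_test_ratio(project):.1f}%")   # 30.0%
print(f"Bugs per program: {bugs_per_program(project):.2f}")  # 0.40
```

Recorded per project, these numbers give you the declining trend the next paragraph describes, and a defensible baseline when someone questions the cost of reviews.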

Over time, particularly if you work within the same organization, you should see these numbers decline. If you are successful, the last three will reach zero. On that day, you can approach your management and claim the kudos that are rightfully yours.

Managing Peer Reviews

As discussed in "Planning for Quality," the most important tool for building quality into a systems product is the peer review. Normally, this is a technical meeting to walk through a deliverable and identify and document potential problems. However, like any activity that throws team members together, it has some potential problems that you can circumvent by following a few basic principles.

Participation should be restricted to technical staff

Two groups that should not be present at a review are clients and management (including you). You want to exclude clients because they should not be involved in internal project meetings, particularly those that focus on errors and problems. You want to exclude management because the presence of managers will exacerbate any defensiveness the team member is feeling. It will also switch the focus of the review from a search for technical compliance to one of performance evaluation. The sole exception to this rule is that if the personalities involved will create problems, you should be present as a facilitator and, if necessary, as a defender of your people (see "Team Building").

Peer review does not require approval

The role of reviewers is strictly to point out problem areas and suggest improvements. Other than enforcing conformance to standards, no approval is required and, when reviewers and the team member who produced the deliverable disagree about some aspect of it, the team member's decision is final: It is his or her product.

Comments should be specific, not general

The purpose of a review is to identify errors, not redesign the deliverable. Therefore, the kinds of comments that are appropriate are specific, such as "Must handle return code 13" or "Add logic to deal with incorrect date format." Where comments become general, such as "Approach needs to be revised" or "Work is not at the expected level," the review has ceased to be a search for errors and has degenerated into a battle over approaches or an exercise in intimidation. You will need to intervene and to reinforce with your team the kinds of comments you expect to see.

Peer review is not employee evaluation

Peer review can be stressful, particularly for less secure team members. You must therefore emphasize that its purpose is not to evaluate work or punish team members but to identify sources of error. You will need to be scrupulous in stamping out any attempts to use peer reviews to discipline or embarrass team members. You will also need to make sure that reviewers do not use the process to impose their preferred approach onto the deliverable.

What If?

Team Members Object to Conducting Reviews.

Your quality plan will be dead and you can expect the time for integration and systems test to mushroom.

Actions

There will typically be two reasons for objections: "I don't have the time to review someone else's work" or "I'm a professional and I don't need someone looking over my shoulder." The specific cause may lead you to modify what you say, but the overall message must be the same: "The quality plan is not negotiable. We need it to succeed, and I expect you to follow it."

Examine your own position and make sure that you are not conveying to your team any impatience with delays that are caused by reviews. If a deliverable is a week late because significant errors were caught, you should celebrate and congratulate the participants because the review has just saved several weeks or months of effort later. If, on the other hand, you express dissatisfaction with the delay, you have provided an incentive to your team to avoid reviews.

The Client Says That Your Team Is Spending Too Much Time in Review Meetings and Not Enough Time Working.

The client is attempting to get you to scrap your quality plan. In particular, if the project is behind schedule, the client sees this as one way to catch up or at least to avoid falling further behind.

Actions

Recognize that the client is challenging your mandate to manage the project and that the issue goes beyond your quality plan.

Point out that quality reviews are not the cause of any slippage and that scrapping them would ultimately cause even more delays.

Finally, insist that the quality plan be retained and followed.

Your Management Questions Your Adherence to the Quality Plan.

If your quality plan is scrapped, the time spent on subsequent phases of the project will increase beyond your estimates. You risk having to answer for later slippages that your plan would have helped you avoid.

Actions

If your management says something like "We're all for quality, but we have a schedule commitment, and right now, you are overdoing it. Just get on with the work," point out that quality reviews are not an optional extra but an integral part of the plan, with benefits to be realized at integration, systems test, and implementation.

If your management orders you to stop the reviews, comply, but reestimate the project, adding on time for later activities. Point out that the revised work plan is a consequence of your management's demands.

Managers Request Review Worksheets to Evaluate Employee Performance.

If team members believe that reviews are being used to evaluate performance, they will become even more reluctant to participate. If you have succeeded in building a supportive team, they may even tacitly agree not to document errors in order to protect each other. The benefits of reviews will be lost.

Actions

Point out to management that performance reviews should be based on results, not process. The only relevant issue is that the team member's deliverables are produced on time and are of good quality. The number of intermediate errors or revisions a team member makes should not reflect on his or her performance review.

Decline to hand over review worksheets to management. If you are required to do so, revise your work plan, removing all quality reviews and extending the time needed for later activities. Make it clear to your management that quality reviews work only if they are restricted to detecting and correcting errors. Any other use compromises their effectiveness, and there is no point in continuing with them.

Some Team Members Use Reviews to Intimidate or to Force Their Approaches on Other Team Members.

Reviews are intended to find errors, not to define approaches. For example, in reviewing a program design, the purpose is to identify its problems and shortcomings, not redo it. In some cases, all team members may agree that a different approach would be better and that the deliverable would be improved as a result. But some team members, particularly those with seniority, may use reviews to impose their preferred approaches on others. In such cases, reviews will dissolve into acrimony or team members will cease to regard comments seriously.

Actions

This problem is often hidden because most team members will not complain to you. One way of uncovering it is to examine the review worksheets and look for a preponderance of general comments. Where you find such a trend, you probably have an overbearing reviewer.

Do some detective work. Ask team members where the general comments came from. Look for the one or two people who seem to be at the center of the problem.

Meet with the entire team and reinforce the purpose of reviews and the types of comments that should be made. Remind them that general comments or intimidation are off limits.

Meet with the offenders and make it clear what you expect from them as reviewers.

Attend a few reviews as an observer, but be prepared to step in if the review does not stay on track. Justify your presence by pointing out that the reviews are important and you want to ensure that all participants understand their roles.

The Work of a Team Member Is Riddled with Errors.

Since the purpose of a review is not to evaluate performance, you will find it hard to justify intervening. However, as the project manager, you really do not have to justify yourself, and, in certain circumstances, you can be inconsistent. If the offending team member's work is as poor as it seems, the rest of the team will thank you for taking action.

Actions

Meet privately with the team member. Acknowledge that although the purpose of a review is not to evaluate performance, you cannot ignore the results you are seeing.

Point out that the team member seems to be relying on the review process to identify errors rather than screening for them before the review.

Identify the kinds of review comments that concern you and that you believe should have been caught before the review.

Set expectations for future reviews in terms of number and quality of comments.
