Implementation: Reviews

General

  • Conduct reviews in a meeting format, although meeting participants may prepare parts of the review on their own.
  • Continuously monitor quality during the process activities to prevent large numbers of defects from remaining hidden until the reviews. Each activity in the Unified Process for EDUcation (UPEDU) references checkpoints to reinforce this; use them in informal review meetings or in daily work.

Types of Reviews

In a 1990 standard glossary, IEEE defines three kinds of reviews:

Review
A formal meeting at which an artifact, or a set of artifacts, is presented to the user, customer, or other interested parties for comments and approval.
Inspection
A formal evaluation technique in which artifacts are examined in detail by a person or group other than the author to detect errors, violations of development standards, and other problems.
Walkthrough
A review process in which a developer leads one or more members of the development team through a segment of an artifact that he or she has written, while the other members ask questions and make comments about technique, style, possible errors, violations of development standards, and other problems.

When implemented across teams, reviews also give people the opportunity to see the designs and code of other groups, which increases the chances of detecting common source code, reuse opportunities, and opportunities for generalization. Reviews also provide a way to coordinate the architectural style among the various groups.

In the UPEDU, reviews play an important though secondary part in assuring quality. The major contributors to quality in the UPEDU are well described in [ROY98] in the section on Peer Inspections. However, this book does identify a valuable additional effect of reviews in professional development: junior staff have the opportunity to see the work of experts, and have their own work reviewed by senior mentors.

Planning

Plan reviews to determine their focus and scope, and to make sure that all participants understand their roles and the goals of the review.

Prior to the review, define its scope by determining the questions that will be asked: what will be assessed, and why? See the checkpoints of the artifacts to be reviewed for the types of questions that could be asked. The exact questions will depend on the phase of the project: earlier reviews are concerned with broader architectural issues, while later reviews are more specific.

Once the scope of the review has been determined, define the review participants, the agenda, and the information that will be required to perform the review. In selecting the participants, strike a balance between software architecture expertise and domain expertise. Clearly and unambiguously designate an assessment leader who will coordinate the review. If necessary, draw upon other teams or other parts of the organization to supply domain or technical expertise.

The number of reviewers should be approximately seven or fewer. If chosen appropriately, they will be more than capable of identifying problems in the architecture. More reviewers actually reduce the quality of the review by making the meetings longer, making participation more difficult, and injecting side issues and digressions into the review. Fewer than four reviewers increases the risk of review myopia, as the diversity of concerns is reduced.

Reviewers should be experienced in the area to be reviewed: for use cases, reviewers should have an understanding of the problem domain; for software architecture, knowledge of software design techniques is also needed. Inexperienced reviewers may learn something about the architecture by participating, but they will contribute little to the review and their presence may be distracting. Keep the group small: no more than seven people and no fewer than three. Fewer reviewers jeopardize the quality of the review, and more reviewers prevent the interactive discussion essential to achieving quality results.

Select reviewers appropriate for the material:

  • those who have the background to understand the material presented
  • those who have an active stake in the quality of the product or artifact being reviewed
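
To make these planning decisions concrete, the sketch below (in Python) records the scope, leader, participants, agenda, and materials of a review and warns when the group falls outside the three-to-seven range suggested above. It is a minimal, hypothetical illustration; the class, field names, and people named are assumptions, not part of the UPEDU.

    from dataclasses import dataclass, field

    @dataclass
    class ReviewPlan:
        """Illustrative record of the decisions made while planning a review."""
        scope_questions: list[str]                           # what will be assessed, and why
        assessment_leader: str                               # coordinates the review
        reviewers: list[str]                                 # balance of domain and design expertise
        agenda: list[str] = field(default_factory=list)
        materials: list[str] = field(default_factory=list)   # artifacts and background documents

        def check_size(self) -> None:
            """Warn if the group falls outside the suggested three-to-seven range."""
            n = len(self.reviewers)
            if n < 3:
                print(f"Warning: only {n} reviewers; the diversity of concerns may be too low.")
            elif n > 7:
                print(f"Warning: {n} reviewers; the meeting may become long and unfocused.")

    # Hypothetical usage
    plan = ReviewPlan(
        scope_questions=["Does the design satisfy the architecturally significant use cases?"],
        assessment_leader="Alice",
        reviewers=["Bob", "Carol", "Dan", "Eve"],
    )
    plan.check_size()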

Preparation

Prior to the review, gather the artifacts to be reviewed and any background material and distribute them to the review participants. This must be done sufficiently in advance of the review meeting for reviewers to study the material and gather issues. Distributing the materials early and giving reviewers time to prepare significantly improves the quality, efficiency, and effectiveness of the review.

Reviewers should study the documentation, forming questions and identifying issues to discuss, prior to the review. Given the normal workload of reviewers, a few working days is usually the minimum time needed to prepare for a review.

Conducting Reviews

There are several keys to conducting a successful review:

  • Understand the review process
  • Understand reviewer roles
  • Have a moderator
  • Keep the review meetings brief
  • Identify issues, don't fix problems

Each of these is discussed in detail below.

Understand the review process

In general, the review process follows a repetitive cycle:

  • An issue is raised by a reviewer
  • The issue is discussed, and potentially confirmed
  • A defect is identified (something is identified as needing to be addressed)
  • The cycle continues until no more issues are identified

In order for this to work effectively, everyone must understand that the goal of a review is to improve the quality of the reviewed artifact. The artifacts should be reviewed with a critical eye to finding problems. Doing this can be difficult, so all reviewers must constantly remind themselves to focus on identifying issues (we are all natural problem solvers, but as reviewers we must put that aside).
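
To make this cycle concrete, the following Python sketch models the review as a simple issue log: reviewers raise issues, the group discusses them, and those confirmed as defects are kept for follow-up. The class, function, and example issues are purely illustrative assumptions, not part of the UPEDU.

    from dataclasses import dataclass

    @dataclass
    class Issue:
        """One issue raised by a reviewer during the meeting."""
        description: str
        raised_by: str
        confirmed_defect: bool = False   # True once discussion confirms it must be addressed

    def confirmed_defects(issue_log: list[Issue]) -> list[Issue]:
        """Keep only the issues that the group confirmed as defects."""
        return [issue for issue in issue_log if issue.confirmed_defect]

    # Hypothetical meeting record: issues are raised, discussed, and possibly confirmed.
    issue_log = [
        Issue("Use case 'Register for course' lacks an alternate flow for a full course", "Bob", True),
        Issue("Naming in the persistence package is inconsistent", "Carol", False),
    ]
    print(confirmed_defects(issue_log))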

We all have strong ownership of our work; it is often difficult to accept criticism, even when it is constructive. As a result, we must work even harder to focus on the goals of the review: to make that work better.

Understand reviewer roles

In order to conduct an effective review, everyone has a role to play. More specifically, there are certain roles that must be played, and reviewers cannot switch roles easily. The basic roles in a review are:

  • the moderator
  • the recorder
  • the presenter
  • reviewers

The moderator makes sure that the review follows its agenda and stays focused on the topic at hand. The moderator ensures that side-discussions do not derail the review, and that all reviewers participate equally.

The recorder is an often overlooked, but essential part of the review team. Keeping track of what was discussed and documenting actions to be taken is a full-time task. Assigning this task to one of the reviewers essentially keeps them out of the discussion. Worse yet, failing to document what was decided will likely lead to the issue coming up again in the future. Make sure to have a recorder and make sure that this is the only role the person plays.

The presenter is the author of the artifact under review. The presenter explains the artifact and any background information needed to understand it (although if the artifact was not self-explanatory, it probably needs some work). It's important that reviews not become "trials" - the focus should be on the artifact, not on the presenter. It is the moderator's role to make sure that participants (including the presenter) keep this in mind. The presenter is there to kick-off the discussion, to answer questions and to offer clarification.

Reviewers raise issues. It's important to keep focused on this, and not get drawn into side discussions of how to address the issue. Focus on results, not the means.
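
One way to keep these role constraints visible is to write the assignment down and check it. The Python sketch below is a minimal, hypothetical check that each person plays exactly one role and that a moderator, recorder, presenter, and at least one reviewer have been designated; the names and function are illustrative only.

    from enum import Enum

    class Role(Enum):
        MODERATOR = "moderator"
        RECORDER = "recorder"
        PRESENTER = "presenter"
        REVIEWER = "reviewer"

    def check_role_assignment(assignment: dict[str, Role]) -> list[str]:
        """Return warnings if the assignment violates the guidance above.

        Mapping each person to a single Role enforces one role per person;
        the moderator, recorder, and presenter must each be covered.
        """
        warnings = []
        for required in (Role.MODERATOR, Role.RECORDER, Role.PRESENTER):
            if required not in assignment.values():
                warnings.append(f"No {required.value} assigned.")
        if Role.REVIEWER not in assignment.values():
            warnings.append("No reviewers assigned.")
        return warnings

    # Hypothetical assignment: one role per person, moderator and recorder dedicated.
    assignment = {"Alice": Role.MODERATOR, "Bob": Role.RECORDER,
                  "Carol": Role.PRESENTER, "Dan": Role.REVIEWER, "Eve": Role.REVIEWER}
    print(check_role_assignment(assignment))   # -> []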

Have a moderator

As discussed above, the moderator plays a crucial role in keeping the review from losing focus. It's important that the moderator be focused on keeping the review on track; the moderator should not have reviewer responsibilities. The role of the moderator is to elicit discussion, ensure equal participation, and to defuse contention. This is a full-time task. Failure to moderate effectively causes reviews to drag on beyond their intended conclusion, and to fail to achieve their goals.

Keep the review meetings brief

Reviews are most effective when they are brief and focused on well-identified objectives. Because it is difficult to maintain focus for long periods, and because reviewers have other work to do as well, limit reviews to no more than two hours. If a review is expected to go longer, break it into several smaller and more focused reviews. Results will be better if reviewers can maintain focus.

The key to doing this is to have a well-identified agenda and clearly articulated goals. These should be communicated when the review materials are distributed, and the moderator should reinforce them when the review meeting begins. The moderator must then consistently (and sometimes ruthlessly) reinforce these goals during the meeting.
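
As a rough illustration of breaking a long review into smaller ones, the Python sketch below greedily packs agenda topics, each with an estimated duration, into meetings of at most two hours. The function and the example topics are hypothetical, not prescribed by the UPEDU.

    def split_agenda(topics: list[tuple[str, int]], limit_minutes: int = 120) -> list[list[str]]:
        """Greedily split (topic, estimated minutes) pairs into meetings of at most limit_minutes."""
        meetings: list[list[str]] = []
        current: list[str] = []
        used = 0
        for name, minutes in topics:
            if current and used + minutes > limit_minutes:
                meetings.append(current)          # close the current meeting and start a new one
                current, used = [], 0
            current.append(name)
            used += minutes
        if current:
            meetings.append(current)
        return meetings

    # Hypothetical agenda: the third topic does not fit and gets its own, more focused meeting.
    print(split_agenda([("Use-case model survey", 45), ("Key abstractions", 60), ("Persistence design", 50)]))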

Identify issues, don't fix problems

One of the major reasons why review meetings fail to achieve their intended results is that they have a tendency to degenerate into discussions of how a problem should be fixed. Fixing problems usually requires investigation and reflection; the format of the review is not an effective medium for this kind of discussion. Once the issue is identified, determine if it is a defect that must be resolved, and then assign it to someone to investigate and resolve. The review meeting should focus on identification only.

If the issue requires further discussion among a group of people, form a separate meeting to focus on that topic. Typically this meeting will require some investigation and preparation, and people with the right skills will need to be involved. The review should remain focused on identifying other issues. The moderator will often need to exert considerable will to keep the review meeting focused on this.

Taking Action on Review Results

The review is of little value if nothing comes of it. At the conclusion of the review (see the sketch after this list):

  • Prioritize the list of problems.
  • Create defects to track the problems and their resolution.
  • If additional investigation is required, assign a small team to research the problem (but not to solve it).
  • For problems that can be resolved in the current iteration, assign a person or team to fix the problem.
  • Feed the list of unresolved problems into future iteration planning efforts.
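
The Python sketch below is a minimal, hypothetical illustration of this follow-up: confirmed problems become prioritized defect records, the highest-priority ones are assigned for the current iteration, and the rest are fed into future iteration planning. The class and function names are assumptions for illustration only.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Defect:
        """A confirmed problem from the review, tracked to resolution."""
        description: str
        priority: int                      # 1 = highest
        needs_investigation: bool = False  # assign a small team to research, not to solve
        assignee: Optional[str] = None

    def dispatch(defects: list[Defect], capacity: int) -> tuple[list[Defect], list[Defect]]:
        """Split prioritized defects into work for this iteration and a backlog for future planning."""
        ordered = sorted(defects, key=lambda d: d.priority)
        return ordered[:capacity], ordered[capacity:]

    # Hypothetical outcome: two defects handled now, the remainder feeds iteration planning.
    this_iteration, backlog = dispatch(
        [Defect("Missing alternate flow in 'Register for course'", 1, assignee="Dan"),
         Defect("Inconsistent package naming", 3),
         Defect("Unclear persistence strategy", 2, needs_investigation=True)],
        capacity=2,
    )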

More information

See also [MCO97].
