
Planning for, Facilitating, and Evaluating Design Effectiveness

Publication No
RR233-11
Type
Academic Document
Publication Date
Dec 01, 2007
Pages
225
Research Team
RT-233
Abstract

Design Effectiveness is the degree to which the design effort helps achieve project value objectives. Simply put, effective design enhances project value. Design management fits within the larger framework of project management, and Design Effectiveness fits within the context of design management. Because Design Effectiveness exists largely within the design phase, other project phases, such as front-end development and construction management, were excluded from this study. Practices that promote Design Effectiveness are called Design Effectiveness Practices (DEPs), and the primary aim of this research was to develop a method for identifying suitable DEPs for a given project.

The research consisted of three segments. The first segment was an analysis of variance (ANOVA) of the CII Benchmarking database. The aim was to analyze the effect of design on project performance, using six dependent “Project Performance” metrics and seven independent “Design” metrics. Limitations of the database constrained the results; the most notable finding was an inverse relationship between “cost investment in design and pre-project planning” and “budget factor” in the contractor data set.
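
The sketch below illustrates, in Python, the kind of one-way ANOVA described above. The data file, column names, and the binning of the design-investment metric into low/medium/high groups are hypothetical assumptions for illustration, not the report's actual procedure.

import pandas as pd
from scipy import stats

# Hypothetical benchmarking extract: "design_investment_pct" stands in for
# cost investment in design and pre-project planning, and "budget_factor"
# for the cost-performance metric; both column names are assumptions.
projects = pd.read_csv("projects.csv")

# Bin the independent "Design" metric into low / medium / high groups.
projects["investment_group"] = pd.qcut(
    projects["design_investment_pct"], q=3, labels=["low", "medium", "high"]
)

# One-way ANOVA: does the mean budget factor differ across the groups?
groups = [g["budget_factor"].dropna()
          for _, g in projects.groupby("investment_group")]
f_stat, p_value = stats.f_oneway(*groups)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")

# Group means hint at the direction of the effect, e.g., whether higher
# design investment is associated with a lower budget factor.
print(projects.groupby("investment_group")["budget_factor"].mean())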

The second (and main) segment of the research was the development and validation of a Design Effectiveness Practices Selection Tool. The tool determines the priority of application of 30 different DEPs on a project given the project’s desired benefits (drawn from 11 Project Value Objectives), design phase, and unique characteristics. All 30 DEPs were correlated with the three input parameters using Objectives Matrices developed with expert opinion. The Objectives Matrices produce a score for each input parameter, and the three scores are combined into a Composite Index Score for each DEP. A high score indicates that a DEP is highly recommended for implementation, a medium score means it may be beneficial, and a low score indicates that it is not recommended. The main purpose of the tool is not to provide a definitive list of DEPs to implement, but to encourage discussion among the design team about DEP applicability using a structured methodology.
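
A minimal Python sketch of how such a ranking step might be organized is shown below. The equal-weight averaging of the three parameter scores, the cut-off values, and the example DEP scores are illustrative assumptions, since the abstract does not state the tool's exact combining formula.

from dataclasses import dataclass

@dataclass
class DEPScores:
    # Scores against the three input parameters; values are illustrative.
    name: str
    pvo_score: float        # fit with the project's desired value objectives
    phase_score: float      # fit with the current design phase
    char_score: float       # fit with unique project characteristics

    def composite_index(self) -> float:
        # Assumed equal-weight average of the three parameter scores.
        return (self.pvo_score + self.phase_score + self.char_score) / 3

def priority(score: float) -> str:
    # Assumed High / Medium / Low cut-offs for the Composite Index Score.
    if score >= 0.7:
        return "High - recommended for implementation"
    if score >= 0.4:
        return "Medium - may be beneficial"
    return "Low - not recommended"

deps = [
    DEPScores("Design for Constructability", 0.9, 0.8, 0.7),
    DEPScores("Design Productivity Tracking", 0.5, 0.6, 0.4),
    DEPScores("Standard Design Delivery Process", 0.3, 0.2, 0.4),
]

# Rank DEPs by Composite Index Score to seed the design team's discussion.
for dep in sorted(deps, key=lambda d: d.composite_index(), reverse=True):
    score = dep.composite_index()
    print(f"{dep.name}: {score:.2f} ({priority(score)})")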

The Selection Tool was validated in a two-step process. The first step compared the results of a Manual Selection Process (which used a similar but more basic approach than the tool) with the results of the Selection Tool given the same inputs. The comparison was performed on the basis of a top-10 match rate and a four-rank consistency rate; of the 12 participants selected for this survey, six completed this step. The second step was a phone interview with the participants to clarify the differences between the rankings analyzed earlier; five of the six completed it. The results indicated that users disagreed with the appropriateness of a DEP 14% of the time, and disagreed with the rankings within the top 10 only 32% of the time. Respondents appreciated the tool for providing a second opinion to the team, although some found it of limited use on very small or highly specialized projects.
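
Below is a small Python sketch of the two comparison measures, under the assumption that the top-10 match rate is the overlap between the two top-10 lists and that the consistency rate counts DEPs whose manual and tool ranks differ by no more than four positions. Both interpretations are illustrative readings of the abstract, not the report's exact definitions.

def top10_match_rate(manual_ranking: list[str], tool_ranking: list[str]) -> float:
    """Fraction of the tool's top 10 DEPs that also appear in the manual top 10."""
    manual_top = set(manual_ranking[:10])
    tool_top = set(tool_ranking[:10])
    return len(manual_top & tool_top) / 10

def rank_consistency_rate(manual_ranking: list[str], tool_ranking: list[str],
                          tolerance: int = 4) -> float:
    """Fraction of DEPs whose manual and tool ranks differ by at most `tolerance`."""
    manual_pos = {dep: i for i, dep in enumerate(manual_ranking)}
    consistent = sum(
        1 for i, dep in enumerate(tool_ranking)
        if abs(manual_pos[dep] - i) <= tolerance
    )
    return consistent / len(tool_ranking)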

The third segment of this research was the development of a Design Effectiveness Evaluation Tool. This tool divides each of the 11 Project Value Objectives (PVOs) into sub-criteria and asks the user to evaluate each sub-criterion’s degree of applicability to the project. The tool also asks the user to identify the project phase during which the evaluation is performed, since some sub-criteria are inapplicable during certain phases. The tool combines the evaluation timing and sub-criterion scores to produce a score for each PVO, and the PVO scores are then combined into a composite Design Effectiveness Evaluation Score for the project.
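
The roll-up described above can be sketched in Python as follows; the weighted-average formulas, data layout, and example values are assumptions for illustration and are not taken from the report.

def pvo_score(sub_criteria: list[dict], phase: str) -> float:
    """Weighted average of the sub-criterion assessments that apply in this phase."""
    applicable = [s for s in sub_criteria if phase in s["applicable_phases"]]
    if not applicable:
        return 0.0
    total_weight = sum(s["weight"] for s in applicable)
    return sum(s["weight"] * s["assessment"] for s in applicable) / total_weight

def composite_de_score(pvos: dict[str, dict], phase: str) -> float:
    """Importance-weighted average of the PVO scores (assumed combining rule)."""
    total_importance = sum(p["importance"] for p in pvos.values())
    return sum(
        p["importance"] * pvo_score(p["sub_criteria"], phase)
        for p in pvos.values()
    ) / total_importance

# Hypothetical input for one PVO; a full run would cover all 11 PVOs.
example_pvos = {
    "Cost Effectiveness": {
        "importance": 0.9,
        "sub_criteria": [
            {"weight": 2, "assessment": 0.8,
             "applicable_phases": {"detailed design", "construction"}},
            {"weight": 1, "assessment": 0.6,
             "applicable_phases": {"construction"}},
        ],
    },
}
print(composite_de_score(example_pvos, phase="detailed design"))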

This research report includes appendices with supporting information used during the research, including a descriptive catalog of the 30 Design Effectiveness Practices, application tool manuals, a listing of critical design quality/productivity drivers, validation survey forms, and detailed application descriptions for three DEPs (Standard Design Delivery Process, Design Productivity Tracking, and Design for Constructability).

The main conclusions of this study were that investment in design and pre-project planning tends to reduce budget over-runs; that the DEP Selection Tool recommends a list of DEPs for the design team to discuss; that the tool’s validation indicated it achieves its intended purpose; that the information provided by the tool should be shared at the organization level to encourage discussion; and that the key to Design Effectiveness lies in proper design management. Recommendations include further refinement of the DEP Selection Tool, updating the tool as new DEPs surface, and collecting more design-related metrics in the CII Benchmarking database.

Key Findings

RR233-11, Design Effectiveness Evaluation Tool

An automated tool created to assist in evaluating design effectiveness on capital facility projects. The tool provides guidance in assessing how well a project is meeting its desired objectives and criteria for design effectiveness, and computes a design effectiveness performance score based on these five project inputs: (RR233-11, p. 79)

  • Timing of DE evaluation
  • Relative importance of 11 different PVOs
  • Selected or screened sub-criteria associated with each PVO
  • Assessments of individual sub-criteria
  • Significance weightings of evaluation sub-criteria associated with targeted PVO
Filters & Tags
Research Topic
Planning for, Facilitating & Evaluating Design Effectiveness
Keywords
Design Effectiveness, Maximizing Engineering Value, Project Value Objectives, Design Effectiveness Evaluation Tool, Value through Design, rt233