Medical Construction & Design

JAN-FEB 2015

Medical Construction & Design (MCD) is the industry's leading source for news and information and reaches all disciplines involved in the healthcare construction and design process.


associated points, scoring methods and ranking process) were described in detail in the request for proposal documents. The facility owner's intent was further communicated in a self-scoring manual, a series of collaborative meetings and a request-for-clarification process.

After submission, several multidisciplinary subcommittees separately evaluated technical compliance specifications and scored elements. Proposals were first checked against the compliance specifications; only compliant proposals were evaluated, using three scoring methods: direct measurement of drawings, comparison against defined criteria and subjective evaluation using rating scales. The majority (88 percent) of the scored elements were objectively evaluated with pre-determined measurement procedures. The relatively subjective elements were scored on a structured rating scale. An audit committee verified the scoring by rescoring a subset of the elements; higher reliability was found for the objective methods. Finally, the points earned by each proposal were translated into a credit that reduced the net present cost of building construction and maintenance, which was used to determine the final ranking of proposals (i.e., proposals with lower net present cost were ranked higher).

Two cost-benefit analyses were conducted during the design evaluation and submission process. The first occurred when the owner established the scored elements and the value of each point: "cost" as the premium paid for design improvements and "benefit" as the operational savings from anticipated improvements in healthcare outcomes. The second occurred when the bidding team chose specific design features: "cost" as the extra construction and maintenance cost of the selected features and "benefit" as the competitive advantage gained through the reduced net present cost.
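The credit-based ranking described above can be sketched in a few lines. Every number below (point totals, the dollar credit per point, and the net present costs) is an illustrative assumption, not a figure published for this project, and the team names are hypothetical.

```python
# Sketch of the ranking mechanism: points earned become a credit that
# reduces each proposal's net present cost (NPC), and proposals are
# ranked by the credit-adjusted NPC, lowest first.
# All figures are hypothetical; the article does not publish actual values.

CREDIT_PER_POINT = 50_000  # assumed dollar credit per scored point

proposals = {
    # name: (points scored, net present cost of construction + maintenance)
    "Team A": (120, 310_000_000),
    "Team B": (95, 298_000_000),
    "Team C": (140, 322_000_000),
}

def adjusted_npc(points, npc, credit_per_point=CREDIT_PER_POINT):
    """Translate earned points into a credit that reduces net present cost."""
    return npc - points * credit_per_point

# Lower credit-adjusted net present cost ranks higher.
ranking = sorted(proposals, key=lambda name: adjusted_npc(*proposals[name]))

for rank, name in enumerate(ranking, start=1):
    points, npc = proposals[name]
    print(rank, name, adjusted_npc(points, npc))
```

The sketch also shows the bidding team's side of the second cost-benefit analysis: a design feature is worth adding only when its extra cost is below the credit its points would earn.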
Impact on design

The empirical evaluation process significantly impacted bid preparation, especially for the designers. The winning team went through multiple design optimization iterations to maximize the points scored at the lowest capital and maintenance costs. During the process, it became clear that the scored elements varied in the financial reward for effort expended; some scored elements were perceived as more difficult to achieve because of a higher cost per point gained. Because of the increased transparency and objectivity, the bidding teams were able to self-score and choose design features whose cost was lower than the expected credit.

Compared with a similar project using typical compliance specifications, the design team went through significantly more design iterations, possibly due to the newness of the process and its added complexity. The new evaluation process drove design teams to optimize their designs to the greatest extent, but also challenged them to more efficiently find the best solution to the facility owner's key goals. The self-scoring process seemed time consuming and labor intensive, and it delayed feedback for subsequent rounds of design iterations.

Perceived benefits

After the evaluation process concluded, a third-party research team (not involved with the bid evaluation) interviewed the facility owner, consultants and the winning team to document perceptions and further inform improvement and application of the empirical evaluation process. Overall, the process was perceived to produce better design results, with some limitations:

Improved design quality: The relative importance of various organizational goals was articulated more clearly, in a quantifiable manner, and better communicated to design teams up front.
The new process was effective in encouraging design teams to improve quality by using research evidence for issues the facility owner recognized as strategic drivers.

Expectations: Most stakeholders expected better healthcare outcomes and overall savings in post-occupancy operations as a result of the process, even though confidence levels varied across the scored elements (e.g., more certainty about the benefits of travel efficiency than of natural daylight). It was also recognized that better outcomes might not automatically translate into long-term operational savings because of interactions with other factors (e.g., changes in operational policies or the model of care).

Involvement: Architects and other designers believed their roles in developing solutions increased. However, they perceived a need to improve the design work process to more effectively address the new evaluation process.

Room for improvement

According to the stakeholders, improvement of the evaluation process might focus on the following aspects:

The collaborative interaction process: Provide clearer messaging of design requirements and compliance, faster turnaround of request-for-information responses, and the rationale and research evidence behind the scored elements collected by the owner.

Scored elements: Reduce the number of key elements. Too many scored elements may dilute the incentive and place a significant burden on design teams.

Supportive research: Produce more research to support decision-making. The availability of supportive research varied significantly across evaluation elements.

Relatively subjective design elements: Further reduce the weight of these elements (such as aesthetics), which were associated with less reliable scoring despite an effort to make the scoring more objective and transparent.

Post-bid revisions: Reduce conflicts between stated goals and reality, which were found to result in changes following the contract award.
The design evaluation process was an innovative way of advancing Evidence-Based Design in the pre-design phase and improving project quality by connecting design to operational issues. Although it was tested on one project with a specific design, the lessons learned can be applied to a variety of healthcare building projects. Advancing this thought process in multiple contexts can begin to shift thinking away from facility design, construction and renovation as a "sunk cost" and toward framing projects as strategic investments with long-term implications over the life of the asset.

Mike Marasco is CEO of Plenary Concessions at The Plenary Group. Leslie Gamble is clinical coordinator of capital planning and projects at the Interior Heart Surgical Centre. Xiaobo Quan, Ph.D., EDAC, is senior researcher at The Center for Health Design.

Editor's Note: This article is partially based on findings from a case study conducted by The Center for Health Design.
