

Addressing Five Key Questions to Avoid Project Cost Overruns

Professor Bent Flyvbjerg is a renowned expert in the field of project predictability and project cost overruns. He recently co-authored an article entitled “Five key questions about cost overrun.”

According to the article, good practice entails:

(a) Consistent definition and measurement of cost overruns; in contrast to mixing inconsistent baselines, price levels, etc.

(b) Data collection that includes all valid and reliable data; as opposed to including idiosyncratically sampled data, data with removed outliers, non-valid data from consultancies, etc.

(c) Recognition that cost overrun is systemically fat-tailed; in contrast to understanding overrun in terms of error and randomness.

(d) Acknowledgment that the root cause of cost overrun is behavioral bias; in contrast to explanations in terms of scope changes, complexity, etc.

(e) De-biasing cost estimates with reference class forecasting or similar methods based in behavioral science; as opposed to conventional methods of estimation, with their century-long track record of inaccuracy and systemic bias.

He characterizes bad practice as violating at least one of these five points.

Where to Start

Making good practice an actuality rather than just a theory can be daunting, resource intensive, and costly. Conversely, all you need to perpetuate bad practice is an Excel spreadsheet and an Excel enthusiast! To enable good practice, you need the right mix of people, processes, and enabling technology – all of which require investment. Obtaining investment requires executive commitment. This means you need a clear understanding of the benefits and an unwavering belief you can achieve them. The good news is that it doesn’t have to happen all at once. To start, select a platform you can grow into, not grow out of. Then, move at the pace best suited to your priorities and organization.

Here, I’ll explain how investment in the right Enterprise Project Performance (EPP) technology will enable successful implementation of the five best practices covered in Professor Flyvbjerg et al.’s paper.

Consistent Definition of Cost Overruns

(a) Consistent definition and measurement of cost overruns; in contrast to mixing inconsistent baselines, price levels, etc.

This is perhaps the easiest of the five to overcome: if you want to succeed, kill the baseline. Calculating the cost overruns of many projects against dissimilar baselines creates the opportunity to distort the picture. Switching to a transactional approach to estimating and forecasting ensures overruns are measured from the first estimate, whenever it was created and for whatever reason. Adjustments in subsequent estimates and forecasts must never delete prior transactions. Time-slicing this transaction set retains the ability to review a “baseline” at any point in time. This method, and the resulting data set, supports timeliness metrics in addition to outcome metrics, i.e. the ability to see when the outcome was known, not just what it was (see point (c) for elaboration on these timeliness metrics).
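To make the transactional approach concrete, here is a minimal sketch in Python with invented figures and illustrative names (CostTransaction and forecast_as_of are not any particular product’s data model). Every estimate adjustment becomes an immutable record, and a “baseline” at any date is simply a time-slice of the ledger:

from dataclasses import dataclass
from datetime import date

# Illustrative transaction ledger: every estimate adjustment is a new,
# immutable record; nothing is ever deleted or overwritten.
@dataclass(frozen=True)
class CostTransaction:
    posted: date        # when the adjustment was recorded
    amount: float       # incremental change to the forecast cost
    reason: str         # e.g. "initial estimate", "scope change", "re-forecast"

ledger = [
    CostTransaction(date(2024, 1, 15), 10_000_000, "initial estimate"),
    CostTransaction(date(2024, 6, 1),   1_200_000, "scope change"),
    CostTransaction(date(2024, 9, 30),    800_000, "productivity re-forecast"),
]

def forecast_as_of(transactions, as_of):
    """Time-slice the ledger: sum every transaction posted on or before as_of."""
    return sum(t.amount for t in transactions if t.posted <= as_of)

first_estimate = forecast_as_of(ledger, date(2024, 1, 15))   # 10.0M
current        = forecast_as_of(ledger, date(2024, 12, 31))  # 12.0M
overrun_pct    = (current - first_estimate) / first_estimate * 100
print(f"Overrun measured from first estimate: {overrun_pct:.1f}%")

Because nothing is ever deleted, the overrun measured from the first estimate can be recomputed, audited, and dated at any time.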

None of this diminishes the importance of the points in time when funding was approved on the basis of estimated benefits; it merely switches the focus from outcomes to forecast timeliness and its ability to influence cost reduction. Reducing the cost of projects is the ultimate goal, not proving whether the information presented at the time the project investment was approved was accurate. Without cost reduction, such analyses amount to shuffling deck chairs on the Titanic, i.e. it doesn’t matter what you prove, we’re all going down.

Integrated Data Collection is Key

(b) Data collection that includes all valid and reliable data; as opposed to including idiosyncratically sampled data, data with removed outliers, non-valid data from consultancies, etc.

It is important to integrate core data sets natively (estimates, budgets, changes, risks, etc.) or automatically (commitments, actuals, timesheets, etc.). If predictions are not powered by digital transformation, progress measurement is open to corruption by human intervention. Data housed in design, engineering, procurement, and construction management systems must be automatically incorporated into productivity/earned value analyses on a regular schedule. Don’t allow the results to be overridden or massaged. Make decisions based on the quantitative data, in addition to qualitative understanding and experience.
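As an illustration of what “don’t allow the results to be massaged” means in practice, the sketch below applies the standard earned value formulas directly to figures that would be pulled automatically from source systems; the numbers here are invented:

# Standard earned value formulas applied to figures pulled automatically
# from source systems (numbers are invented for illustration).
planned_value = 4_500_000   # PV: budgeted cost of work scheduled to date
earned_value  = 4_000_000   # EV: budgeted cost of work actually performed
actual_cost   = 4_800_000   # AC: actual cost of work performed (from ERP actuals)
budget_at_completion = 12_000_000  # BAC: total approved budget

cpi = earned_value / actual_cost      # cost performance index
spi = earned_value / planned_value    # schedule performance index
eac = budget_at_completion / cpi      # one common estimate-at-completion formula

print(f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac:,.0f}")

The point is that CPI, SPI, and the resulting forecast should be computed directly from the integrated data, not adjusted by hand afterwards.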

Limit Cost Overruns with Systematic Improvements

(c) Recognition that cost overrun is systemically fat-tailed; in contrast to understanding overrun in terms of error and randomness.

Systemically means relating to a system, i.e. the system of people, processes, and technologies through which projects are planned and executed. My least favorite expression in our industry is: a fool with a tool is still a fool. First of all, how demeaning to the professionals in our industry! More pertinently, the phrase diminishes the role of technology in project systems and completely misunderstands how technology fits within them.

Technology only succeeds when it empowers skilled, experienced people to efficiently employ well-defined processes. That is to say, the project systems trinity can only be successful when its three elements are employed in equal measure. The key is to deploy a full lifecycle platform across the enterprise and to forever banish the use of Excel, homegrown systems, and commercial point solutions. These are the productivity killers of the engineering and construction world, perpetuated by spreadsheet and database enthusiasts who are resistant to change or unreasonably skeptical of enterprise technology vendors.

In 2018, the only viable option is a two-enterprise-system strategy, ERP plus EPP (Enterprise Project Performance), powered by a variety of best-in-class design, engineering, procurement, and construction management tools. An EPP platform carries a project from its inception through evaluation, prioritization, planning, estimating, scheduling, funding/budgeting, contract/change/risk/issue management, performance (progress/productivity/earned value) management, forecasting, close-out, and benchmarking. Until recently, only the lack of such technology prevented companies from adopting such an obvious strategy. In 2018 there are no longer any excuses for not making the necessary investments in systemic improvements. There is never a bad time to do it.

Change Behaviors to Reduce Cost Overruns

(d) Acknowledgment that the root cause of cost overruns is behavioral bias; in contrast to explanations in terms of scope changes, complexity, etc.

According to Professor Flyvbjerg: “The root cause of cost overrun, according to behavioral science, is the well-documented fact that planners and managers keep underestimating scope changes and complexity in project after project.”

CII Research Team 291 stresses the importance of adopting predictability metrics to change the emphasis from outcomes to timeliness. When an organization measures and incentivizes performance based on the timeliness of its forecasts, stakeholders are motivated to estimate accurately and reveal true predictions sooner. This allows time for corrective strategies not typically afforded by an outcome-centric focus.

RT-291’s metrics are normalized on percentage scales. This means you can compare any number of projects, regardless of type, size, location, time period, or any other factor. Normalization also supports analyses by any dimension, allowing institutionalized trouble spots to surface, whether by region, industry sector, business line, project type, project team, or even individual team member. Timeliness correlates with competence and/or transparency because it takes effective people, processes, and technology to provide accurate forecasts. Additionally, you need cultural integrity to communicate the full picture, thereby enabling timely, cost-reducing corrective actions.
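RT-291 publishes its own metric definitions; the sketch below is only a hypothetical illustration (invented formula and figures, not RT-291’s) of why normalizing onto a percentage scale lets projects of any size and duration be compared on the same dial:

# Hypothetical illustration of a percentage-normalized timeliness measure.
# NOT RT-291's published formula: it only shows why normalization lets
# projects of any size or duration be compared on one scale.
def timeliness_pct(forecasts, final_cost, duration_months, tolerance=0.05):
    """Percent of project duration elapsed before the forecast first came
    within `tolerance` of the final outcome (lower = earlier = better)."""
    for month, forecast in sorted(forecasts.items()):
        if abs(forecast - final_cost) / final_cost <= tolerance:
            return 100 * month / duration_months
    return 100.0  # never converged before completion

# month -> forecast cost for two projects of very different sizes ($M)
project_a = {3: 10.0, 9: 11.5, 15: 12.4, 21: 12.5}   # 24-month project
project_b = {6: 400, 18: 480, 30: 512, 42: 520}      # 48-month project

print(timeliness_pct(project_a, final_cost=12.5, duration_months=24))  # 62.5
print(timeliness_pct(project_b, final_cost=520,  duration_months=48))  # 62.5

In this invented example, two projects of very different size and duration score identically, because both only converged on the true outcome about 60% of the way through.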

Connecting Estimating with Comprehensive Reference Data

(e) De-biasing cost estimates with reference class forecasting or similar methods based in behavioral science; as opposed to conventional methods of estimation, with their century-long track record of inaccuracy and systemic bias.

Many organizations lack comprehensive cost databases because they control projects in Excel spreadsheets or limited, inflexible database applications. Controlling every project in an adaptable Enterprise Project Performance (EPP) platform that incorporates key attributes and measurements supports semi-automated collation of reference data. This data is reusable on future project estimates within the same platform. As described in item (c), timeliness and outcome metrics can also be captured automatically, supporting correlation with systemic issues that must be addressed prior to commencing the next project.
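In outline, reference class forecasting takes the empirical distribution of overruns from comparable completed projects and uplifts a new estimate to the percentile corresponding to an acceptable chance of overrun. Here is a minimal sketch, with invented historical figures and a deliberately simplified empirical quantile:

# Minimal sketch of percentile-based reference class forecasting.
# Historical overruns are invented; a real reference class would be drawn
# from the attributes and outcomes captured in the EPP platform.
historical_overruns = [0.02, 0.05, 0.08, 0.10, 0.12, 0.15, 0.20,
                       0.25, 0.35, 0.50, 0.70, 1.10]  # fraction over first estimate

def rcf_uplift(overruns, acceptable_chance_of_overrun=0.2):
    """Uplift such that only acceptable_chance_of_overrun of reference
    projects exceeded it (i.e. the (1 - p) empirical quantile)."""
    ranked = sorted(overruns)
    index = min(len(ranked) - 1, int((1 - acceptable_chance_of_overrun) * len(ranked)))
    return ranked[index]

raw_estimate = 25_000_000
uplift = rcf_uplift(historical_overruns, acceptable_chance_of_overrun=0.2)
debiased = raw_estimate * (1 + uplift)
print(f"Uplift: {uplift:.0%}  De-biased estimate: {debiased:,.0f}")

The EPP platform’s role is to supply the reference class automatically, from the attributes and outcomes it already captures, rather than relying on whichever spreadsheet happens to be at hand.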

Perhaps the least encouraging, though most entertaining, aspect of Professor Flyvbjerg et al.’s paper is the raging war with Love and Ahiaga-Dagbui. That said, my favorite Flyvbjerg rebuttal is “We have on our side…Nobel-Prize-winning theory on heuristics and biases…” Perhaps I should tread carefully! In section 3, table 1, there is an essentially puerile debate over who had the largest dataset, among other arguments, ruminating on whether 383 or 258 is the ideal number of projects in the statistical sample (more deck chair shuffling).

But consider this: the industry-leading Enterprise Project Performance (EPP) technology provider has customers with in excess of 50,000 projects in a single database. Across all of its customers, there exist more than one million projects. Once collated via automated cloud orchestration (not without its challenges, but eminently achievable), this will be the largest dataset of reference projects in the history of mankind. The resulting findings will be so statistically significant as to border on absolute!

Conclusion

Finally, despite my appreciation of the article and respect for Professor Flyvbjerg, the continued focus on outcome variance analysis (overruns) without consideration of the timeliness of predictions is concerning. I advocate the adoption of CII RT-291’s timeliness metrics as a means of reversing this trend and positively modifying organizational culture and behavior to better support cost reduction strategies.

For more information, here’s a link to a recent webinar I presented on the “Pillars of Project Predictability,” further expanding on how to deliver these best practices within your organization, including a demonstration of RT-291’s timeliness metrics.

About the Author

Mark White has almost 30 years’ experience in quantity surveying, project management and controls, and technology innovation. He enjoys helping companies develop strategies to achieve the ultimate goals of highly efficient predictability and control for project portfolios. Mark is currently the Senior Vice President with global responsibility for EcoSys product strategy at Hexagon PPM, and has been successful in helping organizations improve Enterprise Project Performance (EPP) through EcoSys. Previously, Mark was VP of Strategic Accounts at EcoSys. Before that, he was CIO/SVP at Faithful+Gould, a world-leading project management and controls consultancy, having founded their IT Consulting business unit and project controls software.