Accounting for Uncertainty in Asset Performance

Performance modeling uses historic data to estimate future performance; however, not all future events are predictable, nor is past performance necessarily a reliable predictor of future performance. This section considers how uncertainty can be introduced into the analysis.

The unpredictability of future events introduces uncertainty into prediction models. Additionally, the amount of uncertainty tends to increase with time, so its effects compound. As outlined in the previous section, probabilistic modeling is one approach to accounting for uncertainty, but what level of uncertainty is acceptable?
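The growth of uncertainty with time can be illustrated with a small Monte Carlo sketch. The condition scale, starting condition, deterioration rate, and noise level below are all illustrative assumptions, not values from the guide; the point is only that the spread of the forecast widens the further out it is projected.

```python
import random
import statistics

def forecast_condition(years, runs=5000, start=9.0,
                       annual_loss=0.3, sigma=0.1, seed=42):
    """Simulate a condition index (10 = new, 0 = failed) whose annual
    deterioration rate is drawn at random each year. Hypothetical
    parameters chosen for illustration only."""
    rng = random.Random(seed)
    finals = []
    for _ in range(runs):
        c = start
        for _ in range(years):
            # Annual loss varies randomly; clip so condition never improves
            c -= max(0.0, rng.gauss(annual_loss, sigma))
        finals.append(max(c, 0.0))
    return statistics.mean(finals), statistics.stdev(finals)

m5, s5 = forecast_condition(5)
m20, s20 = forecast_condition(20)
print(f" 5-year forecast: mean {m5:.2f}, spread {s5:.2f}")
print(f"20-year forecast: mean {m20:.2f}, spread {s20:.2f}")
```

Because each year's random deviation accumulates, the standard deviation of the 20-year forecast is roughly twice that of the 5-year forecast, which is the compounding effect described above.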

To minimize uncertainty, an agency must first understand the source of the uncertainty. A common type of uncertainty related to asset management is the behavior of the assets themselves. Because of advances in technology and knowledge, and differences in materials and construction practices, there can be significant differences in performance between otherwise similar assets. The change in behavior can be positive, such as the introduction of epoxy-coated reinforcing steel in bridge decks to delay the onset of corrosion from road salt intrusion, or the introduction of Superpave and performance graded asphalt binders to reduce pavement cracking and rutting. Other changes in behavior are harder to predict, such as the impact of salt intrusion on prestressed, post-tensioned concrete box-beam bridges. Other sources of uncertainty include:

  • Weather events, e.g. flooding, drought, or freeze-thaw
  • Earthquakes
  • Climate change
  • Traffic accidents
  • Data inaccuracies
  • Inaccurate models
  • Poor assumptions

Uncertainty caused by variability in the data can often be addressed through the development of quality assurance plans that describe the actions an agency has established to ensure data quality, whether the data is collected in-house or by a contractor. Common quality assurance techniques include documented policies and procedures to establish data quality tolerance limits, independent reviews of collected data, and training of data collection crews. Data management strategies are discussed in more detail in Chapter 7.
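One common quality assurance technique mentioned above, independent review against tolerance limits, can be sketched as a simple comparison of contractor-reported values to an agency re-survey of control sections. The measure (IRI), segment values, and the ±10 in/mi tolerance are illustrative assumptions.

```python
def qa_flags(reported_values, resurvey_values, tolerance):
    """Return indices of control segments where the reported value
    differs from the independent re-survey by more than the agreed
    tolerance. Tolerance and field choices are illustrative."""
    flags = []
    for i, (a, b) in enumerate(zip(reported_values, resurvey_values)):
        if abs(a - b) > tolerance:
            flags.append(i)
    return flags

# Hypothetical example: IRI (in/mi) on five control segments
reported   = [92, 110, 135, 88, 150]
resurveyed = [95, 124, 133, 90, 163]
print(qa_flags(reported, resurveyed, tolerance=10))
```

Segments that fail the check would trigger the documented follow-up actions in the agency's quality assurance plan, such as re-collection or crew retraining.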

To evaluate the accuracy of models and assumptions, agencies can include multiple scenarios in their life cycle planning analysis to test the impact of different decisions. This type of sensitivity analysis can be helpful in identifying areas in need of further research or developing contingency plans if the initial assumptions turn out to be inaccurate.
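A minimal form of the sensitivity analysis described above is to vary one model assumption, here the deterioration rate, across scenarios and compare the resulting treatment timing. The condition scale, threshold, and ±25% scenario bounds are assumptions made for illustration.

```python
def years_to_threshold(start, annual_loss, threshold):
    """Years until the condition index first drops to or below the
    treatment threshold, under a constant assumed deterioration rate."""
    years = 0
    c = start
    while c > threshold:
        c -= annual_loss
        years += 1
    return years

# Hypothetical scenarios: deterioration rate varied +/-25% around the
# base estimate to test how sensitive treatment timing is to the model.
scenarios = {"optimistic": 0.225, "base": 0.300, "pessimistic": 0.375}
for name, rate in scenarios.items():
    y = years_to_threshold(start=9.0, annual_loss=rate, threshold=5.0)
    print(f"{name:12s} rate={rate}: treatment needed in year {y}")
```

If the predicted treatment year swings widely between scenarios, that assumption is a candidate for further research or a contingency plan; if it barely moves, the uncertainty may be tolerable.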

To understand whether time and effort should be invested in minimizing uncertainty, a risk-based approach can be used. Assuming the consequence arising from a defined issue or event remains the same, the data collection cost of reducing uncertainty can be investigated. As an example, the condition state of an asset, as determined through visual inspection, may not provide the required level of insight, resulting in poor or uncertain treatment decisions. To minimize the uncertainty, extra testing can be carried out. The level of testing would be defined by the risk-cost reduction ratio. Similarly, with climate change, how much would have to be invested in studies to understand the effects on asset longevity? Thus, through risk management, an agency determines which risks are tolerable and which must be actively managed through investigations, studies, and other research. The risks are identified, prioritized, and tracked using a risk register (see Chapter 2). For those risks that should be managed, plans are developed to outline actions that will be taken to mitigate threats or take advantage of opportunities, as discussed in Chapter 6.
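One way the risk-cost reduction ratio above could be operationalized is to rank candidate investigations by expected risk reduction per dollar of study cost. The candidate names, probabilities, consequence values, and costs below are invented for illustration, not figures from the guide.

```python
def prioritize(candidates):
    """Rank candidate studies/investigations by the ratio of expected
    risk reduction (drop in probability times consequence cost) to the
    cost of the study. All inputs are hypothetical."""
    def ratio(r):
        reduction = (r["p_before"] - r["p_after"]) * r["consequence"]
        return reduction / r["study_cost"]
    return sorted(candidates, key=ratio, reverse=True)

candidates = [
    {"name": "NDT on box-beam bridges", "p_before": 0.20, "p_after": 0.05,
     "consequence": 2_000_000, "study_cost": 150_000},
    {"name": "Climate-impact study",    "p_before": 0.10, "p_after": 0.08,
     "consequence": 5_000_000, "study_cost": 400_000},
]
for r in prioritize(candidates):
    print(r["name"])
```

Candidates with a ratio near or below one would likely be tracked in the risk register as tolerable rather than actively investigated.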