Asset management utilizes performance management to set objectives, define measures, establish targets, and monitor results. Transportation Performance Management (TPM) relies on the TAM principles and process to help achieve the agency’s broader goals and objectives.
Relationship to Federal TPM Activities
The Moving Ahead for Progress in the 21st Century Act (MAP-21), enacted in 2012, established a performance-based program intended to focus Federal Aid highway program and public transportation system (e.g., bus, light rail, and ferry) investments on national transportation goals. It was also intended to increase accountability and transparency in the use of federal transportation funds, as well as improve project decision-making through the strategic use of system performance information. The performance-based provisions of MAP-21 were retained in the FAST Act in 2015.
TPM is defined by FHWA as a strategic approach to making investment and policy decisions to achieve national performance goals using system information in accordance with rules established by the Department of Transportation (see Figure 2.3). The FHWA recognizes asset management as the application of TPM to manage the condition of infrastructure assets needed to provide for mobility and safety in the nation’s transportation system. In short, the FHWA refers to asset management as the engine driving infrastructure performance.
Figure 2.3 FHWA’s Strategic Approach to TPM
Source: FHWA TPM Homepage. 2019. https://www.fhwa.dot.gov/tpm/
Asset management plans document the processes and investment strategies developed by an agency to manage its infrastructure assets. These asset management plans support an agency’s performance-based planning and programming processes for making long-term investment decisions and feed shorter-term project and treatment selection activities. Together, these activities ensure the investment decisions of an agency are aligned with performance objectives and goals.
The TPM provisions for highways included in federal law are implemented in accordance with rulemakings organized around the following six elements:
- National goals – focusing the Federal Aid highway program on the seven areas listed below:
- Safety
- Congestion reduction
- System reliability
- Environmental sustainability
- Freight movement and economic vitality
- Infrastructure condition
- Reduced project delivery delays
- Measures – assessing performance or condition in carrying out the TPM-based Federal Aid highway program
- Targets – funding recipients are required to document future performance expectations under a fiscally-constrained environment
- Plans – identifying strategies and investments for addressing performance needs
- Reports – documenting progress toward target achievement and investment effectiveness
- Accountability and transparency – requiring federal funding recipients to achieve or make significant progress toward targets
TPM Relationship with TAM
There is a close relationship between TPM and TAM, since both consider asset and system performance, risks and available resources to achieve desired objectives over time. Both rely on a strategic approach, using data to make investment and policy decisions in order to achieve performance objectives. Internationally, there is less distinction between asset management and performance management, with the IAM defining asset management as encompassing the “balancing of cost, opportunities and risks against the desired performance of assets to achieve the organizational objectives.” In the United States, TAM applies to the technical and financial decisions, plans and actions related to physical infrastructure, while TPM considers a broad range of system performance categories.
A graphic illustrating the integration of asset management and performance management is provided in Figure 2-4. In the figure, the circle on the left represents the interconnection of the various performance areas that transportation agencies are concerned with throughout their planning processes. Flowing into the performance circle is the asset management circle, representing an agency’s infrastructure needs to support system performance.
The FHWA’s Expert Task Group (ETG) published a white paper explaining the relationship between asset management and performance management. It acknowledges the performance of a transportation system is dependent on many factors, including operational characteristics, and system usage and demand, in addition to the physical condition of the infrastructure assets. The paper explains that “performance management focuses on how policies, resource allocation, and other decisions affect all aspects of system performance including safety, operations, environmental stewardship, and infrastructure condition.” (FHWA 2012) Asset management is described as an application of performance management principles with a long-term focus to manage the performance of infrastructure assets, the resources allocated to operate a transportation system, and the investments made to achieve the agency’s long-term goals and objectives.
Figure 2.4 Integration of Performance Management and TAM
Source: NHI 136106A, Introduction to Transportation Asset Management. 2019
For example, the Province of British Columbia established the tiered structure shown in Figure 2.5 for a design-build-finance-operate project. The tiered approach helped align stakeholders at all levels and clarified priorities for all parties.
Performance Management Framework
To support the alignment of agency policies, objectives and day-to-day practices, agencies may establish a tiered performance management framework, such as the example illustrated below for a model Design-Build-Finance-Maintain-Operate (DBFMO) project (Figure 2.5). The highest of the three levels, Key Performance Measures, defines the high-level outcomes for service delivery in terms of a few key strategic areas. The second level, Asset Preservation Performance Measures, defines the minimum acceptable condition levels for each of the individual assets to preserve their value. The third level, Operational Performance Measures, corresponds to the many specific requirements for operating and maintaining the highway in a safe manner on a day-to-day basis.
Further discussion on Performance Management Frameworks, defining Performance Measures and Performance Targets is included in Chapter 6.
Managing transportation assets entails managing risk. This includes day-to-day concerns, such as addressing the risk that assets will deteriorate faster than expected or projects will cost more than budgeted. However, managing risk also involves enterprise-level risks with widespread impacts.
FHWA defines risk and risk management, in the context of transportation asset management, as follows:
- Risk: The positive or negative effects of uncertainty or variability upon agency objectives. (23 CFR 515.5)
- Risk Management: The processes and framework for managing potential risks, including identifying, analyzing, evaluating, and addressing the risks to assets and system performance. (23 CFR 515.5)
Considering risk is important in developing TAM strategies, because transportation agencies often must spend significant resources responding to and/or mitigating risks. Reacting to the uncertainty presented by risks can be more expensive than proactive management. Risk management strengthens asset management by explicitly recognizing that any objective faces uncertainty, and by identifying strategies to reduce uncertainty and its effects. Being proactive, rather than reactive, in managing risk and avoiding “management by crisis,” helps agencies best use available resources to minimize and respond to risk as well as further build public trust.
Given the importance of risk management for supporting asset management, agencies should formally identify and manage risks at all organizational levels. Figure 2.6 shows four levels at which risks can be identified within an agency, and the individuals who may be responsible for the risks at each level.
Typically, agencies manage risk every day. They are well-equipped to handle risks at the project and activity levels, and regularly consider risks on a larger scale. Formally considering and documenting potential risks at all levels can bring greater attention to them and improve risk management.
Figure 2.6 Levels of Risk within an Organization
Source: TRB. 2016. NCHRP Project 08-93 Final Report. http://onlinepubs.trb.org/onlinepubs/nchrp/docs/NCHRP08-93_FullGuide.pdf
Risk Management Process
Figure 2.7 depicts a risk management process. While it may not be necessary to walk through each discrete step in this process for every risk an agency faces, this process is helpful for understanding how to incorporate risk into TAM.
- The process starts with establishing the context for risk management. In the case of risk management for a TAMP, the context is largely defined through other TAMP development steps.
- The second step involves identifying the risks that affect the assets in the TAMP. Ideally, in this step the agency considers the full set of asset-related risks, even those that may appear insignificant.
- The third step, risk analysis, involves identifying the cause of the risk, the outcomes or consequences (impact), and the likelihood of the risk occurring.
- The fourth step, risk evaluation, entails prioritizing and ranking risks.
- Fifth, addressing risks is the response the agency takes to each risk. DOTs can choose to tolerate a risk or treat it in some manner.
- The left side of the figure shows a continuous communication and consultation activity. Agencies need to communicate risks to both internal and external stakeholders throughout the process.
- The right side of the figure shows an iterative monitoring and review process. Once the risks are identified, analyzed, and a mitigation plan is in place agencies need to monitor the risks and update the risk management documentation accordingly.
- More on risk monitoring and management is discussed in Chapter 6 Monitoring and Adjustment.
- This process is generally consistent with ISO Standard 31000, as well as FHWA’s requirements for state DOTs to assess risks to NHS assets in developing a TAMP.
Figure 2.7 The Risk Management Process
Source: Adapted from FHWA. 2017. Incorporating Risk Management into Transportation Asset Management Plans: Final Document. https://www.fhwa.dot.gov/asset/pubs/incorporating_rm.pdf
It is common practice to develop a register identifying major risks and assess each based on expert judgment. In this fashion, the process is valuable for identifying “non-programmatic” risks, or risks not previously addressed in any one program. The How-To Guide in this section describes the steps in developing a risk register to identify such risks. Once a risk has been identified and assessed, formal processes may be required to perform a more detailed assessment and manage the risk programmatically, as illustrated in the Arkansas practice example.
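The register-based scoring described above can be sketched in code. The sketch below is illustrative only: the five-point likelihood and impact scales, the `Risk` structure, and the example risks are assumptions for demonstration, not part of any FHWA requirement or specific agency practice.

```python
from dataclasses import dataclass

# Illustrative five-point scales; agencies define their own.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
IMPACT = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

@dataclass
class Risk:
    description: str
    category: str       # e.g. "Asset Performance", "External Threats"
    likelihood: str     # key into LIKELIHOOD
    impact: str         # key into IMPACT

    @property
    def score(self) -> int:
        # Simple likelihood x impact product, as in a classic risk matrix.
        return LIKELIHOOD[self.likelihood] * IMPACT[self.impact]

def prioritize(register: list[Risk]) -> list[Risk]:
    """Rank risks highest score first (the risk evaluation step)."""
    return sorted(register, key=lambda r: r.score, reverse=True)

# Hypothetical register entries based on expert judgment.
register = [
    Risk("Bridge scour during flood events", "External Threats", "possible", "severe"),
    Risk("Pavement deteriorates faster than modeled", "Asset Performance", "likely", "moderate"),
    Risk("Condition data collection delayed", "Information and Decision Making", "unlikely", "minor"),
]

for r in prioritize(register):
    print(f"{r.score:>2}  {r.category}: {r.description}")
```

In practice the register would also record existing controls, the chosen response (tolerate or treat), and a responsible owner for each risk.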
As part of the process of developing its 2018 TAMP, ARDOT developed a risk register and mitigation plan compliant with FHWA TAMP requirements. As part of this effort, ARDOT first reviewed and documented its existing controls for asset-related risks incorporated in its design specifications, and approaches for addressing specific risks to bridges (e.g., scour). The agency then developed an initial register through a risk workshop. In the workshop, ARDOT staff identified specific risks not otherwise addressed programmatically, classifying risks by type:
- Asset Performance
- External Threats
- Business Operations
- Highway Safety
- Project and Program Management
- Information and Decision Making
For each risk ARDOT used expert judgment to classify the risk in terms of its likelihood and impact. An initial priority was determined based on this classification. Next, ARDOT defined potential mitigation strategies for each of the 14 high-priority asset management risks in the register. A total of 12 strategies were identified, with each helping to mitigate one or more different risks. ARDOT next prioritized the mitigation strategies, and developed mitigation and monitoring plans detailing actions to be undertaken, and the approach for monitoring the risks and updating the register moving forward.
Arkansas DOT. 2018. ArDOT Risk-Based Transportation Asset Management Plan. http://www.tamptemplate.org/wp-content/uploads/tamps/037_arkansasdot.pdf
Planning and programming, performance management, and risk management are activities that form components of the asset management framework within an agency. They are necessary to manage the infrastructure portfolio and the services it supports.
Asset management relies on good data and tools to guide investment decision-making. Indeed, many agencies have a wealth of data about their infrastructure but are challenged to leverage that information to make better decisions. Information management is the discipline that delivers foundational capabilities for asset management results. Asset management systems connect inventory and condition with analytical capabilities to predict asset condition under various funding and action scenarios. Other information and tools allow agencies to relate actions across assets and to other transportation areas, such as safety and mobility. This section provides a brief overview of information management and how it supports the implementation of the concepts discussed in this guide. More detail can be found in subsequent chapters. Each section has been crafted to illustrate how data, information and analysis can be leveraged to create better outcomes, and enable agencies to improve how they deliver services.
Data Collection Standards and Processes
Standards and processes for data collection are two important aspects of integrating asset management practices across the agency. Collecting a standard set of data elements for each asset ensures consistency and better enables analysis and reporting across assets. Standard data elements can include a unique asset identifier, a designated asset category, and an asset type. Geospatial referencing standards are also important: in order to see assets on a map and integrate them spatially, agencies need a standard way to locate them. It is also important to consider the data collection intake process. Before data is collected, agencies should determine whether the data already exists in order to prevent duplication. If the data does not exist and needs to be collected, agencies should consider how the new data will integrate with what is currently available. This ensures the data is used in the most effective way possible. Finally, an Asset Data Steward should be designated to ensure data standards and processes are followed.
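As a concrete illustration of such standards, the sketch below defines a minimal asset record with a unique identifier, standard category fields, and a geospatial reference, and enforces the standards at intake. The field names and category vocabulary are hypothetical, not drawn from any agency's actual standard.

```python
from dataclasses import dataclass

# Hypothetical controlled vocabulary; real agencies maintain their own.
ASSET_CATEGORIES = {"pavement", "bridge", "sign", "lighting", "its"}

@dataclass(frozen=True)
class AssetRecord:
    asset_id: str    # unique identifier, never reused
    category: str    # must come from the controlled vocabulary
    asset_type: str  # finer-grained type within the category
    latitude: float  # geospatial reference (WGS 84 assumed)
    longitude: float

    def __post_init__(self):
        # Validate at intake so downstream analysis can rely on the standard.
        if self.category not in ASSET_CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")
        if not (-90 <= self.latitude <= 90 and -180 <= self.longitude <= 180):
            raise ValueError("coordinates out of range")

sign = AssetRecord("SGN-000123", "sign", "overhead guide sign", 40.0, -83.0)
```

Rejecting nonconforming records at the point of collection is one simple way a data steward can keep asset data consistent across business units.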
Asset Information Across the Life Cycle
TAM integration also relies on collecting and updating asset information across the life cycle of the asset. It is important to think holistically about the asset life cycle, from the initial design phase and through future maintenance and rehabilitation activities. Technologies and processes are becoming available to extract asset information from design and as-built plans to populate inventories. Many agencies have processes in place to think holistically about assets during the project scoping and design phase.
Agencies face challenges in integrating asset information across the life cycle of the asset, because there is often a disconnect between maintenance activities, planning/ programming and the assets. For example, maintenance divisions may not know about planned projects on particular assets that have been scheduled for repairs. Better linkage between the work an agency is planning for the future, the work they are doing currently and the general condition of the assets is important to cultivate. Maturing agencies are working hard to bridge this gap. Chapter 6 provides more information on updating asset information and connecting with maintenance activities.
Common Set of Asset Management Reporting Processes
Another aspect of information management strategy that can help integrate TAM across an agency is to develop a common set of asset management reporting processes. Many agencies are successfully mapping different types of assets and making this information available on a GIS portal. Typically, these portals have different layers for each asset. This is one example of a consistent process for sharing information about assets.
As agencies seek to make cross-asset tradeoffs and scope projects considering multiple types of needs, having a common set of reporting processes and consistency across different tools becomes even more important. An example of the challenge agencies face in doing this is seen in the TAMP development process. Developing a TAMP requires information about the needs of different assets. This information must then be communicated with a common set of definitions and combined with funding information. Practitioners have to be aware of the funding and cost assumptions used in every tool before they can report numbers in the TAMP. For instance, the pavement management system might only include costs for the pavement work, whereas other planning tools might incorporate guardrail costs and other costs related to the work. Different tools might also use different assumptions for inflation. In order to bring all this information together in a TAMP, agencies need to make sure their reporting and assumptions are consistent.
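To make the inflation point concrete, the sketch below normalizes cost estimates from two hypothetical tools to a common base year before combining them in a TAMP summary. The tool outputs, dollar figures, and flat 3% inflation rate are invented for illustration.

```python
def to_base_year(cost: float, estimate_year: int, base_year: int, rate: float) -> float:
    """Convert a cost estimate to base-year dollars using a flat annual inflation rate."""
    return cost * (1 + rate) ** (base_year - estimate_year)

# Hypothetical figures: the pavement tool reports 2022 dollars and excludes
# guardrail, while the planning tool reports 2024 dollars and includes it.
pavement_only = to_base_year(10_000_000, 2022, 2024, 0.03)
guardrail = to_base_year(500_000, 2024, 2024, 0.03)
total = pavement_only + guardrail
print(round(total))  # combined need in consistent 2024 dollars
```

Without this normalization step, summing the two tools' raw outputs would understate the pavement need and mix dollar-year bases in the reported total.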
Ohio DOT (ODOT) has focused on data and information management improvements as a foundational element of their asset management program. As part of this they have strengthened their geographic information system (GIS) and linked it to over 80 data sets. The agency's Transportation Information Mapping System (TIMS) allows users to make collaborative decisions based on shared access to the same data sets.
Source: Ohio DOT. TIMS. https://gis.dot.state.oh.us/tims/
Assessing Current Practice
An assessment of current agency competency against industry-leading practice enables an agency to define a desired future performance level. It can also help identify the steps required to reach that goal.
TAM is an evolving process; ongoing improvement is an important component for a TAM program. In fact, the ISO 55001 Asset Management certification requires ongoing assessment and continual improvement.
A gap assessment process is used to understand how well an agency aligns with an established asset management framework. The gap assessment can be conducted internally or by a third party. Organizations seeking or wanting to maintain ISO certification will also undergo a formal third party audit.
The results of a gap assessment can help agencies identify changes in business processes needed to better link plans and decisions and better align to leading practice.
NCHRP Project 08-90 led to the development of a gap analysis tool, available through AASHTO and the TAM Portal. Figure 2.8 illustrates how this assessment tool is intended to be used. There are several other frameworks that can be used, including ISO 55001 and the Institute of Asset Management (IAM) self-assessment methodology. A range of gap assessment frameworks is discussed further in Table 2.1. Each framework, process or tool will enable an agency to assess current performance and, from this, identify a desired capability level.
Figure 2.8 TAM Improvement Cycle
Source: Modified from original in NCHRP Project 08-90
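The point-scale comparison such a tool performs can be sketched simply: score current and desired capability for each practice area, and the gap is the difference. The practice areas and five-point scale below are placeholders for illustration, not the actual NCHRP 08-90 content.

```python
# Hypothetical current/desired maturity scores on a 1-5 scale per practice area.
assessment = {
    "Policy and planning":   {"current": 2, "desired": 4},
    "Data and systems":      {"current": 3, "desired": 4},
    "Life cycle management": {"current": 2, "desired": 3},
    "Resource allocation":   {"current": 1, "desired": 3},
}

# Gap = desired minus current; larger gaps suggest where to focus improvement.
gaps = {area: s["desired"] - s["current"] for area, s in assessment.items()}

for area, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{area}: gap {gap}")
```

Ranking areas by gap size gives a starting point for prioritizing actions in a TAM improvement plan, though strategic importance should weigh in as well.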
In some cases, agencies also seek benchmarks that reflect how peers are performing to help them decide on the level of maturity and complexity to which they should aspire. ISO 55001 trends away from such peer benchmarking; it instead encourages agencies to check against a framework of practices and processes, and select what is best for the agency. Chapter 6 addresses benchmarking and related topics.
Actions to close gaps between desired and actual performance should be addressed within a TAM improvement or implementation plan.
Undertaking a gap assessment can form an important part of a change management process by aligning those within the agency on current performance, opportunities and targets for improvement.
Table 2.1 – Frameworks for Assessing Current Practice
| Framework | NCHRP 08-90 Gap Analysis Tool | ISO 55001 Asset Management Gap Analysis | International Infrastructure Management Manual (IIMM) | IAM Self-Assessment Methodology |
| --- | --- | --- | --- | --- |
| Background | Developed based on the tool and process created through development of the 2011 AASHTO TAM Guide. Uses a point scale for evaluating current and desired capabilities. | The most widely adopted standard for asset management globally. It is generic to accommodate many contexts, and describes a management system approach to asset management. | Recognizing that the ISO standards for asset management are very much the "what to do," the IIMM looks to provide the "how to do it." Identifies an asset maturity index (Aware, Basic, Core, Intermediate, Advanced) to identify the current and an appropriate level of asset management for each asset. | As an aid to the application of ISO 55001, the IAM updated its methodology into one that enables organizations in all sectors to measure their capabilities against the requirements of both PAS 55 and ISO 55001. |
| Assessment or Focus Areas | | | | |
| Why use this framework? | Best for an agency that wants to work explicitly within a US-defined context that adopts wider influences. Since the tool can be fully customized, an agency that wants to tailor the analysis to its particular needs will find it useful. The tool also facilitates the analysis of data, and can generate graphs and charts from the data imported into it. | Ideal for agencies that want to adopt a world-recognized approach to asset management with a well-developed asset management lexicon. It is currently the most widely recognized asset management standard internationally. | Refined over time, with many examples that illustrate successful application of concepts by organizations. Public agency focused, and largely written for the asset management practitioner responsible for civil assets. | Well recognized internationally, infrastructure agnostic, and applicable to infrastructure owners in both the private and public sectors. Many other resources have been developed alongside the framework, including training materials, reference guides and courses to upskill an agency. |
In 2016, Amtrak Engineering undertook an Asset Management Capability Assessment which bases maturity on the degree of formality and optimization of processes. The assessment uses several questions grouped into eight assessment areas, which describe operational processes necessary for asset management success. This maturity methodology is aligned with emerging guidance from the Institute of Asset Management (IAM), ISO 55001 standards, and requirements of the US FAST Act.
The assessment used a six-point scale, scoring Amtrak at the Establishing level, indicative of an agency that is actively developing asset management capabilities and establishing them to be consistent, repeatable, and well-defined.
Based on the 2016 assessment results, key challenges were identified and a series of improvement recommendations were developed and integrated into an Asset Management Improvement Roadmap.
In addition, Amtrak established a target position, driving process implementation priorities, with the intention of continuous monitoring by repeating the capabilities assessment process on an annual basis.
2016 Amtrak Asset Management Capabilities Assessment Results
Source: Amtrak Engineering 2019
Defining and Prioritizing Improvement in TAM Approaches
Agencies managing different types of assets are faced with the decision of where to prioritize advancing formal asset management. Determining where to improve the organization's effort can depend on different factors, but should always align with the organizational context and priorities.
For transportation agencies, asset management typically begins with the high-visibility, high-value assets, such as pavements and bridges. However, operating the transportation system requires a supporting cast of assets, typically referred to as ancillary assets, that include lighting structures, roadway signs, ITS assets, or even operations facilities and technology hardware components. Establishing the appropriate current management approach and desired future approach for each asset is an essential step in strategic planning for asset management, defining boundaries around the effort. Furthermore, for each type of asset, it is important to determine how broadly to define the inventory of assets, such as the decision to include only arterial roads initially or all roads in a network.
Defining Appropriate Management Approaches for Different Asset Categories
An appropriate approach to manage and monitor each asset governed by the TAM framework needs to be established. Depending on the nature of the asset and the level of risk involved, different approaches can be selected by an agency.
Structuring asset management also involves evaluating different management approaches and defining the appropriate level of maturity. There are several approaches to managing highway assets, each with different data needs, and several ways to structure and implement asset management processes. These include:
- Reactive-Based. Treatment is performed to fix a problem after it has occurred.
- Interval-Based. The asset is treated based on a time or usage basis whether it needs it or not.
- Condition-Based (Life Cycle Approach). Treatment is selected based on forecasted condition, for example when an asset is projected to cross a defined condition threshold.
Chapter 4 provides more details on these different approaches to managing assets.
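The three approaches differ mainly in what triggers a treatment. The sketch below contrasts them for a single hypothetical asset; the linear deterioration model, the 10-year interval, and the condition threshold of 60 are all invented for illustration.

```python
def condition_forecast(age_years: float) -> float:
    # Invented linear deterioration: condition 100 (new) declining to 0 at 50 years.
    return max(0.0, 100.0 - 2.0 * age_years)

def needs_treatment(approach: str, age_years: float, failed: bool = False) -> bool:
    if approach == "reactive":
        # Fix only after a problem has occurred.
        return failed
    if approach == "interval":
        # Treat on a fixed 10-year cycle whether the asset needs it or not.
        return age_years > 0 and age_years % 10 == 0
    if approach == "condition":
        # Intervene when forecast condition crosses a threshold (here, 60).
        return condition_forecast(age_years) <= 60.0
    raise ValueError(approach)

print(needs_treatment("reactive", 15))   # False: no failure reported
print(needs_treatment("interval", 20))   # True: on the 10-year cycle
print(needs_treatment("condition", 25))  # True: forecast condition 50 <= 60
```

The condition-based rule is the only one that uses a deterioration forecast, which is why it carries the heaviest data requirements of the three.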
Processes and approaches can range in their level of detail and complexity. This is what forms the foundation of some asset management maturity levels. Much like deciding on the scope of assets to manage, the level of advancement of the asset management processes an agency adopts should depend on the context and readiness of the agency, as well as the problem being addressed. Consideration should be given to the data, processes and tools available to support the asset management approaches and processes, as well as resource availability and capability. It is common for an agency to begin at a simple level and mature over time towards more complex asset management that integrates processes and decision-making.
To allocate transportation funding toward the most valuable assets and those posing the highest risk to system operation, UDOT developed a tiered system of asset management. Tiers range from one to three, with tier one representing the most extensive management approach, applied to the highest-value assets.
Tier 1. Performance-based management
- Accurate and sophisticated data collection
- Targets and measures set and tracked
- Predictive modeling and risk analysis
- Dedicated funding
Tier 2. Condition-based management
- Accurate data collection
- Condition targets
- Risk assessment primarily based on asset failure
Tier 3. Reactive management
- Risk assessment primarily based on asset failure
- General condition analysis
- Repair or replace when damaged
Source: Utah DOT. 2018. Utah TAMP. https://www.tamptemplate.org/tamp/053_utahdot/
Prioritizing TAM Improvements
Deciding on the appropriate management approach and level of asset management is a strategic decision that should consider several factors:
Organizational Strategic Goals
The decision of which assets to prioritize should be driven by the organization’s strategic goals. A desire to focus on one aspect of the transportation system over another in order to meet a larger objective can present a good reason for prioritizing some assets over others.
Financial Value

A common consideration for selecting assets to include is their financial value. Monetizing value provides a consistent way of comparing asset classes. In general, assets that are the most expensive to replace or cause the greatest financial concern for an organization fall into the highest priority. Strategic management of these assets means strategic investments over the life cycle of the asset, which will prevent or delay the need for significant additional investment, help avoid premature failure, and allow time to plan for appropriate replacement.
Data Availability

TAM as a concept is heavily dependent on data. Selecting assets to focus on based on existing data collection and management practices will often support achievement of "quick wins." Data availability does not always indicate the strategic priority or risk exposure of an asset, but can still be an important factor in selecting assets to include. The cost of collecting and analyzing data to form the basis for more advanced TAM decision making can in some instances be significant, and may require new skills and training.
It should be recognized that data does not need to be comprehensive and complete as a basis for TAM decision making. An accepted approach is to group assets into classes (age, type, function) and then inspect a sample set. This can provide important insights to guide long-term planning at minimal initial expense/time. It can also highlight any issues with particular types of assets and allow for more detailed inspections to be undertaken if required. A gap analysis to define future data requirements and determine how to collect this data should be considered for long term TAM outcomes.
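The sample-based approach described above can be sketched as follows: group assets into classes, inspect a random sample from each class, and use the sample mean as an estimate of class condition. The inventory, condition scores, and sample size below are fabricated for illustration.

```python
import random

def estimate_class_condition(assets_by_class, sample_size, seed=0):
    """Inspect a random sample per asset class and return mean sampled condition."""
    rng = random.Random(seed)  # fixed seed so the sketch is reproducible
    estimates = {}
    for asset_class, conditions in assets_by_class.items():
        n = min(sample_size, len(conditions))
        sample = rng.sample(conditions, n)  # stand-in for a field inspection
        estimates[asset_class] = sum(sample) / n
    return estimates

# Fabricated condition scores (0-100) for two classes of signs.
inventory = {
    "overhead signs": [72, 65, 80, 58, 90, 61, 77, 69],
    "ground-mounted signs": [85, 88, 90, 79, 92, 84],
}

print(estimate_class_condition(inventory, sample_size=4))
```

A class whose sampled estimate is unexpectedly low can then be flagged for complete, more detailed inspection, which is the escalation path the text describes.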
Risk of Failure
Often, it is necessary to consider including assets if the probability and consequence of failure are significant. Assets with a high risk of failure can be a high priority due to the potential losses to the agency and its stakeholders should they fail. Asset management can alleviate or prevent the impact of failure.
Asset Criticality and Network Reliability
Decisions to formally manage certain assets can be based on their importance to the service provided, such as operations, or the importance of the travel paths under consideration. Defining criticality is context specific, but is important, since user experience is based on the journey, not the specific assets. Considering criticality in selecting assets to include in TAM will ensure that the most important assets–those necessary to maintain network reliability–are managed first.
In general, the scope of TAM should be agreed to in coordination with leadership and influenced by stakeholders. Stakeholders can include asset owners, metropolitan planning organizations (MPOs), cities, tolling authorities, P3 concessionaires, federal agencies (through mandated requirements), and others. The public can also be stakeholders who influence which assets to include, especially when high-profile incidents potentially attributed to the state of good repair occur.
Aurizon is Australia’s largest freight rail operator, transporting more than 500 million tons of coal to markets including Japan, China, South Korea, India, and Taiwan, in addition to over 800 million tons of freight through an extensive network throughout the country. Aurizon Network manages the largest heavy haul rail infrastructure network in the country. The network is economically regulated by the State through a process that sets investment levels and tariffs. Asset management practice is well-entrenched in the organization, with a focus on “optimizing the life of assets, keeping a tension between investment in maintenance and capital.” The scope of the Aurizon Network asset base, known as the Regulated Asset Base includes all assets used in the provision of the rail infrastructure service. Management is informed by external engineering standards and legislative and regulatory obligations including:
- Prevention and intervention levels specified in an Asset Maintenance and Renewals Policy.
- Commitments to the Central Queensland Coal Network.
- A Safety Management System aimed to minimize safety risks.
- Network Strategic Asset Plan models which are based on asset age, predicted condition and historical and forecasted usage.
Source: Aurizon. 2019. Network: Planning and Development. https://www.aurizon.com.au/what-we-deliver/network#planning—development
Developing a TAM Implementation Plan
A TAM implementation plan can clearly communicate an agency’s next steps for TAM and define responsibilities for implementation.
The product of a gap assessment will often take the form of an implementation plan for TAM improvements. These improvements can involve changing behaviors across many business units within an organization. The actions should, therefore, be prioritized and staged to advance one step at a time. When defining actions, it is important to understand the purpose and outcome to be achieved, who is responsible, how long it will take, and what resources are required.
Note that a TAM Implementation Plan is different from a Transportation Asset Management Plan (TAMP) described further in Section 2.4. An implementation plan focuses on business process improvement, while a TAMP focuses on an organization’s assets and how it is investing in and managing them. However, the implementation plan may be included as a section of a larger TAMP.
The improvements identified need to recognize potential barriers to implementation. As an example, improving decision-making tools will likely require improvements in data practices. The implementation plan should consider any foreseeable obstacles, including staff resistance to new business procedures, lack of support from agency leadership, inadequate skills among staff, data integration issues or outdated analytical tools.
Communicating the Implementation Plan
Effective, organization-wide communication can serve as a powerful tool to facilitate smooth and swift adoption of the TAM implementation plan. At the start of implementation, communicating the future vision and benefits can help build awareness and buy-in. Throughout the duration of the implementation initiative, communication about milestones and accomplishments can help sustain or regain momentum. Additionally, as different projects are initiated, delivered and completed, agencies will want to ensure that the resulting changes in processes, systems and tools are adopted and used consistently to achieve the intended outcomes and objectives. As illustrated in Figure 2.10, the TAM communication strategy should cover six key elements – why, who, what, when, how and how well.
Objectives. Why communicate?
Establishing early buy-in to the implementation plan by providing an upfront explanation of why execution of the TAM implementation plan is needed—the anticipated benefits for the organization as well as for different stakeholder groups—will help jumpstart success of the implementation.
Stakeholders. Who delivers and receives the communication?
To make sure the right people are receiving the right information, it is key to develop and categorize a complete list of internal and external stakeholders who will be impacted by the TAM implementation plan and its resulting changes. In determining stakeholders, consider who needs to receive different types of information and who best to deliver that information to support achievement of implementation plan objectives.
Messages. What are the messages to communicate?
In developing the key messages to communicate, consider intent – what should stakeholders know, think or do as a result of the message? Key messages should promote awareness, desire and reinforcement of the implementation plan and its associated changes. They should also align with objectives of the implementation plan as well as organizational objectives.
Timing & Frequency. When will the communication occur?
Communication about the TAM implementation plan and corresponding changes should be timely, frequent enough to keep stakeholder groups well informed about approaching milestones and key dates of impact, and not so frequent that they lose value. Take into account what is being communicated and to whom, as different stakeholder groups receiving different types of messages often require different delivery frequency.
Tactics & Channels. How will information be communicated?
Depending on the duration of the TAM implementation plan and the number of associated changes, communication needs often shift over the course of its execution. Agencies should determine the most effective types of communication and delivery channels as they progress through change. By including stakeholder categories, messages and frequency as inputs when determining the most effective channels, the communications strategy remains agile, which facilitates continuous improvement.
Continuous Improvement. How well is the communications strategy working?
Assessing the effectiveness or performance of any strategy is important for achieving objectives. Including a stakeholder feedback loop into the communications strategy is one way to accomplish this. Agencies can use surveys, polls, focus groups or meetings to gather information and gauge opposition and support. This crucial feedback serves as guidance for subsequent content and can lead to changes in the communications strategy.
Figure 2.10 Communicating the Plan
Key questions to answer in communicating your implementation plan.
Clackamas County DOT
Based on its gap assessment, the Clackamas County Department of Transportation and Development established a Transportation Asset Management Strategic Plan (TAMSP), which documents its approach to implementing a comprehensive transportation asset management program over a five-year period. This TAMSP was accompanied by an asset management implementation strategy that identified the key actions to be undertaken.
Extract from Clackamas County DOT Implementation Plan
Monitoring TAM Program Improvements
Measuring TAM improvement is important for understanding if the plan needs adjustment, and to communicate success and motivate those responsible for implementation.
Once a commitment to make improvements has been made, the improvement process needs to be managed and monitored.
Regular updates, meetings, performance tracking (monitoring improvement against the selected framework) and scheduled reviews by the TAM governance groups will help provide oversight to those responsible for undertaking the improvements.
This process also helps remove roadblocks by involving leaders from across the organization.
When to Re-Assess Performance
A regular commitment to monitor progress is important. Each assessment compares progress from the initial benchmark toward the desired level of competency. There is no set recommendation for how often to assess progress; some agencies find frequent assessment most valuable in the early stages of implementation, while others do not.
When considering the timing of progress assessments, it is important to consider:
- Process checkpoints. The frequency could be aligned with reporting requirements, but should also consider appropriate points where progress will be noticeable.
- Commitment. Undertaking an assessment will take time and resources, so it is important this is balanced against progressing with implementation.
- Champions and change agents. As these individuals are critical to the overall success of TAM implementation, if they change or need to monitor their own performance, then a review of progress can help motivate and reset goals.
Measuring Performance Improvements
Monitoring performance of the asset management system and the results of improvement actions can be challenging, as the cost of service delivery, quality of service levels and risk of service failures may shift over time, and can change following the implementation of an improvement action. The IIMM suggests the following potential TAM system performance indicators:
- Financial performance
- Data management performance
- Timeliness relative to target response times
- Productivity and utilization of resources
- Skills availability relative to planned requirements
- Adherence to quality procedures
Chapter 6 provides more information on performance measures, targets, and monitoring asset performance. Self-assessment can focus on service and asset outcomes experienced by users, as well as on internal alignment with desired practices. It is important that agencies select the level and focus of self-assessment appropriate to their requirements.
New Zealand Treasury
The New Zealand Treasury stewards the NZ government’s Investment Management System to optimize value from new and existing investments and assets for current and future generations of New Zealanders. One of the tools the system uses is the Investor Confidence Rating (ICR), which illustrates the confidence that government leadership (i.e. Ministers) can have in an agency’s ability to deliver investments that produce the desired results.
The ICR also promotes and provides a pathway for capability uplift. One element of the ICR evaluates the gap between current and target asset management maturity levels on the basis that good asset management practice provides the foundation for good investment management. The Treasury recommends periodic self-assessments using a methodology based on international asset management guidelines and the ISO 55001 standard.
The ICR assessment is conducted every three years; agencies that obtain a good rating gain more decision-making autonomy and potential flexibility over investment assurance arrangements.
Adapted from New Zealand Treasury. Investor Confidence Rating (ICR).
Beyond the Basic TAMP
This section contains suggestions for developing a TAMP that goes beyond the basic elements described in the previous section. An agency can expand the scope of the TAMP to include additional asset types and systems, and may further tailor its TAMP to address specific needs.
A highway agency focused on complying with Federal requirements will typically focus on including its NHS pavements and bridges in its TAMP. While these assets typically make up the greatest portion of a state highway agency’s asset value, an agency may wish to include additional assets in its TAMP, or to extend the network scope of the TAMP. In updating a TAMP covering NHS pavements and bridges, an agency may include other assets, such as drainage assets and traffic and safety features, or it may include all of the assets it owns.
For transit TAMPs, the initial focus is on revenue vehicles, facilities and infrastructure, as these are the assets that require the greatest investment. An agency may wish to expand its TAMP to include additional assets that are important to the system, albeit less costly, such as bus shelters and signage.
TAM Implementation Plan
As described in Section 2.3, it is often helpful to prepare an implementation plan describing a set of planned business process improvements that an agency intends to undertake to strengthen its approach to TAM. There are many examples of TAMPs that focus specifically on an agency’s TAM approach and how it plans to improve its approach. Ideally a TAMP should both describe an agency’s assets and planned investments, and detail how it intends to improve its TAM approach. Where an agency has developed both a TAMP and TAM implementation plan, the implementation plan can be incorporated as a section of the TAMP.
TAM-Related Business Processes
An agency may wish to include a discussion of one or more of the business processes related to TAM in its TAMP. Alternatively, there may be other agency documents that provide more detail on these issues that can be referenced in the TAMP. These areas include:
- Performance Targets. As described in Chapter 5, setting performance targets can help guide the resource allocation process. However, agencies often have broader efforts to establish and track performance beyond the scope of TAM.
- Financial Planning. While developing a TAM investment plan is central to developing a TAMP, often the revenue forecast used to support developing the investment plan is developed separately and used for other purposes beyond the scope of TAM. It may be valuable to document the agency’s approach to forecasting future revenues for TAM and other applications. Chapter 5 provides additional detail on this topic.
- Work Planning and Delivery. As described in Chapters 4 and 5, work delivery approaches can impact how assets are maintained over their life cycle, and how resource allocation decisions are made. Some agencies have adopted formalized approaches for evaluating and selecting different work delivery approaches.
- Data Management. Chapter 7 discusses the importance of implementing an approach to data management and governance. Some TAMPs include additional information on this topic given its relationship to TAM.
AASHTO TAMP Builder
The AASHTO TAMP Builder website (available at https://www.tamptemplate.org/) hosts annotated plan outlines to assist agencies in preparing TAMPs. The site also provides resources to customize an outline to meet agency-specific objectives and requirements. The website includes a database of TAMPs, dating from 2005, that supports the outlines created using the site.
Defining Asset Service and Performance Levels
Before asset performance can be managed, an agency must first define what it is seeking to achieve. In TAM, asset performance is most commonly defined in terms of asset condition or maintenance level of service. Performance may also be evaluated in terms of safety, availability, reliability, resiliency and other service attributes. Regardless of the method used to monitor performance, it should be used to inform analysis that supports decisions to help ensure that investments enable an agency to achieve its goals cost-effectively.
Establishing Desired Levels of Service
Before a whole-life strategy can be developed and implemented, an agency must determine what it seeks to achieve. In many transportation agencies, the desired level of service (or asset management organizational objectives, in ISO 55000 terminology) provides the linkage between the goals of an agency and the investments and interventions that should take priority when managing assets. High-level goals should directly influence investment choices when resource allocation decisions are made. Service levels help establish when a gap needs to be closed to achieve a goal and merits investment. Chapter 2 discusses ways to create linkages between goals and investment decision making.
When managing the life cycle of existing assets, performance targets are commonly established as a way to manage service levels for the transportation network. How the expected level of performance is determined may vary depending on the type of asset being managed. Level of service targets that are part of a performance framework are typically a mixture of customer-focused performance measures and technical service measures that help those responsible for the asset assess what types of interventions might be required and when. Customer-focused service measures are important to road users and other stakeholders that require mobility. Travel time reliability, safety, load capacity and clearances, and lane availability are all examples of customer-focused service targets. Condition, strength, and regulatory compliance are examples of technical service attributes, which are commonly of greater interest to asset stewards than to asset users. Both types of service level targets are important for evaluating the efficacy, effectiveness and efficiency of a transportation system.
For pavements and bridges, and other assets managed using a condition-based approach, asset condition is commonly used to establish expected technical levels of performance, but it is also relevant to customers. For example, condition is employed as a proxy for pavements because it is objectively measurable and its deterioration is somewhat predictable. It is a valuable service attribute because user experience is often directly connected to condition: potholes, rutting and roughness all reduce the quality of service a pavement provides. Performance measures, such as those discussed in Chapter 6, are used to establish the desired long-term performance and to set short-term targets that can be used to track progress towards the long-term objectives. For other highway assets, including those managed using interval- or time-based maintenance approaches, performance may be linked to the expected service life, the ability of the asset to fulfill its intended function, and/or other operational factors. For these other highway assets, performance targets are often established as part of a Maintenance Quality Assurance (MQA) program in terms of desired maintenance levels of service (MLOS) and integrated with operational service targets that may also be customer focused.
Risk can also be used as a measure of performance. As described in Chapter 2, risk considers both the likelihood and the consequence of failure. This can be particularly useful when the potential consequences of failure impact other assets or facilities. An example of how Colorado uses risk to manage rockfalls is included in Section 4.3 of this chapter. Additional detail on how to track risk-based performance measures is included in Chapter 6.
Establishing a desired level of performance is typically a collaborative process that considers existing conditions, available funding, expected demands on the system, policy goals and guidance, and stakeholder priorities. The desired level of performance is typically established once baseline data is available, so performance trends can be evaluated. The desired level of performance may be adjusted over time to reflect changes in agency performance, changes in asset condition, capacity, safety, resiliency and other factors.
Three types of service expectations are often used in combination to manage asset performance:
- Performance target – the level of performance beyond which additional performance gains are not desired or worth the additional cost. When performance is measured based on condition, the target may describe the desired state of good repair. There may be a specific time frame within which the target is expected to be achieved.
- Current performance – the level of performance currently achieved by the organization, usually reported relative to the desired target. Target setting is described in more detail in Chapter 5.
- Minimum acceptable performance – the lowest level of performance allowed for the asset or asset class to still function as designed.
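As a simple illustration, the three expectation types above can be combined to classify a measured performance value. The function and threshold values below are hypothetical examples for demonstration, not prescribed standards.

```python
# Illustrative sketch: classifying measured performance against a target
# and a minimum acceptable level (higher values = better performance).
# All names and numbers are hypothetical examples.

def assess_performance(current, target, minimum_acceptable):
    """Classify a measured performance value against service expectations."""
    if current >= target:
        return "at or above target"        # desired state of good repair achieved
    if current >= minimum_acceptable:
        return "below target"              # a gap exists; plan improvement
    return "below minimum acceptable"      # asset no longer functions as designed

# Example: percent of interstate pavement in good condition
status = assess_performance(current=82.0, target=90.0, minimum_acceptable=70.0)
```

A classification like this can be rolled up by asset class or corridor to flag where the gap between current and target performance merits investment.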
Performance expectations may be set for the road network, a road corridor, individual assets, or a group of assets. Commonly, performance expectations are set for a combination of asset class, subclass, or subnetwork, such as:
- Key network corridors.
- Bridges on the National Highway System.
- Interstate pavements.
- Culverts larger than 10 feet in diameter.
- Traffic signals serving more than 10,000 vehicles per day.
The nature of performance expectations can be strategic, tactical, or operational. Strategic expectations support freight movement; for example, the long-term goal of providing unrestricted flow of legal loads is supported by a performance expectation of no load-posted or restricted bridges on interstate highways. This expectation cannot be accomplished without the tactical delivery of work to address factors contributing to the physical condition of bridges. Thus, an agency may include tactical expectations to perform maintenance and repair on structural members on a routine basis or as conditions warrant. These enhancements can also be integrated with renewal and other rehabilitation interventions to help improve tactical performance metrics as well as achieve higher-level goals and objectives. Operational improvements, such as more responsive snow clearance and better signage, are further treatment options that can be integrated to achieve the strategic objective.
Life cycle management analysis, and the decisions it supports, require service levels, performance targets and other objectives in order to determine the optimal choices for agencies during resource allocation. Over an asset life cycle, a range of interventions is possible, from reactive, routine and preventive maintenance to large investments associated with renewal, replacement, or removal. Having targets helps select the right interventions and investment options while balancing risk, service and cost.
Connecting performance measures to higher level strategic goals also supports an agency’s ability to communicate how technical measures relate to system performance as experienced by highway users and other external stakeholders, thus tying asset management outcomes to system performance. Asset management measures are often very technical. Performance indicators like bridge ratings, pavement distress measurements, and risk ratings are not commonly understood by those outside transportation agencies. However, agencies can use these technical measures to support the performance indicators that are more commonly understood and prioritized by system users and external stakeholders. Communicating system performance and the status of the road network is discussed in Chapter 2, and is illustrated in several examples below. Customer service level targets are often established for this purpose, and give users an ability to understand the quality of service they should expect on the transportation system.
Each year, the Colorado DOT must report to its legislature on the statewide highway infrastructure and the agency’s ability to meet those needs with available resources. This requirement is met through the Annual Infrastructure Deficit Report, which addresses pavements, bridges, and annual maintenance. The agency supports the annual maintenance portion of this report with its Maintenance Level of Service Measure, which rates the delivery of services in nine program areas in terms of a letter grade from A to D and F. The agency has used historic data to develop deterioration rates for each service area that estimate the resources needed to improve the maintenance level of service by a given amount over a specific time period. These estimates are summarized in the Report, which is in turn used by the Legislature and the DOT to establish the annual maintenance budget. The figure provides an example of information on MLOS in the 2016 Report. Once the targeted MLOS is established, maintenance funding can be allocated to ensure that agency priorities are met.
Colorado DOT Example of Funding Needed to Support Maintenance Levels of Service
Source: Colorado DOT. 2016. https://leg.colorado.gov/sites/default/files/cdot_smart_2017_presentation.1.pdf
Washington State DOT
When seeking to establish the connection between investments and performance across a wide range of assets or roadway attributes such as litter, vegetation height, drainage, or functionality it is helpful to relate all of the various measures of performance to a common rating scale. Washington State DOT has developed its Maintenance Accountability Process to establish the relationship between maintenance level of effort and the resulting level of service. The process rates conditions and services in seven areas using a common letter-grade system, or MLOS.
- Roadway Maintenance & Operations.
- Drainage Maintenance & Slope Repair.
- Roadside and Vegetation Management.
- Bridge & Urban Tunnel Maintenance and Operations.
- Snow & Ice Control Operations.
- Traffic Control Maintenance & Operations.
- Rest Area Operations.
Each group of services or conditions includes several performance measures, which are translated to the MLOS grades of “A” (highest performance), “B”, “C” (adequate performance), “D” or “F” (unacceptable performance). Applying the MLOS grades allows for a consistent means of rating performance across services and geographic regions. Letter grades can also be represented in photographs of facilities that meet the criteria for each condition state to support communications with stakeholder groups. The MLOS are outcome-based measures that allow the agency to predict the expected level of service that can be achieved based on anticipated budget and work planning decisions. By tracking maintenance expenditures and MLOS results annually, Washington State DOT is able to adjust its maintenance priorities and budgets to address system needs and stakeholder wants.
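As an illustration of how a common letter-grade scale might work, the sketch below maps a numeric level-of-service score to an MLOS grade. The breakpoints and service names are hypothetical examples, not WSDOT's actual criteria.

```python
# Hypothetical sketch of translating a numeric level-of-service score
# (0-100) to an MLOS letter grade. Breakpoints are illustrative only;
# each agency calibrates its own scale.

GRADE_BREAKPOINTS = [(90, "A"), (80, "B"), (70, "C"), (60, "D")]

def mlos_grade(score):
    """Return the letter grade for a numeric score; below all cutoffs is F."""
    for cutoff, grade in GRADE_BREAKPOINTS:
        if score >= cutoff:
            return grade
    return "F"  # unacceptable performance

# A common scale lets dissimilar services be compared and rolled up
# consistently across geographic regions.
scores = {"Drainage": 88, "Vegetation": 72, "Snow & Ice": 55}
grades = {service: mlos_grade(s) for service, s in scores.items()}
```

Because every service area lands on the same A-F scale, results can be compared across regions and tracked year over year against budget decisions.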
The New Zealand Local Government Act requires councils to consult with their communities on their long-term plans. The consultation plan provides an effective basis for public participation in infrastructure decision-making associated with the long-term plan. It includes a fair representation of overall objectives, and of how tax levels, debt, and levels of service might be affected by the intended plan, and can be readily understood by interested or affected people. The Auditor-General recently reviewed plans produced by councils across the country. Key findings highlighted aspects that help define good practice:
- Consultation documents present their information in a concise, readable and understandable way.
- Clear and unambiguous explanations on why proposed taxation and debt increases and significant changes in plans or intentions were considered “affordable” or “equitable” make consultation documents more effective.
- Some communities used a road-trip analogy throughout the document. The analogy makes technical subjects relatable without over-simplifying the issues.
- Some used a personalized approach that connected with people. For example, one council uses two primary school children, Maia and Xander, who are pitched as the “champions of the Long Term Plan 2018-2038.”
By focusing on the inclusion of transportation customers, New Zealand municipalities are better able to address customer needs, inform customers of the actions they are taking, and refine work planning practices to address concerns critical to infrastructure operations and customer expectations.
Consideration of Risk in Resource Allocation
Uncertainty and risk complicate the resource allocation decision-making process. Risk management activities, including developing a risk register, are helpful in understanding and mitigating uncertainty, which in turn has implications for resource allocation.
All transportation decision-makers must contend with uncertainty. With regard to resource allocation, uncertainty is inherent in variables such as data on asset conditions and performance, future funding levels and costs, how a transportation system and specific assets will perform, and what external events or other factors may require reallocating resources. This uncertainty complicates efforts to make decisions about the future and forces agencies to be nimble so as to respond effectively to unpredictable events and evolving conditions.
In recent years, transportation and other industries have made significant progress developing improved approaches for managing uncertainty to minimize negative and leverage positive impacts. An area of focus in transportation has been managing the risk of project cost and schedule overruns; a number of agencies have established enterprise risk management programs to address risk and uncertainty across their organizations. Likewise in TAM, there is increased interest in identifying and assessing risk, both as a best practice and to comply with the FHWA requirement for state DOTs to consider risk in developing their NHS TAMPs.
The word ‘risk’ can be very context specific, meaning very different things depending on the industry and application. For instance, a financial analyst is primarily concerned with uncertainty in financial returns and the risk of incurring a significant financial loss. In the nuclear power industry, however, the focus of managing risk is on minimizing the potential for catastrophic loss that might occur from damage to a nuclear facility. As discussed in Chapter 2, in this guide risk is defined as the “effect of uncertainty on objectives” consistent with the ISO definition. This definition captures the full range of applications of risk management, and acknowledges the possibility for both positive and negative consequences of uncertainty.
The term ‘risk management’ is used to capture the set of business processes associated with identifying and managing uncertainty and risk. The overall risk management process is described in Chapter 2. The remainder of this section describes how this process relates to resource allocation.
Implications for Resource Allocation
While the scope of risk management may be very broad, an organization’s approach to risk management and the outcomes resulting from a risk assessment may nonetheless have important implications for TAM resource allocation. Consequently, it is important to establish a risk management approach and integrate consideration of risk with the resource allocation process.
Specific possible implications of risk management on resource allocation may include, but are not limited to:
- An organization may identify through its risk management approach areas where better data or improved processes are needed to best address a given risk, in turn impacting the resource allocation process. For instance, if uncertainty concerning future asset conditions is found to be a significant risk, this may result in efforts to improve the deterioration models in an agency’s asset management systems and/or motivate data collection improvements to reduce uncertainty.
- An organization may identify specific investments of staff time and/or agency funds required to mitigate negative or leverage positive risk. Once specific investments are identified, they can be assessed along with investments in other asset/investment categories. For example, Caltrans defined a separate program for seismic retrofits as described in the Practice Example.
- If an agency’s allocation of resources hinges on uncertain future values for one or more parameters, it may be necessary to incorporate consideration of uncertainty formally in the decision-making process. This can be accomplished using Monte Carlo simulation or other quantitative approaches to establish the predicted distribution of outcomes. For instance, in performing a life cycle cost analysis to select between project alternatives for a given facility, Monte Carlo simulation can calculate the range of life cycle costs predicted depending on future values for cost escalation, deterioration, or other parameters.
- In approaching formal accounting for uncertainty, an organization may define different scenarios representing the possible range of outcomes and then determine how best to allocate resources in each scenario before establishing a preferred resource allocation approach. For example, if an agency’s future capital budget is unknown, a decision-maker may wish to define a high, medium and low budget scenario and determine what investments would be made in each scenario in order to most effectively prioritize given uncertainty. Likewise, a scenario analysis approach can be useful in assessing how to allocate resources for improving infrastructure resilience given uncertainty concerning future sea level rise. Typically, the decision maker will review results for different scenarios and make a subjective determination of how to allocate resources considering the relevant factors. The Practice Example describing the analysis of harbor-wide barrier systems for the City of Boston shows one such approach. Recent research in the area of Robust Decision Making (RDM) has focused on developing quantitative approaches to select optimal investments between different scenarios.
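The Monte Carlo approach described above can be sketched as follows for a life cycle cost analysis. The cost figures, parameter ranges and uniform distributions are illustrative assumptions only, not recommended values.

```python
# Illustrative Monte Carlo sketch of life cycle cost analysis under
# uncertainty. All dollar amounts, ranges and distributions are
# hypothetical assumptions for demonstration.
import random

def lifecycle_cost(initial_cost, annual_maint, years, discount_rate):
    """Present value of initial construction plus discounted annual maintenance."""
    pv = initial_cost
    for t in range(1, years + 1):
        pv += annual_maint / (1 + discount_rate) ** t
    return pv

def simulate(n_trials=10_000, seed=42):
    rng = random.Random(seed)
    costs = []
    for _ in range(n_trials):
        # Sample the uncertain inputs for this trial.
        initial = 1_000_000 * rng.uniform(0.9, 1.3)   # construction cost escalation
        maint = 25_000 * rng.uniform(0.8, 1.5)        # maintenance cost uncertainty
        rate = rng.uniform(0.03, 0.07)                # discount rate uncertainty
        costs.append(lifecycle_cost(initial, maint, 30, rate))
    costs.sort()
    # Report the predicted distribution rather than a single point estimate.
    return {"p10": costs[n_trials // 10],
            "median": costs[n_trials // 2],
            "p90": costs[9 * n_trials // 10]}
```

Reporting a range (e.g., 10th to 90th percentile) rather than a single number lets decision-makers compare project alternatives on both expected cost and exposure to cost risk.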
Caltrans initiated its Seismic Safety Retrofit Program in the wake of the bridge failures experienced in the 1989 Loma Prieta Earthquake. Through this program Caltrans evaluated the retrofit needs of all of the more than 12,400 bridges on the State Highway System (SHS). Retrofit needs were prioritized using a multi-attribute procedure that calculated a score for each bridge considering the likelihood of an earthquake at the bridge site, the vulnerability of the bridge to collapse in the event of an earthquake, and the impact of a collapse, considering the traffic using the bridge and the detour distance in the event of a collapse. Through 2014 the program resulted in the retrofit of 2,202 state highway bridges at a cost of over $12.2 billion.
2018 Caltrans TAMP
Practical Lessons from the Loma Prieta Earthquake (1994), p. 174-180. https://www.nap.edu/catalog/2269/practical-lessons-from-the-loma-prieta-earthquake
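A simplified version of this kind of multi-attribute prioritization can be sketched as follows. The weighting scheme, normalization bounds, and 0-100 scale are hypothetical stand-ins to show the structure of such a score, not Caltrans' actual procedure.

```python
def seismic_priority_score(hazard, vulnerability, adt, detour_miles,
                           adt_weight=0.7, detour_weight=0.3,
                           max_adt=150_000, max_detour=50):
    """Illustrative multi-attribute priority score on a 0-100 scale.

    hazard        -- 0-1 rating of earthquake likelihood at the site
    vulnerability -- 0-1 rating of the bridge's collapse vulnerability
    adt           -- average daily traffic using the bridge
    detour_miles  -- detour length if the bridge were lost

    The impact term blends traffic volume and detour length; the
    weights and normalization caps are assumed, not agency values."""
    impact = (adt_weight * min(adt / max_adt, 1.0)
              + detour_weight * min(detour_miles / max_detour, 1.0))
    return 100 * hazard * vulnerability * impact
```

Holding hazard and vulnerability fixed, a bridge carrying more traffic or requiring a longer detour scores higher and would be retrofitted sooner under this scheme.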
University of Massachusetts
The Sustainable Solutions Lab at the University of Massachusetts Boston used a scenario-based approach to analyze the feasibility and potential risk reduction of Boston Harbor barrier systems to protect the Boston area from future flooding due to sea level rise. The report included an economic analysis in which costs and benefits were predicted for 32 scenarios considering:
- Two barrier system alternatives
- Two construction time scenarios
- Two scenarios for effectiveness of “shore-based solutions”
- Low and high construction cost estimates
- Discount rates of 3% and 7%
The analysis indicated that the benefits of the proposed barrier system would exceed its cost for both systems evaluated, but only in the case that one assumed a low discount rate, an accelerated construction schedule, and failure of other shore-based solutions for mitigating sea level rise. Also, the analysis indicated that beyond a certain point sea level rise would be such that a barrier system would no longer prove effective (since the barrier would have to be closed at all times rather than only during flood events). The report further predicted costs and benefits for two alternative scenarios involving incremental adoption of a variety of shore-based mitigation approaches, and recommended an initial focus on shore-based adaptation as the most promising strategy for the City of Boston to address sea level rise.
Regional Municipality of Peel
The Region of Peel, located just west of Toronto, is the second largest municipality in Ontario and encompasses two cities and a town. Peel assesses needs and priorities across a diverse portfolio of infrastructure that supports a variety of programs and services, including an arterial road network, solid waste management, water and wastewater treatment and distribution, and a variety of social, health and emergency services. To enable an optimized investment methodology and prioritize needs across diverse infrastructure, the Region integrated a number of inputs, including its Risk Management, Level of Service, and Lifecycle Management strategies, as illustrated in the figure. The integration of these three strategies was made possible through three enablers and by working with all of the programs and services to model their infrastructure:
- Establishing a consistent approach to quantifying risk – The Region evaluates the degree of risk that is currently being accepted associated with delivering service levels. Inherent risk (similar to asset criticality) and residual risk (the Region’s risk objective) are established and the current level of risk that an asset presents to service delivery is also determined. The gap between current and residual risk represents the unmet funding and asset needs.
- Establishing a normalized method to determine the current level of service to assist the cross-asset funding allocation task. The adopted normalized indicator is: LOS = % of assets meeting LOS + (% of assets not meeting LOS × average condition of assets not meeting LOS).
- Adopting a direct relationship between LOS and risk that allows for an analysis of alternative investment scenarios, and modeling techniques to optimize investment allocation. It also allows annual infrastructure evaluation based on the most current condition information and annual Asset Management Reporting.
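The normalized LOS indicator in the second bullet can be expressed directly in code; a minimal sketch, assuming both percentages and condition are expressed as 0-1 fractions:

```python
def normalized_los(pct_meeting, avg_condition_not_meeting):
    """Peel's normalized LOS indicator, as described above.

    pct_meeting               -- fraction (0-1) of assets meeting target LOS
    avg_condition_not_meeting -- average condition (0-1) of the remainder

    Assets that miss the target still contribute partial credit in
    proportion to their condition, so the indicator degrades smoothly
    rather than dropping to zero for a near-miss."""
    pct_not_meeting = 1.0 - pct_meeting
    return pct_meeting + pct_not_meeting * avg_condition_not_meeting
```

For example, with 80% of assets meeting the target and the remainder at an average condition of 0.5, the indicator is 0.8 + 0.2 × 0.5 = 0.9.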
Peel’s risk-based approach to asset management is integrated with the Region’s Strategic Plan and the Long-Term Financial Planning Strategy, and supports the desired service outcomes by evaluating risk against the Council approved asset levels of service. This approach provides senior decision-makers an objective way to consider resource allocation alternatives and communicate in a common language when evaluating between service areas and different asset portfolios.
Peel Enterprise Asset Management Plan. 2019. http://www.peelregion.ca/council/agendas/2016/2016-04-07-arc-agenda.pdf. http://www.peelregion.ca/finance/_media/2019-enterprise-asset-management-plan.pdf
TAM Work Planning and Delivery
The approach used to deliver work can have a major impact on what investments an organization makes, the resources required to perform work, and work timing. Transportation agencies have many options for performing work, including using internal forces and/or a variety of different contracting approaches.
Typically, U.S. transportation agencies perform some or most of their maintenance work internally, and contract out a large portion – if not all – of their capital projects. The line between the types of work performed as maintenance and as capital projects varies by organization and is often blurred. Agencies can often use maintenance forces in a flexible manner to perform a wide variety of activities, including preservation activities on pavements, bridges and other assets. However, in the near term, an organization’s maintenance resources – staff and equipment, in particular – are fixed. Consequently, the asset owner is challenged to optimize use of these resources to meet immediate needs, such as winter maintenance and incident response, while performing additional work to improve asset conditions wherever possible.
The ability to contract out maintenance work, such as through Indefinite Delivery/Indefinite Quantity (IDIQ) contracts, provides an agency with flexibility in meeting near-term needs. Other approaches for contracting out maintenance work include the use of portfolio or program management contracts in which certain operations and maintenance responsibilities for some group of assets are delegated to a contractor over a specified period of time. Section 4.3.3 provides additional details on considerations involved in outsourcing asset maintenance.
Regarding contracting approaches for capital projects, in the U.S., most transportation agencies rely on the Design-Bid-Build (DBB) model for delivering their capital programs. With this approach, the project owner designs a project (or contracts for a private sector firm to prepare a design) and solicits bids for project construction following completion of the design. This provides the project owner with control over the process, but can be time consuming and can result in cases where bids for project construction exceed the expected cost developed during design. In recent years, many transportation agencies in the U.S. and abroad have explored improved approaches to work planning and delivery to accelerate completion of needed work, leverage alternative financing approaches and transfer program and project risk.
These approaches are intended to reduce the time from a project’s initial conception to its completion and, in many cases, to transfer risks associated with project completion from the public sector to the private sector. Major trends in this area include:
- Group work together by geographic location or type of work to develop fewer, larger, and more easily contracted projects
- Use Design-Build (DB), Design-Build-Finance-Operate-Maintain (DBFOM) and other contracting strategies, wherein a single contract is awarded to design and complete a project, as opposed to separate contracts for design and construction
- Encourage development of Alternative Technical Concepts (ATCs), wherein a contractor proposes an alternative approach to meeting a contract requirement in the bidding phase
- Select contractors earlier in program/project development through use of Construction Manager-General Contractor (CM-GC) arrangements, where a contractor is selected as Construction Manager while design is still underway
- Use IDIQ contracts and other flexible contracts to provide a more efficient mechanism for performing smaller projects
- Incorporate performance-based specifications, time-based incentives and other specifications in contracts to improve project outcomes
- Outsource operations and maintenance of an asset using program or portfolio management contracts.
Both in the U.S. and abroad there are many examples of public agencies making extensive use of alternative contracting strategies, such as Public-Private Partnerships (P3s) and performance-based contracts to speed project delivery and transfer risk.
While alternative strategies for work planning and delivery hold great promise, all of the approaches described here have advantages and disadvantages and carry their own risks. Use of alternative approaches can save taxpayers money and provide improvements more quickly than a traditional model. Success stories typically result from improving the efficiency of the process and incentivizing the use of better technology and methods, but there are also many cautionary examples in which these strategies have failed to achieve cost savings, time savings or risk transfers as desired. Asset owners should consult the separate body of research in this area (referenced at the end of this section) when exploring the use of alternative approaches and carefully weigh the expected return, advantages and disadvantages of whatever delivery approaches they consider.
Selecting and Using Performance Measures
This section discusses the importance of using performance data to make decisions. It highlights the role of performance measures and identifies how they are used to establish achievable performance targets. A more detailed discussion of Transportation Performance Management can be found in Chapter 2.
Performance Management Framework
As discussed in Chapter 2, transportation agencies have embraced the use of performance data to drive investment decisions. A performance-based management approach enables agencies to select and deliver the most effective set of projects for achieving strategic objectives, while also improving internal and external transparency and accountability.
A typical performance management framework includes:
- A clear idea of the agency’s strategic objectives.
- The use of performance measures to assess performance.
- Methods to evaluate and monitor performance results.
- The evaluation of factors with capacity to improve long-term performance.
- The allocation of funding to achieve agency objectives.
- Ongoing processes to monitor and report progress.
A fundamental component of the framework is the use of performance measures to evaluate system performance and the importance of establishing business processes to evaluate, monitor, and use the data to influence agency decisions. These are achieved by aligning decisions at all levels of the organization with the agency’s strategic objectives and ensuring that the right performance measures are being used to drive decisions. This alignment helps to ensure that resource allocation decisions and the day-to-day activities of agency personnel support the agency’s priorities and the interests of external stakeholders.
The existence of a regular, ongoing process to monitor and report results is critical to identifying and implementing improvements to system performance or to furthering the effectiveness of the performance management process. The continual monitoring and update of a performance management framework is reflected in Figure 6.1, which illustrates inputs to performance targets and how ongoing monitoring and adjustments are fed back into the framework to adjust future targets. The surveys conducted regularly to support a pavement, bridge or maintenance management system are examples of the types of performance monitoring activities fundamental to an effective performance management organization.
Agencies with a performance management framework in place have benefited from:
- Maintaining a clear and unified focus for making agency decisions based on agency priorities, public input and available resources.
- Using available funding more effectively to preserve or improve system performance while lowering life cycle costs.
- Allocating available resources based on analysis of past performance and expected conditions to address areas most in need of attention.
- Having the data to confidently defend funding requests or explain the impact of reduced budgets.
- Building a transparent and accountable organization by communicating the basis for making resource decisions.
- Meeting legislative requirements.
In 2001, during the development of a long-range transportation plan (LRTP), the Arizona DOT took a strategic approach to how investments should be made. Under the new approach, Arizona DOT established the following three investment categories:
- Preservation, including activities that preserve existing transportation infrastructure.
- Modernization, including improvements that upgrade the efficiency, functionality, and safety without adding capacity.
- Expansion, including improvements that add transportation capacity by adding new facilities or services.
To implement the new initiative, the Arizona DOT developed a report titled “Linking the Long-Range Transportation Plan and Construction Program,” or “P2P Link,” that applied financial constraints to the long-term vision. Through a collaborative process that involved a consultant, local and regional governments, and transit agencies, the Arizona DOT published an implementation plan for putting the P2P Link into practice. The resulting process includes scoring projects based on both a technical and a policy score that are added together to determine a project’s ranking. The technical score is generated by the asset owner based on an analysis of the data while the policy score is determined based on each project’s contribution to LRTP goals and performance measures. The process helps to ensure that projects are ranked in accordance with the agency’s strategic objectives using only the most meaningful criteria in a transparent and defensible way.
Arizona DOT’s Link Between Strategic Objectives and Investment Decisions
Source: ADOT. 2014. Linking the Long-Range Plan and Construction Program P2P Link Methodologies & Implementation Plan.
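The combined ranking step of the P2P Link process (technical score plus policy score) amounts to a simple sort; a minimal sketch, with hypothetical project names and field names:

```python
def rank_projects(projects):
    """Rank candidate projects by combined score, highest first.

    Each project carries a technical score (generated from asset data)
    and a policy score (contribution to LRTP goals); the field names
    and point scales here are assumptions for illustration."""
    return sorted(projects,
                  key=lambda p: p["technical"] + p["policy"],
                  reverse=True)

candidates = [
    {"id": "Resurfacing project", "technical": 60, "policy": 20},
    {"id": "Bridge rehab project", "technical": 50, "policy": 40},
]
ranked = rank_projects(candidates)
# The bridge rehab (combined 90) outranks the resurfacing (combined 80),
# even though its technical score alone is lower.
```

The design choice worth noting is that a project strong on policy alignment can outrank one with a higher technical score, which is exactly how the P2P Link ties project selection back to LRTP goals.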
Performance measures are used within a performance management framework to allocate resources and provide feedback on the effectiveness of the activities in achieving overall objectives. Performance measures are indicators used for evaluating strategies and tracking progress. A performance measure can be an indication of asset condition, such as a pavement condition rating, or an indication of an operational characteristic, such as the annual number of fatalities on a facility.
The most effective performance measures drive decisions that are important to the success of the program. For example, maintenance departments may use performance measures that track actual expenditures to planned expenditures to ensure that available funding is directed towards the highest-priority items, as shown in the Colorado DOT practice example.
It is also important that the measures drive the desired performance within an organization. For instance, a performance requirement that measures whether pavement or bridge designs are submitted on time might cause incomplete or incorrect submittals to meet a deadline, leading to an increase in construction modifications. A more effective measure might focus on a minimal number of design modifications during the construction phase of a project.
Effective performance measures should also primarily be outcome-based rather than output-based, meaning that they focus on the result or impact of an activity rather than the inputs that went into the activity. Several examples of outcome- and output-based measures are shown in the sidebar on Page 6-8. Outcome-based measures are generally preferred because they indicate the effect on the traveling public resulting from the actions taken, so they usually relate to user priorities such as the length of time for a road to be cleared after a snow event or the absence of litter and graffiti. They are developed based on a description of what an agency wants to achieve as a result of the actions undertaken. Outcome-based measures are commonly used for managing ancillary assets such as drainage assets and signs. For instance, the performance of drainage assets might be reported in terms of the percent of pipes/culverts greater than 50 percent filled or otherwise deficient and the performance of signs might be reported in terms of the percent of signs viewable at night.
Output-based measures, on the other hand, track the resources used to achieve the outcome, such as the number of hours of labor used or the number of light-bulbs changed in a month. While the data is important information for managing resources, it does not necessarily drive outcomes that would matter to the public. For instance, travelers on a highway are much more interested in knowing when the road will be cleared of snow than how much overtime went into the operation.
When possible, agencies should use performance measures that are leading measures rather than lagging measures to influence future decisions. A leading measure uses changes in performance to provide insights into potential changes that might influence a future decision one way or another. For example, knowledge that a ramp meter has exceeded the manufacturer’s suggested service life might drive a decision to replace that meter. Similarly, increases in equipment downtime might indicate risks due to an aging fleet are growing or that planned operational activities will not be performed as planned. A lagging measure, on the other hand, looks back on the results of past investment strategies after the decisions have been made. Because a lagging measure is recorded after the fact, there is a delay (lag) in the agency’s ability to adjust its practices and improve performance. Bridge and pavement condition measures are examples of lagging measures because the reported conditions reflect the impact of decisions made several years in the past. Lagging measures are commonly used to evaluate a program’s effectiveness or to verify that actual investments achieved projected results.
In transportation, an agency might have a lagging measure for tracking complaints responded to within a 48-hour window. The measure provides an indication of the public’s satisfaction with the road network and is easy to monitor and report. However, if an agency really wants to effect change, it might develop leading measures to track the percent of complaints not worked on within a two-hour window or the percent of complaints that can’t be resolved by the initial point of contact and must be passed to someone else. Focusing on these types of measures could drive agency decisions to ensure complaints are being worked on quickly and are being assigned to the right people. General characteristics of effective performance measures are presented in Table 6.1.
North Carolina DOT
The North Carolina DOT authorizes its divisions to determine how funding will be used for maintenance activities and uses performance data to assist with this activity. Each year, Division Engineers submit annual plans detailing what work will be accomplished; these plans are reviewed quarterly with the Chief Engineer to discuss actual versus planned work. Their accomplishments are also displayed in a dashboard for internal use, as shown in the following image. Public-facing dashboards are also available showing overall conditions and performance trends. The Division Engineers are also held accountable for their performance, since their planned and actual performance data are incorporated into their annual evaluations.
Source: Leading Management Practices in Determining Funding Levels for Maintenance and Preservation. Scan Team Report, NCHRP Project 20-68A, Scan 14-01, National Cooperative Highway Research Program, May 2016.
Use of Performance Measures
Performance measures are used to:
- Connect agency policies and objectives to investment decisions.
- Establish desired and targeted levels of service that consider past performance, current and future demand, stakeholder priorities, and anticipated funding.
- Align agency policies, investments, and day-to-day practices in a meaningful and easily understood manner.
- Prioritize investment needs.
- Monitor and report progress towards desired objectives to both internal and external stakeholders in a consistent, cost-effective, and transparent manner as illustrated in practice examples from the Washington State, North Carolina, and Virginia DOTs.
Table 6.1 – Desired Performance Measure Characteristics
| Characteristic | Benefit |
| --- | --- |
| Measurable with available tools/data | May require no additional cost for data collection |
| Forecastable | Enables data-driven target setting based on future conditions |
| Clear to the public and lawmakers | Allows performance story-telling to customers and policymakers |
| Agency has influence over result | Measures agency activities rather than impact of external factors |
The Washington State DOT uses its Maintenance Accountability Process (MAP) to comprehensively manage maintenance budgets and to communicate the impacts of policy and budget to both internal and external stakeholders. Field condition surveys are conducted annually to assess the condition of 14 assets on the highway system such as signs and signals, ITS assets, tunnels, and highway lighting. For each asset, a level of service target is established based on expected funding levels and the importance of the asset to the agency’s strategic objectives. The targeted and actual performance is summarized on a statewide basis and presented to the legislature, media, internal stakeholders, and other DOTs in a format similar to what is shown in the figure (https://www.wsdot.wa.gov/NR/rdonlyres/8EC689DF-9894-43A8-AA0F-92F49AC374F5/0/MAPservicelevelreport.pdf). In 2018, Washington State DOT achieved 77 percent of its highway maintenance targets. Targets that were not achieved are shown as red bullseyes and areas where the targets were exceeded include a checkmark with the bullseye. The results illustrate where additional investment is needed on a statewide basis and provide a basis for setting maintenance priorities during the year.
Targeted and Actual Performance Results Used to Set Maintenance Priorities
Source: WSDOT. 2017. Multimodal Asset Performance Report. Washington State DOT. https://wsdot.wa.gov/publications/fulltext/graynotebook/Multimodal/AssetPerformanceReport_2017.pdf
To support accountability, credibility, and transparency, the Washington State DOT publishes its quarterly performance report, referred to as The Gray Notebook. Each edition of the Gray Notebook presents updates on multimodal systems’ and programs’ key functions and analysis of performance in strategic goal areas based on information reported to the Performance Management and Strategic Management offices of the Transportation Safety and Systems Analysis Division. Washington State DOT also publishes its Gray Notebook Lite, which highlights key metrics referenced in the Gray Notebook in a format for quick reading. Examples from each of these documents are presented in the figures.
The Gray Notebook and the Gray Notebook Lite
Source: WSDOT. 2019. https://www.wsdot.wa.gov/Accountability/GrayNotebook/
Performance dashboards are also a popular way to present progress, using color-coded indicators similar to those on the dash of an automobile. An example of the interactive dashboard available from the Virginia DOT is shown in the figure. The screen reports performance in seven areas (performance, safety, condition, finance, management, projects, and citizen survey results) and the needles indicate whether the performance is within targeted ranges. Hyperlinks are available in each area if a user wants to explore historical trends or explore performance objectives in more detail.
Virginia DOT’s Performance Dashboard
Source: Virginia DOT. 2019. http://dashboard.virginiadot.org/
Future Directions in Performance Measures
As agencies advance the maturity of their practices and move towards investment decisions across assets and modes (as discussed in Chapter 5), there is increasing interest in the use of leading measures and asset performance measures other than asset condition.
Examples of these types of measures include:
- Financial Measures – Internationally, financial performance measures have been used successfully to express whether the level of investment has been adequate to offset the rate of asset deterioration or depreciation. For example, the Queensland Department of Infrastructure and Planning uses an Asset Sustainability Ratio defined as the capital expenditure being made on asset renewals (e.g., improvements) divided by the depreciation expense (discussed further in Chapter 4). If the ratio is less than 100 percent, the level of investment is not adequately replacing the depreciation occurring each year. Queensland also uses an Asset Consumption Ratio comparing the current value of the depreciable assets to their replacement value in order to show the aged condition of the assets.
- Life Cycle Measures – A life cycle performance measure is a relatively new leading measure, promoting the selection of sound, long-term strategies best able to maximize performance at the lowest possible cost. There are several life cycle performance measures under consideration by the FHWA, including the Remaining Service Interval (RSI), which is being validated under a research project. The RSI is based on identifying a structured sequence of the type and timing of various repair and replacement actions needed to achieve a desired LOS over a long timeframe at the minimum practicable cost. The results of the RSI evaluation may be used to generate a Life Cycle Impact Factor, summarizing the difference in life cycle costs associated with the various strategies being considered.
- Sustainability Measures – With an increased focus on identifying long-term sustainable solutions to transportation system needs, agencies may seek to develop new sustainability performance measures in order to properly indicate the impact a proposed solution may have on environmental conditions. The use of a recycling measure for gauging the amount of recycled material used in road construction is an example of this type of measure, as are measures for monitoring carbon dioxide emissions.
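The two Queensland ratios described in the first bullet above are straightforward to compute; a minimal sketch, assuming all amounts are expressed in the same currency units:

```python
def asset_sustainability_ratio(renewal_capex, depreciation_expense):
    """Capital expenditure on asset renewals divided by the annual
    depreciation expense. A value below 1.0 (100 percent) indicates
    investment is not keeping pace with depreciation."""
    return renewal_capex / depreciation_expense

def asset_consumption_ratio(current_value, replacement_value):
    """Current (written-down) value of depreciable assets divided by
    their replacement value. Lower values indicate an older, more
    'consumed' asset base."""
    return current_value / replacement_value
```

For example, an agency spending $80M on renewals against $100M of annual depreciation has a sustainability ratio of 0.8, signaling that its asset base is losing value faster than it is being renewed.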
North Carolina DOT
The North Carolina DOT has an interactive Organizational Performance Scorecard that provides an online indicator of the Department’s success at meeting targets in the following six core goal areas:
- Make Transportation Safer.
- Provide Great Customer Service.
- Deliver and Maintain Infrastructure Effectively and Efficiently.
- Improve Reliability and Connectivity of Transportation Systems.
- Promote Economic Growth Through Better Use of Infrastructure.
- Make NCDOT a Great Place to Work.
The figure presents an example of how the information is shown: the target for an overall infrastructure health index and the most recent results. As shown by the red “x” in the box on the far right, NCDOT is not currently meeting its target of a health index of 80 percent or more.
North Carolina DOT’s Organizational Performance Scorecard Website – Excerpt
Source: NCDOT. 2019. https://www.ncdot.gov/about-us/our-mission/Performance/Pages/default.aspx
Evaluating the Effectiveness of Performance Measures
Because of the important role performance measures have in supporting performance-based decisions, agencies should use care in selecting measures that drive the right types of results. This section introduces several approaches to evaluate the effectiveness of an agency’s performance measures.
In its handbook for agency executives, AASHTO suggests an assessment of performance measures should consider the following:
- Is the number of performance measures reasonable? – An agency should retain performance measures addressing critical areas of importance that are maintainable over time. The Maryland and New Mexico DOTs have approximately 80 measures reviewed on a regular basis, but the Florida and Pennsylvania DOTs use approximately 15 to 20 measures to review strategic performance. Some agencies identify a small number (< 10) of KPIs selected from the pool of operational and tactical measures that best reflect an agency’s progress toward achieving its overall goals.
- Are the measures meaningful? – Some agencies choose only to use easily measured performance activities because the information is easy to obtain. However, other measures may do a better job of driving good decision making.
- Does the level of detail in data collection match the level of detail required to drive decisions? – Agencies should balance data availability with the analytic rigor used to make decisions. For instance, if pavement markings are replaced every year, it is not necessary to collect retro-reflectivity information annually. Similarly, collecting data on one lane of a two-lane highway may be enough for approximating the condition across the full width of the roadway.
- Do they support the right decisions? – The performance measures should drive decisions in support of strategic objectives. For example, a performance measure based on the amount of overtime incurred after a snow event is less effective than one able to monitor the number of hours until the roads are cleared.
- Are existing data sources reliable? – In most situations, existing data can provide the information needed for performance management, but it must be reliable and maintained regularly to be useful.
An assessment of performance measures can be important, since many organizations find that over time, the number of performance measures they are managing can become unwieldy.
After using performance measures for years, the Pennsylvania DOT recognized that the number of measures being used had increased to a level that was difficult to manage. In 2011, the Pennsylvania DOT conducted an assessment of their performance measures using the following series of questions to guide their decisions as to which measures to keep, which to change, or which to delete:
- Who is using the measure?
- What exactly is being measured?
- Why is this particular measure needed?
- Whose performance is being measured?
- Is the performance goal defined?
- Does a similar measure already exist?
- Is the existing measure meeting the needs and intent or should it be modified?
If a measure was needed where no measure exists, the following additional questions were used:
- Does the measure affect continuous improvement?
- Is the data for the measure updated as frequently as needed? Should it be updated monthly, quarterly, or yearly?
- Is the measure easy to quantify?
- Is the measure easy to understand?
- Is it clear who owns the measure?
- Does the measure provide a means of comparison?
- Have unintended consequences been investigated?
- Can the unintended consequences be successfully mitigated?
The process has helped to ensure that the agency is focused on the right measures to drive desired results and behaviors. The analysis found several issues that could be addressed, including eliminating duplicate or overly complicated measures, modifying measures that were driving unintended consequences, and resolving data quality issues.
As discussed earlier, performance measures are used to set desired or targeted levels of service. Targets may be short-term, such as the 2- and 4-year targets state DOTs are required to submit to FHWA, or they may be long-term targets, such as the desired State of Good Repair (SOGR) serving as the basis for an agency’s TAMP.
Performance targets are commonly assessed against the “SMART” criteria, which ask whether targets are:
- Specific. The performance is explicitly described.
- Measurable. Progress towards the target can be monitored in a consistent manner.
- Achievable. The target considers past performance, expected changes in demand, available resources and other considerations that make it realistic.
- Relevant (also referenced as results-oriented). The target should be meaningful to the agency and drive the right outcomes.
- Time-related (also referenced as timely or time-bound). There is a stated timeframe for achieving the target.
The Nevada DOT recognized that although performance measures were being reported regularly, they were not driving agency policies or decisions. The assessment evaluated the performance measures being used in each of the five key performance areas shown in the figure as well as the organizational culture to support performance management.
The study recommended improvements to emphasize the importance of messaging in order to advance the agency’s performance management culture, extend the performance culture beyond the headquarters office to field staff, and develop job performance plans emphasizing accountability at the division, office and unit levels. The study also recommended the periodic review of performance measures to ensure their continued relevance to agency business processes.
Nevada DOT’s five key performance areas and measures
Source: Nevada DOT. 2017. Adapting a Culture for Performance Management at the Nevada Department of Transportation.
In simple terms, benchmarking is a process of comparing performance and practice among similar organizations as part of an agency’s continuous improvement activities. Benchmarking provides an opportunity to learn about approaches used by high-performing organizations to uncover noteworthy practices, inform target-setting activities, or to foster innovation and improvement within an agency. Benchmarking should focus on improvement and lessons learned rather than on penalizing underperformers.
As mentioned in Chapter 1, AASHTO has developed a comparative benchmarking tool for enabling state DOTs to compare performance outcomes and practices with peer agencies as part of their continuous improvement activities (http://benchmarking.tpm-portal.com/). This includes a peer selection tool, so agencies can compare practices to peers with similar characteristics. It also features a performance comparison tool with a number of chart options enabling agencies to compare results. For instance, an agency may elect to compare pavement smoothness characteristics with a neighboring state. There is also a portal to facilitate the exchange of practices among registered DOT users through a Notable Practice Narrative.
An example from the AASHTO TPM Portal showing a comparison of the percentage of bridge deck area determined to be structurally deficient is shown in Figure 6.2. Similar comparisons are available for safety, environmental, and non-motorized (bicycle and pedestrian) performance measures. For transit agencies, Transit Cooperative Research Program (TCRP) Report 141, A Methodology for Performance Measurement and Peer Comparison in the Public Transportation Industry, provides specific guidance for comparing performance with other agencies.
Figure 6.2 Example Performance Comparison from the AASHTO TPM Portal
Source: TPM Portal. 2019. http://benchmarking.tpm-portal.com/compare/bridge-condition/deficient-bridges
Internationally, ISO standards call for periodic internal audits to help an agency evaluate whether its asset management program and its components meet the agency’s needs, adhere to best practices and are being used to support decisions. In addition, agencies audit service providers to confirm contract compliance in situations where road network maintenance and management activities have been outsourced.
Types of Performance-Based Data to Monitor
This section describes the types of information that should be collected and maintained to support performance-based decisions for physical assets. This section focuses on asset inventory and condition information for life cycle management, but recognizes that other operational performance characteristics may be important to determine whether an asset is fulfilling its intended function.
Differences in Performance and Condition
The terms ‘performance’ and ‘condition’ are often used interchangeably, although they have different meanings in a performance-based environment. The performance of an asset relates to its ‘ability to provide the required level of service to customers’ while condition is generally considered to mean the observed physical state of an asset, whether or not it impacts its performance. For example, a bridge with scour may continue to perform adequately in the short-term even though it may receive a low National Bridge Inventory (NBI) rating because of the deterioration.
An asset inventory provides information, beyond performance data, that is important for estimating the amount of work needed, identifying the location of work in the field and determining characteristics that can influence the type of work to be performed. The RCM approach introduced in Chapter 4 can be used to help an agency determine what information is needed to support the management of each type of asset. The asset inventory requirements for assets managed on a specified repair interval, such as pavement markings, are very different from those for assets managed using a condition-based approach, such as pavements or bridges. Regardless of how detailed the asset inventory is, it is important that an agency establish processes to ensure data quality and keep the inventory current over time.
There are several basic data attributes essential to effectively managing transportation assets, including asset type, quantity and location. Additional information important for differentiating between the types of work to be performed may also be added to the inventory, such as the type of material used to construct the asset, the last time work was performed and factors influencing the use of the asset (e.g. traffic levels, highway functional classification or climatic conditions).
As discussed in Chapter 7, managing asset inventory information using an integrated approach to data management helps promote consistency in asset data across an agency and provides access to help ensure the data is used by decision makers at all levels of the organization. An out-of-date inventory makes it difficult for an agency to estimate work quantities accurately for budgeting purposes.
Asset condition information is used to determine how assets are performing and how performance changes over time. The lack of condition information may lead to premature or unexpected failures with the potential to be very costly, negatively impacting system performance and increasing agency risks. Methods of collecting asset condition information are discussed further in Chapter 7. Condition information must be updated on a regular basis to remain current and useful.
There are several approaches for assessing asset conditions, each of which is influenced by the type of asset and the resources available to support the process. Typically, an assessment of asset condition involves a method of evaluating the presence of deficiencies and/or deterioration at the time of inspection. The results are used to assign a rating or LOS used to determine the need for maintenance, rehabilitation or replacement now or in the future. Asset condition ratings may also be used to establish rates of deterioration, allowing an agency to forecast future conditions for planning purposes.
Examples of commonly used types of asset condition ratings are listed below.
- A pavement condition index based on the type, amount and severity of distress present, which could be on a 0 to 100 scale, with 100 representing an excellent pavement.
- The National Bridge Inventory (NBI) condition ratings, which assign a rating between 0 and 9 based on the deterioration present in each component (deck, superstructure, substructure and culvert).
- A LOS rating of A to F for maintenance assets, such as the percent blockage in a culvert or the percent of guardrail not functioning as intended.
Maintaining asset condition information is important for evaluating performance to determine whether improvements are needed to achieve the agency’s strategic objectives. The lack of current condition information, or a lack of confidence in the condition information, makes it difficult to present investment needs to stakeholders with any degree of confidence.
The results of condition surveys or inspections are used to evaluate the performance of each asset in terms generally understood by stakeholders, such as Good, Fair or Poor.
It is common for transportation agencies to report the percent of the network in Good or Fair condition or the percent of drivers traveling on roads in Good and Fair condition. Asset performance can also be reported in terms of a health index, such as the Remaining Service Life (RSL) used by some state DOTs to indicate the amount of serviceable life left in the asset. In the maintenance community, some state DOTs have developed a Maintenance Health Index or overall LOS grade to represent the performance of the entire Maintenance Division rather than report the grades of each category of assets separately.
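The Good/Fair/Poor roll-up described above can be sketched in a few lines of Python. The index thresholds and the small example network below are hypothetical assumptions for illustration; actual breakpoints vary by agency and measure.

```python
# Hypothetical sketch: summarizing network condition as Good/Fair/Poor
# from a 0-100 pavement condition index. The thresholds (70 and 50)
# are illustrative assumptions, not a standard.

def classify(index):
    """Map a 0-100 condition index to a reporting category."""
    if index >= 70:
        return "Good"
    if index >= 50:
        return "Fair"
    return "Poor"

def percent_good_or_fair(sections):
    """Percent of lane-miles in Good or Fair condition, length-weighted."""
    total = sum(miles for miles, _ in sections)
    ok = sum(miles for miles, idx in sections if classify(idx) != "Poor")
    return 100.0 * ok / total

# (lane_miles, condition_index) pairs for a small example network
network = [(10.0, 85), (5.0, 62), (5.0, 40)]
print(percent_good_or_fair(network))  # 75.0
```

The same pattern extends to other roll-ups, such as weighting by traffic volume to report the percent of drivers traveling on roads in Good or Fair condition.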
Asset performance also influences overall system performance, as demonstrated by the impact on system reliability associated with unplanned road or bridge closures due to flooding or an on-going lack of maintenance. Performance data related to delay, unplanned closure frequency, GHG emissions, and crash locations may all be impacted by asset conditions and affect an agency’s ability to achieve its broader, strategic performance objectives such as system reliability, congestion reduction, environmental sustainability, and freight and economic vitality. For example, it is important to monitor performance characteristics such as travel time reliability to determine whether capital improvements are needed to add additional lanes or whether ITS assets could improve traffic flow during peak periods.
The Ohio DOT recognizes the importance of integrated management systems to support both life cycle and comprehensive work planning activities. One of the tools developed by the Ohio DOT is its Transportation Information Mapping System (TIMS), which enables planners, engineers and executives to access and manage key asset, safety and operational data in an integrated web-mapping portal (https://gis.dot.state.oh.us/tims). The portal is available to both internal and external stakeholders and allows users to access information about the transportation system, create maps or share information. The data integration efforts enabling TIMS are now underpinning all management system implementations.
Maintaining Asset Data
This section describes several approaches to keeping asset inventory and condition information current, so it can be used reliably to track accomplishments and evaluate current and future needs. The methodologies used to collect the asset information are discussed in Chapter 7.
Maintaining Inventory Information
One of the challenges transportation agencies face is keeping their asset inventory current, because it can require business processes dependent on individuals or agency work areas that differ from the primary asset owners. For example, construction may be responsible for installing new guardrails as part of a pavement-resurfacing project, but the information is not always made available to the maintenance division responsible for budgeting and scheduling guardrail repairs.
Establishing Processes to Update Inventory Information
Some types of inventory information change regularly while other information changes infrequently. As a result, it is important to classify each type of data and establish procedures in order to ensure the inventory is updated as information changes. An agency should establish business processes to ensure any changes to the inventory are reflected in relevant databases. For example, each time a pavement improvement project is completed, the database should be updated with information about the new surface type, the project completion date and the other assets replaced as part of the project. Establishing these processes and holding individuals responsible for updating this information are important for the ongoing success of a performance-based management approach.
Maintaining Condition Information
Asset condition and performance information must also be updated on a regular cycle. In some cases, data collection cycles are mandated by regulations, such as federal requirements for reporting pavement and bridge condition information on the National Highway System. Where there are no requirements in place for condition reporting, the update frequency should be determined based on the resources available, how the asset is managed and the data analysis cycle. Different update frequencies may be established for different types of assets.
Asset condition information may be collected on a regular interval schedule, or an inspection may be triggered by the asset’s condition. For example, an asset in poor condition may require inspection more frequently than an asset in good condition. In general, asset information is updated on a 2- to 4-year cycle, but in some cases asset data is collected more frequently. For instance, some agencies collect performance data on maintenance assets several times a year to ensure they are in good working order and performing as expected. Condition surveys for assets with slower rates of deterioration may be conducted less frequently.
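A condition-triggered inspection schedule of the kind described above can be expressed as a simple rule. The intervals below are hypothetical illustrations, not regulatory minimums.

```python
# Hypothetical inspection-scheduling rule: assets in poorer condition
# are inspected more frequently. Intervals are illustrative only.

def inspection_interval_years(condition_category):
    """Return years until the next inspection for a Good/Fair/Poor rating."""
    intervals = {"Good": 4, "Fair": 2, "Poor": 1}
    return intervals[condition_category]

print(inspection_interval_years("Poor"))  # 1
```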
The Virginia DOT maintains most of the assets on state roads and regularly assesses the condition of those assets for determining investment needs. For pavements and bridges, there are asset leads at both the central office and in the districts to monitor conditions and update the database based on work completed. Asset leads at the central office manage statewide data monitoring and analysis and provide guidance on the work that is needed. The asset leads in the districts are responsible for implementing the work and recording completed work in the bridge and pavement management systems so the information is always current.
Using Trend Data to Make Program Adjustments
This section illustrates how some agencies have successfully used historical trends to make program adjustments.
Adjusting a Program Based on Trends
The availability of historical trends is integral for making future projections as part of the planning and programming process. As shown by the examples included in this section, agencies have used trend data creatively to make program adjustments and more effectively align planned investments with strategic objectives.
The Minnesota State Highway Investment Plan (MnSHIP) outlines a 20-year strategy for investing in the state highway system. The most recent document, published in 2017, outlines investment priorities for the period from 2018 to 2037 (http://minnesotago.org/application/files/3414/8431/5979/MnSHIP_Final_Jan2017.pdf). One of the figures included in the plan uses historical inflation trends to illustrate the declining purchasing power of revenue due to construction costs growing at an annual rate of approximately 4.5 percent. This cost growth rate exceeds the projected annual revenue growth rate of approximately 2 percent and is expected to erode over half of the buying power of revenues by 2037. As a result of this analysis, MnDOT was able to communicate its financial situation to stakeholders and better manage the risks associated with continued construction cost increases over the planning period.
Anticipated Construction Revenue by Year Including Adjustments for Inflation
Source: Minnesota DOT. 20-Year State Highway Investment Plan 2018-2037.
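The compounding arithmetic behind this kind of buying-power analysis can be sketched as follows. The growth rates are those cited above, but the calculation itself is a simplified illustration, not MnDOT’s actual financial model.

```python
# Simplified sketch of purchasing-power erosion using the cited rates:
# construction costs growing ~4.5%/yr against revenue growing ~2%/yr.

cost_growth = 0.045   # annual construction cost inflation (cited)
rev_growth = 0.02     # annual revenue growth (cited)
years = 20            # 2018-2037 planning horizon

# What a fixed 2018 dollar buys in 2037, in construction-cost terms
fixed_dollar = (1 / (1 + cost_growth)) ** years

# What a revenue dollar growing at 2%/yr buys relative to 2018
growing_dollar = ((1 + rev_growth) / (1 + cost_growth)) ** years

print(round(fixed_dollar, 2), round(growing_dollar, 2))  # 0.41 0.62
```

Even with revenue growth partially offsetting inflation, the compounding gap steadily erodes what each programmed dollar can build.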
The Illinois DOT used a graph showing the number of miles of state-maintained roads in need of unfunded rehabilitation or reconstruction, which was referred to as the backlog. The graph, shown in the figure to the right, illustrates that the backlog was growing over time due to the inadequacy of funding. The increasing trend in backlog prompted the Illinois DOT to reconsider its approach to selecting projects and treatments, moving towards the increased use of preservation treatments to slow the rate at which pavement conditions drop into a backlog condition. In addition to the change in treatments, the Illinois DOT developed a new pavement performance measure based on the percent of the network in good enough condition to be a candidate for a preservation treatment. The change in performance measure was intended to shift funding priorities from deteriorated pavements to those that could be kept in good condition for a longer period of time. The changes were documented in the Illinois DOT’s April 2018 Transportation Asset Management Plan (http://www.idot.illinois.gov/transportation-system/transportation-management/planning/tamp) and were used in developing the fiscal year 2019-2024 Multi-Year Proposed Highway Improvement Program. In addition, new software tools are being acquired to further support this improved approach to managing pavement and bridge assets.
Source: Illinois DOT Transportation Asset Management Plan. 2018.
New Mexico DOT
In 2004, the New Mexico DOT realized that a significant percentage of state-maintained bridges were classified as structurally deficient. To address this issue, the agency targeted increased investments in bridge preservation. Going forward, the agency funded rehabilitation activities for bridges in poor condition and added preventive maintenance activities for bridges in good or fair condition to slow the rate of deterioration on these bridges. As shown in the figure to the right, the program has been very effective in improving bridge conditions. Adding a line to the graph showing the targeted conditions would help convey the impact that the increased preservation expenditures have had on achieving performance objectives.
Trend Showing the Decrease in Structurally Deficient Bridge Deck Area Resulting from Targeted Investments
Source: New Mexico DOT Transportation Asset Management Plan. 2018.
South Dakota DOT
To determine the effectiveness of road investments, the South Dakota DOT uses historical trends and projects conditions for each road category to show whether targeted conditions can be achieved with planned investment scenarios. The figure to the right illustrates the type of graph developed for the Interstate network. As shown, the graph presents both historical and projected conditions based on a Surface Condition Index (SCI) that ranges from 0 to 5, with 5 representing a distress-free pavement. Overlaid on the graph is the acceptable condition range, which in this case spans an SCI between 3.8 and 4.2. The graph shows that Interstate conditions have gradually improved over time. Although average future conditions are projected to drop, they are expected to remain within the acceptable condition range. The results of the analysis provide the agency with confidence that the planned investments will achieve the desired condition levels over the analysis period. In addition, the projections are updated annually to provide a picture of changing financial trends and funding availability. This allows the DOT to react to any downturn in the projections.
Past and Future Pavement Conditions and Goals
Source: South Dakota DOT Transportation Asset Management Plan. 2019. https://dot.sd.gov/media/documents/SDDOT2019TAMPFHWASubmittalrevised8-28-2019.pdf
Importance of Tracking Work Activities and Treatment Costs
This section describes the factors that should be considered for keeping a management system current.
Why It Is Important to Track Work Activities and Treatment Costs
Asset management systems, such as pavement and bridge management tools, rely on the availability of complete, up-to-date inventory information to serve as the basis for all system recommendations. At a minimum, the most recent work activity and completion date are necessary for establishing an asset’s age or the length of time since work was last performed. These factors are key to setting a maintenance service interval or predicting the need for future work. Treatment cost information is used to estimate the cost of recommended work activities, so realistic numbers are important for planning and budgeting.
The level of detail required to track work histories is largely dependent on the sophistication and maturity of the asset management program. It is important to have access to information indicating when the asset was installed or constructed, or when the most recent major work activities were performed. Additional information about maintenance activities performed to preserve or improve the asset is beneficial if it can be provided efficiently and incorporated into decisions about managing an asset over its life cycle.
An agency should incorporate completed work activities into a management system at least annually, at the end of each construction season. At a minimum, the asset management database should be updated to reflect any changes to the asset properties, such as a change from a concrete to an asphalt pavement, and the date when the change was made.
Including the cost of maintenance and rehabilitation activities in a computerized maintenance system provides a historical record of how treatment costs have changed over time. This information, along with bid documents, can be used to establish a unit cost for each type of work activity the system may recommend.
For many transportation projects, improving the condition of the asset is only one part of the total cost of a project. There are many other costs to incorporate into the unit price when estimating the cost of a treatment recommendation, including the cost of pavement markings, guardrails and signs on a pavement project. If these costs are ignored, the cost of a project will be underestimated, and an agency may program more work than can be constructed over a given timeframe. Some agencies inflate treatment costs by 30 to 40 percent to ensure the costs associated with project design and the improvement of ancillary assets are considered in the unit cost for a given treatment. Using this approach, $0.30 to $0.40 is added to every dollar associated with the cost of the work itself, and the inflated cost (e.g., $1.40) is stored in the management system as the unit cost for estimating treatment costs.
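The markup approach above can be sketched in a few lines; the 35 percent factor and the base cost are hypothetical inputs within the 30 to 40 percent range cited.

```python
# Hypothetical markup applied to a base treatment cost so that design
# and ancillary-asset costs (markings, guardrail, signs) are reflected
# in the unit price stored in the management system.

def adjusted_unit_cost(base_cost, markup=0.35):
    """Return the base unit cost inflated by the ancillary markup."""
    return base_cost * (1 + markup)

# A $1.00 base treatment cost would be stored as $1.35
print(round(adjusted_unit_cost(1.00), 2))  # 1.35
```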
Different unit costs may also be established to reflect different costs in urban and rural areas, or in different geographic regions of a state. These differences improve the accuracy of asset budgeting activities by reflecting the realities agencies face due to work activities in highly congested areas, differences in the availability of contractors and the scarcity of materials in certain areas.
In addition to being used to estimate budget needs, treatment cost information serves many other purposes. For instance, the cost of a proposed project and its expected life can be used to determine a Return on Investment to help ensure that the most cost-effective projects are being selected. The information can also be used to compare the effectiveness of one treatment over another, or one life cycle strategy over another. Cost information has also been used to demonstrate the benefits to using proactive maintenance across a transportation network rather than reactive maintenance.
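One simple screen of this kind divides treatment cost by expected life. The treatments, costs, and lives below are hypothetical, and a complete life-cycle analysis would also discount future costs and account for condition benefits over time.

```python
# Hypothetical cost-per-year screen for comparing treatment options.
# All values are illustrative assumptions, not agency data.

def cost_per_year(cost, expected_life_years):
    """Annualized cost: total treatment cost over its expected life."""
    return cost / expected_life_years

thin_overlay = cost_per_year(120_000, 8)     # $/lane-mile, 8-year life
reconstruction = cost_per_year(900_000, 30)  # $/lane-mile, 30-year life

print(thin_overlay, reconstruction)  # 15000.0 30000.0
```

In this sketch the cheaper, shorter-lived treatment is also the more cost-effective per year of service, which is the kind of comparison that supports proactive maintenance strategies.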
Montana, Tennessee, and Utah DOTs
Several state DOTs are employing the use of technology to track maintenance work activities as noted below.
When new assets are installed as part of a construction project for the Montana DOT, Construction personnel are required to provide Maintenance with the information needed for updating the asset inventory. Maintenance verifies the information provided by Construction before inputting it into the system.
The Tennessee DOT uses an automated data collection van to establish its asset inventory for approximately 20 assets. The inventory is entered into a maintenance management system at a summary level for each county, and a “ghosting” technique is used to identify differences in the inventory from one year to the next.
The Utah DOT extracts its asset inventory every two to three years from the LiDAR collected as part of the agency’s annual pavement condition surveys; however, the DOT is moving towards a continuous inventory updating process that would be the responsibility of Maintenance supervisors.
Establishing Business Processes to Support Work History and Cost Tracking
To ensure that work history and treatment cost information is kept current, business processes should be established to maintain the data over time. This section stresses the importance of building business processes to update the data regularly.
One of the first steps in establishing business processes to support the maintenance of work history and cost information is assigning responsibility to the appropriate person for managing the information. The individual assigned responsibility for updating work history and cost data in the management system is not always the individual responsible for providing the data. For example, some agencies assign responsibility for updating completed work history and treatment cost information to the maintenance or construction division, since they are typically involved in closing out a project. Regardless of who is assigned responsibility for the task, a clear line of accountability should be established as part of the business process.
Establish Processes to Update Work Activities
As discussed in Chapter 7, technology is improving agencies’ ability to track completed work activities, so the information is available for use in an asset management system. The access to handheld data entry devices with map interfaces linked to a centralized database helps ensure all users of the information have immediate access to current and consistent information. Business processes reliant on field personnel to remember to provide information to another data user are generally not sustainable.
To help establish a reliable approach for keeping asset data current, an agency may consider developing a data and process flow map illustrating the flow and use of data across the agency. This type of document helps an agency better understand where the data comes from, where it is stored, who uses the information and what levels of access various users need. A data and process flow map may become part of an agency’s data governance documentation in order to protect the integrity of asset data.
Build Buy-In To Support the Business Processes
Key to the success of any business process is establishing buy-in among the individuals responsible for each required step. This involves familiarizing the individuals with their responsibilities, providing tools and guidance for completing the activities efficiently and effectively and demonstrating how the information is used to support agency decisions.
To ensure that the asset inventory remains current, the Florida DOT assigns district personnel responsibility for maintaining asset inventories and establishes guidance that no data in the inventory can be more than five years old. For new construction projects, it is required that the inventory be updated within 90 days of completion. The Florida DOT district offices develop a Quality Control (QC) plan and perform a QC check on the data at least once a year. The Florida DOT Central Office develops a Quality Assurance Review (QAR) plan and performs a QAR on the district’s QC process and spot checks the data in the field. As a result of these requirements, the Florida DOT has a high degree of confidence in the numbers used for budgeting activities.
Using Work History Information to Improve Models
The availability of current work history and performance data allows agencies to develop and improve models used in a management system to predict future conditions and determine treatment effectiveness. This section describes and illustrates the use of this data to improve existing models.
Developing and Improving Asset Deterioration Models
An important function of an asset management system is the ability to predict asset deterioration rates so changes in condition over time can be modeled for use in planning and programming activities. In the absence of data, models can be developed based on expert judgment, but as historical performance trends are established from actual data, the expert models should be recalibrated or replaced with data-driven models.
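This transition from expert judgment to data can be sketched as a simple regression on condition-versus-age observations. The data points and the assumed expert rate of 3 index points lost per year are fabricated for illustration, not drawn from any agency’s database.

```python
# Sketch: fit a least-squares line to observed condition-vs-age data
# and compare its slope with an expert-judgment deterioration rate.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

ages = [0, 2, 4, 6, 8, 10, 12]            # years since construction
observed = [100, 95, 88, 80, 70, 58, 45]  # condition index surveys

slope, intercept = linear_fit(ages, observed)

expert_rate = -3.0  # assumed expert model: 3 index points lost per year
# The fitted slope is steeper than the expert rate, signaling that the
# model under-predicts deterioration and should be recalibrated.
print(round(slope, 2))  # -4.59
```

This mirrors the pattern in the South Dakota example later in this section, where regression on historical data revealed faulting occurring faster than the original model predicted.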
The AASHTO Transportation Asset Management Guide: A Focus on Implementation (2011) introduced the following thought process to help an agency evaluate its deterioration models and determine whether improved data is needed to enhance future forecasts:
- If there is disagreement with the timing for recommending a treatment, what is the difference? Does a difference of one to two years make a substantial difference to the program? This type of difference is typically the result of the program optimization models.
- If there is disagreement with the treatment, are the differences substantial, such as deck repairs versus bridge replacement? These differences are often the result of treatment rules but may indicate that deterioration rates are not correct. The deterioration model parameters may need to change (e.g., change traffic considerations or geographic location) or there may have been some work performed that the model is not aware of.
- If the differences are irreconcilable, the agency may decide to investigate the model setup and analysis further or may conduct research to see how other agencies have resolved similar issues.
Determining Treatment Effectiveness
The availability of work history and performance data also makes it possible to determine the effectiveness of different types of treatments over time. By adding cost information to an effectiveness analysis, an agency can determine the long-term cost-effectiveness of different treatment strategies.
North Carolina DOT
The North Carolina DOT conducted an analysis to determine the effectiveness of an open-graded friction course and a surface constructed with an FC-2 (friction course) gradation. Data from the pavement management database was used, including inventory data, construction information and pavement condition ratings. The performance data were plotted against the survey year for each pavement section where one of the two types of surface friction courses was applied. The results showed the performance of the open-graded friction course dropped at year 10, while the FC-2 graded surface dropped in performance at year 8. The study also found that all FC-2 sections had received another treatment by year 11. The results from the analysis were used to increase the use of open-graded friction courses across the state.
South Dakota DOT
In 2011, the South Dakota DOT initiated a project to revise the pavement deterioration models developed in 1997, using 17 years of historical pavement condition data. The model-development tool included features allowing all the condition-versus-age data points for each pavement meeting a family description (based on surface type and pavement structure) to be plotted on a graph, facilitating a comparison of the historical model and the recommended model based on the updated condition information. In the example shown, the blue line (labeled as the user-defined model) represents the model being used in the pavement management system for predicting faulting on a thick, short-jointed doweled concrete pavement, and the gold line (labeled as the regression equation) represents the model fit to the historical data. The regression analysis on the historical data, represented by the red data points, indicates faulting is occurring at a much more accelerated rate than was previously predicted. As a result, recommendations for addressing faulting were likely lagging the actual need observed in the field.
Illustration Showing How Historical Data can be Used to Modify a Deterioration Model
Source: South Dakota DOT. 2012. Technical Memo/Software Documentation
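The model-refit step in this example amounts to regressing observed condition against age and comparing the result to the legacy curve. The sketch below uses hypothetical faulting observations and a hypothetical legacy model, not SDDOT values:

```python
# Observed faulting (inches) vs. pavement age (years) -- hypothetical points.
ages = [2, 5, 8, 11, 14, 17]
faulting = [0.02, 0.06, 0.11, 0.17, 0.22, 0.28]

def legacy_model(age):
    """Hypothetical stand-in for the older user-defined model."""
    return 0.01 * age

# Ordinary least squares fit of faulting = a + b * age.
n = len(ages)
mean_x = sum(ages) / n
mean_y = sum(faulting) / n
b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(ages, faulting))
     / sum((x - mean_x) ** 2 for x in ages))
a = mean_y - b * mean_x

# A refit slope exceeding the legacy slope signals accelerated deterioration.
print(f"refit: faulting = {a:.3f} + {b:.4f} * age")
print(f"at year 15: legacy {legacy_model(15):.3f} vs refit {a + b * 15:.3f}")
```

In practice the fit would be run per pavement family, with outlier screening to exclude sections that received unrecorded work.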
New Zealand Transport Agency
The Auckland Harbour Bridge corrosion protective coating system has been undergoing regular maintenance since the bridge opened in 1959. Historic practice was to spot abrasive blast corroded surfaces, followed by spot painting and application of a full overcoat. While this process was effective in maintaining the protective coating, it also resulted in a significant amount of contaminants being discharged into the Waitematā Harbour despite the precautions taken.
In an effort to reduce the discharge, various options were considered, taking into account protective coating performance and longevity, with the aim of achieving the lowest practicable environmental discharge and whole-of-life costs.
One option involved removing the coating by abrasive blasting within full-scale containment to capture contaminants. However, it was found that this option would require strengthening the bridge to safely carry the containment under wind loading, at a cost of NZ$65M over a 10-year period.
A comprehensive review of the coating maintenance regime therefore led to the development of a 40-year Coatings Maintenance Plan. The identified lowest whole-of-life-cost solution involved:
- On the land spans, use of full containment (where it can be supported from the ground), allowing full removal of the coating system by abrasive blasting and its full reinstatement. These spans are to be left as long as possible before the protective coating is reinstated, while ensuring minimal, if any, section loss to the steel superstructure.
- Spot repair and overcoating of the other spans to maintain the existing coating for as long as practicably possible. A more proactive intervention approach is also adopted, using abseil techniques to minimise access costs.
- An outcomes-based approach for consenting purposes, involving the establishment of low discharge limits for contaminants deemed to be environmentally safe. This enables small areas of abrasive blasting without full containment on spans other than those above land.
This plan allows for the continued corrosion protection of the bridge's 125,000 m² of external surface area in a marine environment, while providing a cost-effective and environmentally responsible solution.
Toronto Transit Commission
The Toronto Transit Commission initiated a review to determine optimal bus life for their fleet as well as assess the potential for hybrid propulsion technology. Through specialized modelling methods, a data-driven approach was used to assess the total cost of ownership (TCO) for their fleet vehicles. This review analyzed historical asset work order records along with other capital and operating expenses to help identify the optimal asset life cycle. The four key areas analyzed were:
- Procurement/Installation: Asset Design Specifications & Procurement Cost
- Operations & Maintenance: Labor, Parts, Fuel (if applicable), Consumable Items and Outsourced Work
- Overhaul/Rehabilitation: Major Asset Refurbishment/Component Replacement Cost (e.g., Transit Bus Transmission Rebuild or Facility Rehabilitation)
- Disposition: Salvage Value (End-of-Life)
The model provided insights into the optimal time to dispose of a fleet vehicle to minimize overall fleet cost, the comparative TCO of different vehicle types, and the relative effect and up-time benefit gained from different operations and maintenance activities or rehab treatments, by engine or other component types used in the fleet. The results advanced the agency's understanding of treatment effectiveness and allowed it to make more informed decisions about fleet renewal.
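The core of such a TCO review can be sketched as finding the retirement age that minimizes the average annual cost of ownership. All figures below (purchase price, O&M growth, overhaul schedule, salvage decay) are hypothetical assumptions, not TTC data:

```python
purchase = 600_000                                         # hypothetical bus price
annual_om = [20_000 * 1.15 ** year for year in range(18)]  # O&M grows with age
overhauls = {6: 150_000, 12: 200_000}                      # midlife rebuild costs

def salvage(age):
    """Hypothetical resale value, decaying 15% per year with a floor."""
    return max(600_000 * 0.85 ** age, 10_000)

def avg_annual_cost(age):
    """Average annual cost of owning the vehicle through the given age."""
    om = sum(annual_om[:age])
    oh = sum(cost for year, cost in overhauls.items() if year <= age)
    return (purchase + om + oh - salvage(age)) / age

best_age = min(range(1, 19), key=avg_annual_cost)
print("lowest average annual cost at retirement age", best_age)
```

A fuller model would discount future costs and compare vehicle types side by side, but the structure, cumulative costs net of salvage divided by years of service, is the same.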
Washington State DOT
The Washington State DOT conducted a pavement life cycle analysis using performance and cost data that demonstrated the cost-effectiveness of its pavement preservation projects. Based on the results documented in its Transportation Asset Management Plan (https://www.wsdot.wa.gov/sites/default/files/filefield_paths/WSDOT_TAMP_2019_Web.pdf), the DOT instituted a “one touch policy” requiring that a pavement receive at least one maintenance treatment, by Maintenance or contracted work forces, before it can be programmed for a pavement preservation project. This has enabled the DOT to defer capital improvements on pavements by two to three years, or in instances of multiple touches, by four to six years, at a very low cost. In 2018, the agency received an additional $6 million to test a similar program on bridges. In addition to being a cost-effective use of available funds, the programs have helped build buy-in among maintenance personnel by demonstrating the importance of the data they collect.
Monitoring and Managing Risks
Risk registers, risk reports, and risk mitigation plans are commonly used tools to track and manage risks. This section describes and illustrates each of these tools.
A risk register is one of the most common tools for tracking and managing risks within an agency, since it provides a framework for capturing critical information about each risk, its importance to the agency, mitigation plans, and tracking and management responsibilities. A risk register is typically maintained as a spreadsheet, though other formats are available. An example of a comprehensive risk register, which includes assignments for risk mitigation strategies, is presented in Figure 6.5. Over time, columns may be added to indicate when the risk information was last updated, what further action is required, and whether adequate progress is being made on the mitigation strategy.
A risk register should be reviewed at least quarterly to evaluate whether the register or the risk management plan for any of the performance areas needs to be updated. Periodic changes to the risk profile may be identified through executive staff meetings held to evaluate progress regularly, or through ongoing reports tracking risk mitigation efforts and results. Annually, the agency may determine whether any strategic-level risks should be adjusted based on an evaluation of the agency’s performance and the risk reports provided by the risk owners.
Figure 6.5 Excerpt from a Risk Register Showing Responsibility for Risk Mitigation Activities
Source: Tillamook County Public Works Road Asset Management Plan. 2009.
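A risk register like the one described above can be kept as simple structured records and sorted by a risk score; a common convention scores each risk as likelihood times impact. The risks, ratings, and owners below are illustrative placeholders, not agency data:

```python
# Each record carries the fields an agency typically tracks for a risk;
# all entries here are illustrative placeholders.
risk_register = [
    {"risk": "Bridge scour during flood events", "likelihood": 4, "impact": 5,
     "owner": "Bridge Program Manager",
     "mitigation": "Install scour countermeasures at high-risk crossings"},
    {"risk": "Lapse in pavement data collection contract", "likelihood": 2, "impact": 3,
     "owner": "Asset Management Coordinator",
     "mitigation": "Begin contract renewal a year before expiration"},
]

def score(entry):
    """Common convention: risk score = likelihood x impact (each rated 1-5)."""
    return entry["likelihood"] * entry["impact"]

# Review the register highest-scoring risk first.
for entry in sorted(risk_register, key=score, reverse=True):
    print(score(entry), entry["risk"], "->", entry["owner"])
```

Columns for last review date, required action, and mitigation progress map naturally onto additional fields in each record.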
Risk reports, which reflect excerpts from the risk register, may be developed by risk owners to communicate ongoing activities and manage risks at any level of the organization. The type of risk report shown in Figure 6.6 conveys what steps are being taken to address project delivery risks.
Risk Mitigation Plans
Some agencies see benefit in developing risk mitigation plans for their assets to ensure compliance with regulatory programs and help embed risk into all agency business activities. For example, a risk management plan may be developed when a bridge’s risk of failure reaches a certain threshold. These plans identify specific risks and mitigation strategies to undertake in order to reduce the likelihood or impact associated with the risk.
Figure 6.6 Sample Risk Report
Source: AASHTO. 2016. Managing Risk Across the Enterprise: A Guide for State DOTs.
Washington State DOT
The Washington State DOT recognized the potential safety risk to highway travelers and the adverse impact on regional commerce associated with unstable slope failure. To become more proactive in managing this risk, WSDOT developed the Unstable Slope Management System (USMS) that provides a method for evaluating known unstable slopes and using the information to prioritize slopes for funding of proactive stabilization efforts. The mitigation objective of the unstable slope management program is to sustain a desired state of good repair and low risk over the life span (> 20 years) of known unstable slopes and constructed geotechnical assets at the lowest practicable cost.
Bay Area Rapid Transit Authority (BART)
BART developed a Local Hazard Mitigation Plan in 2017 to reduce or eliminate long-term risks to human life and property related to hazards such as earthquakes, tsunamis, landslides, floods, sea level rise, wildfire, and drought. The analysis focused primarily on high-priority fixed assets such as passenger stations, substations, switching stations, train control rooms, shops/yards, ventilation structures, and emergency exits. These assets were prioritized based on criticality, in terms of the impact of an asset failure on reliable and safe service. The Local Hazard Mitigation Plan details the potential impacts associated with each hazard type and presents prioritized mitigation actions determined by votes of the Emergency Preparedness Task Force Committee (EPTFC), which is made up of senior managers from all BART departments. The plan is updated at least once every five years. Examples of the mitigation strategies developed are presented in the Figure below. The Plan has helped identify agency priorities that are being addressed and has fostered collaboration among different Departments to reduce potential hazards.
Monitoring TAM Processes and Improvements
As discussed throughout this Guide, TAM is an ongoing process that needs to be monitored regularly to ensure that it continues to support an agency’s business decisions. This section presents tools and methodologies used to accomplish this. It also builds on the application of some of the tools introduced in Section 2.5.1, Assessing Current Practice.
A gap assessment is used to identify differences, or gaps, between an agency’s practices and those suggested as part of an established asset management framework. The results of a gap assessment can be used to identify needed changes in business processes or can serve as the basis for developing priorities as part of an asset management implementation plan. The gap analysis tool available through the AASHTO TAM Portal is an example of a tool an agency can use to assess its practices and compare them to desired, or more established, practices. A summary of the gap analysis tool and other frameworks for assessing current practice was presented in Figure 2.6. An example of a chart showing targeted and current ratings in eight assessment areas over a 2-year period is presented in Figure 6.7. While the agency’s targeted, or desired, scores remained consistently at a rating of 5 over both years, the graph is helpful for determining which assessment areas have improved over the 2-year period and which have not.
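The arithmetic behind such a comparison is simple: for each assessment area, the gap is the target rating minus the current rating, and the largest gaps suggest implementation priorities. The area names and scores below are illustrative, not values from the AASHTO tool:

```python
# Target and current maturity ratings by assessment area (illustrative).
targets = {"Policy Goals": 5, "Planning": 5, "Data Management": 5,
           "Program Delivery": 5}
current = {"Policy Goals": 4, "Planning": 2, "Data Management": 3,
           "Program Delivery": 5}

gaps = {area: targets[area] - current[area] for area in targets}

# Largest gaps first: these areas are candidates for the implementation plan.
for area, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(area, "gap:", gap)
```

Running the same calculation each rating period shows which gaps are closing and which are not, which is exactly what the chart in Figure 6.7 conveys graphically.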
Lean Six Sigma
A Lean Six Sigma framework uses statistical analyses as part of a continuous improvement approach to evaluate the causes of defects and methodically improve processes to eliminate them. Six Sigma is widely used in manufacturing, but it can also be applied to many TAM functions. For instance, a Six Sigma analysis would be useful in analyzing the root cause of defects associated with a poorly performing asset. Combining a Six Sigma approach with a Lean framework, which focuses on reducing waste, can help agencies develop more efficient and sustainable processes.
The New Brunswick Department of Transportation and Infrastructure (NB DTI) implemented Lean Six Sigma to better understand and document existing practices and identify where improvements could be implemented for savings or service improvement. The Lean Six Sigma methodology helps to improve performance through a collaborative process that systematically removes waste and reduces variation while improving customer satisfaction. For NBDTI, the application of this methodology has resulted in increased efficiency, cost savings, refined procurement methods, improvements to delivery of operational programs and services, and has supported the application of asset management decision-making to pavements, bridges, culverts, facilities and other transportation infrastructure.
ISO 33000 Process Assessment
The International Organization for Standardization (ISO) has introduced a variety of standards to support asset management. ISO 33000 is a standard for process assessment, providing a structured approach to help agencies better understand their processes, evaluate the suitability of their existing practices, and determine whether another organization’s processes could serve as a model for improving their own.
The Balanced Scorecard
The Balanced Scorecard approach was initially developed to enable organizations to make complex tradeoff decisions that balance different types of performance criteria. For example, the framework could be used to help weigh improving the level of service provided in a corridor against improving environmental sustainability on a statewide basis. The balanced scorecard analysis takes a holistic and balanced approach to these types of issues by simultaneously evaluating competing and dissimilar needs (such as customer satisfaction, sustainability, and safety). The advantage of the balanced scorecard approach is that multiple measures are considered, rather than a single set of measures that might disregard an important factor in the decision. The results produce a rational set of investment decisions that considers all of the factors the agency views as most important to the final selection.
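One common way to operationalize a balanced scorecard tradeoff is a weighted scoring model: each alternative is scored against each criterion, and the weighted scores are summed. The alternatives, criteria weights, and scores below are hypothetical:

```python
# Criterion weights sum to 1.0; scores are on a 0-10 scale (all hypothetical).
weights = {"customer satisfaction": 0.4, "sustainability": 0.3, "safety": 0.3}
alternatives = {
    "Widen corridor": {"customer satisfaction": 8, "sustainability": 3, "safety": 7},
    "Transit improvements": {"customer satisfaction": 6, "sustainability": 9, "safety": 6},
}

def weighted_score(scores):
    """Combine dissimilar criteria into a single balanced score."""
    return sum(weights[criterion] * scores[criterion] for criterion in weights)

best = max(alternatives, key=lambda name: weighted_score(alternatives[name]))
print("preferred alternative:", best)
```

The weights encode the agency's priorities, so they are typically set through the same goal-setting process that defines the scorecard's perspectives.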
Figure 6.7 Example Comparing Assessment Area Scores from Two Different Rating Periods
Source: AASHTO. 2015. Transportation Asset Management Gap Analysis Tool User Guide
Managing Implementation Responsibilities and Processes
Monitoring the implementation of new business processes benefits from a clear definition of roles and responsibilities. This section illustrates approaches that agencies use to assign responsibility for implementation activities so that progress can be tracked.
Assigning Responsibility for Managing Risks and Implementation Activities
A key step in managing risks and other implementation activities is establishing a set of roles and responsibilities for each of the tasks at hand. The risk management process introduced in Chapter 2 includes a step for monitoring risks on a regular basis through a risk register or some other format.
When assigning responsibilities for managing risks, different types of risks are normally assigned to different individuals or divisions within a transportation agency:
- Strategic risks – Impact the agency’s ability to achieve its goals and objectives. Ignoring risks at this level can cascade down to impact programs and projects at other levels of the agency. For this reason, strategic risks are generally assigned to members of agency leadership and may be addressed by incorporating risks into regular management meetings and key policy documents.
- Program risks – Impact an organization’s ability to administer a program in a coordinated way. Risks at this level are typically the responsibility of the program manager, who ensures there are effective controls over risk and documents risk activities.
- Project risks – In many agencies, a project risk management process is in place with responsibility for managing risks assigned to the project manager. At this level, primary responsibilities include managing risks associated with the project scope, schedule and quality.
- Activity risks – Associated with routine activities performed by the agency, such as snow and ice control, incident response and pavement management modeling. Risks at this level are typically managed and monitored by the activity leader.
An agency may elect to appoint a Chief Risk Officer or to create an Enterprise Risk Unit charged with coordinating the agency’s risk processes and training agency personnel on risk management. If such a unit is created, the Chief Risk Officer often reports directly to the agency’s chief executive officer or another high-ranking executive, symbolizing the importance of risk management to the agency. Agencies without a formal Risk Unit may rely on the Asset Management Coordinator to serve in this role. Examples showing how risk roles and responsibilities have been assigned are provided in Figures 6.8 and 6.9.
The success of a TAM Improvement Plan that outlines steps the agency plans to take to enhance its asset management program will also benefit from a clear set of roles and responsibilities for:
- Implementing the suggested changes,
- Monitoring progress, and
- Repeating the assessment periodically.
Ownership for the implementation of the planned enhancements generally lies with the TAM Coordinator in an agency, with specific tasks assigned to one or more individuals with the specialized skills and capabilities that are needed.
A major function of the implementation leader is to ensure that all roles are understood and that the various assignments are being carried out as intended. This may require building buy-in among the team members, who are likely busy with other responsibilities. It is also important that the leader have the authority to hold individuals accountable for progress, even if they report to a different division within the agency.
Figure 6.8 Risk Types and Owners
Source: Transportation Research Board. 2014. Managing Risk Across the Enterprise: A Guide for State DOTs.
The availability of adequate resources is also important to the successful implementation of an improvement plan. Establishing clear role descriptions that define the tasks to be completed and the skills needed to implement the changes enables an agency to compare the availability of existing staff to the implementation requirements. In some instances, staff may be temporarily assigned responsibility for a particular activity, such as developing a TAMP, to address a specific need.
Figure 6.9 Risk Management Roles and Responsibilities for the Highways Agency, England
Source: Washington State DOT. 2018. Project Risk Management Guide.
Using a RACI Matrix to Assign Roles and Responsibilities
A variety of tools can be used to track roles and responsibilities, including spreadsheets or various types of matrices. One form of responsibility assignment matrix is known as a RACI matrix. The acronym RACI stands for:
- Responsible. Assigning responsibility for getting the work done or making a needed decision; this is typically the person who performs the work.
- Accountable. Identifying the person who ensures the work is done and is ultimately answerable for the activity or decision.
- Consulted. Recognizing that others will provide information needed to complete an activity.
- Informed. Keeping people aware of progress that is made.
A RACI matrix can be used for virtually any type of activity with a combination of tasks, milestones, and key decisions that will be carried out by several different individuals. It is a common technique used for managing different types of construction, implementation, and monitoring activities and is especially useful when responsibilities are divided across divisions or departments within an organization. For that reason, it is commonly used as part of an enterprise-wide risk management program to help ensure that risks are monitored regularly. An example of a RACI matrix showing responsibilities for adopting an enterprise risk management (ERM) policy is shown in Figure 6.10.
Figure 6.10 Example RACI Chart
Source: AASHTO. 2016. Managing Risk Across the Enterprise: A Guide for State DOTs.
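A RACI assignment can also be represented as a small data table, which makes it easy to check a standard RACI rule: every activity should have exactly one Accountable party. The activities, parties, and assignments below are illustrative:

```python
# Activities, parties, and role assignments are illustrative placeholders.
raci = {
    "Adopt ERM policy": {"CEO": "A", "Risk Officer": "R",
                         "Division Heads": "C", "Staff": "I"},
    "Maintain risk register": {"Risk Officer": "A",
                               "Program Managers": "R", "CEO": "I"},
}

def activities_missing_single_accountable(matrix):
    """Return activities that do not have exactly one Accountable ('A') party."""
    return [activity for activity, roles in matrix.items()
            if list(roles.values()).count("A") != 1]

print(activities_missing_single_accountable(raci))  # empty list: both pass
```

The same structure extends to variants like the City of Seattle's added "S" (support) role, since the check only looks at "A" entries.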
City of Seattle DOT
The City of Seattle has a Sidewalk Safety Repair Program to oversee the maintenance of the City’s many sidewalks and curbs to keep them safe and accessible. The Program includes a process for monitoring sidewalk conditions, investigating complaints of unsafe or inaccessible sidewalks, determining repair responsibility (e.g., adjacent property owner, City, or other utility), proactively mitigating hazardous conditions through interim measures (beveling and asphalt shimming), and permanently repairing sidewalks that are the City’s responsibility. Repairs are leveraged with other capital projects as much as possible, so coordination with other Divisions is vital to the effectiveness of the program.
Because of the number of Divisions involved in managing sidewalks, the City assigned roles and responsibilities in a RACI matrix that identifies those with Responsibility (R) or Accountability (A), those that need to be Consulted (C), and those that need to be Informed (I). The RACI matrix developed by the City includes one additional role beyond the four commonly used: an “S” representing a support role for personnel who may provide information to the process but are not necessarily responsible for completing the activity. The RACI matrix has served the City well by clarifying the responsibilities of each of the Divisions involved in some aspect of the Program so the program appears seamless to the public, as shown on the City’s website (https://www.seattle.gov/transportation/projects-and-programs/programs/maintenance-and-paving/sidewalk-repair-program).
Excerpt From a RACI Matrix Developed by the City of Seattle for Managing Roles and Responsibilities for its Sidewalk Repair Program
Source: City of Seattle. 2019.
TAM Data and Systems
Organizations often maintain data on inventory, condition, and needs for individual asset classes in separate, self-contained systems. Increasingly, however, it is necessary to integrate asset and related data distributed across multiple systems to support decision-making.
As discussed in Chapter 6, there are several different types of information needed for TAM decision making. These include:
- Asset inventory and design information including location, type, quantity, material, and design details. This also includes summary level information about the asset as a whole as well as information about individual asset components (e.g. different pavement layers or bridge elements). It may also include asset valuation information (calculated based on deteriorated replacement cost, historic cost, or fair market value).
- Asset condition and performance information including results of visual inspections, measured condition (such as roughness or cracking for pavements), and computed measures of performance (such as remaining service life or “deficient” status designation). This also includes aggregated network level measures (such as the percentage of pavement in good condition).
- Contextual information such as system or network characteristics, functional classification, highway geometric characteristics, traffic volumes, congestion and reliability, crash history, adjacent land uses, weather and features of the natural environment. This information is helpful for understanding factors that may impact the asset service requirements or goals, physical deterioration, funding eligibility, and/or project needs and constraints.
- Work information including date, cost and scopes of work proposed, scheduled and completed on assets – including installation, replacement/reconstruction, rehabilitation, preservation and maintenance. When projects include multiple assets, it is valuable to itemize the work performed by asset.
- Revenue and funding allocation information including historical and forecasted funds available for asset installation, replacement/reconstruction, rehabilitation, preservation and maintenance – by source; and historical allocations by asset category and work type.
- Analysis information including forecasted condition and needs under varying funding or program scenarios, treatment life or life extension results, or project prioritization ratings or rankings.
Agencies store and manage TAM-related data within several different information systems:
- Asset Management Systems (AMS) – this includes pavement management systems (PMS), bridge management systems (BMS), management systems for other specific asset classes (e.g., sign or signal management systems), and systems used to manage information for multiple asset classes. All of these systems are used to store inventory and inspection data and track work performed on an inventory of assets. They also typically include contextual information needed for modeling and analysis, such as traffic, functional classification, number of lanes, and presence of a median. More advanced management systems may identify and forecast preservation, rehabilitation, or replacement needs and analyze funding scenarios. However, agencies often use multiple systems for these purposes, with separate systems for maintaining the asset inventory and predicting future conditions. Pavement and bridge management systems are typically used as the sources for federal Highway Performance Monitoring System (HPMS) and National Bridge Inventory (NBI) reporting.
- Maintenance Management Systems (MMS) – used to plan and track routine maintenance activities. These systems typically store information about planned and completed maintenance activities and resources (labor, materials, equipment) consumed. MMS may include customer work requests, work orders, and maintenance level of service (LOS) information. Some MMS do not store any asset inventory data. In such cases, work is tracked by maintenance activity category and route section rather than specific asset. Note that there are many commercial Asset Management Systems that provide full functionality for asset inventory, inspection/condition assessment, work planning, and work tracking.
- Program and Project Management Systems (PPMS) – used to manage information about capital and major maintenance projects from initial planning and programming through completion. There may be separate systems for managing programming/funding information, preconstruction/design information and construction phase information. Some agencies integrate data from these various systems to obtain a single source of project information. Project information typically includes a mix of tabular data as well as unstructured data (for example, documents and images). Unstructured data may be managed within an engineering content management system separately from other data.
- Financial Management Systems (FMS) – used to manage and track revenues, expenditures, budgets, grants, payments, receipts, and other financial information. These systems are often supplemented with special purpose tools supporting budgeting, revenue forecasting and analysis.
- Enterprise Resource Planning Systems (ERP) – incorporate features of financial systems as well as a wide variety of other modules for functions including human resources, payroll, purchasing, maintenance management, inventory management, equipment management, project programming, project financial management, and revenue forecasting.
- Highway Inventory Systems (HIS) – used to store and report administrative and physical characteristics of roads and highways. Federal Highway Performance Monitoring System (HPMS) requirements and the Model Minimum Inventory of Roadway Elements (MIRE) define standard road inventory elements; some DOTs maintain additional elements. HPMS elements include pavement type, pavement condition (roughness, cracking, rutting, and faulting), and structure type. These systems may include Linear Referencing System (LRS) management capabilities or may be integrated with a separate LRS management system. Per FHWA’s All Roads Network of Linear Referenced Data (ARNOLD) requirements, state DOTs must submit an LRS for all public roads to FHWA, linked to their HPMS data.
- Crash Data Systems (CDS) – used to store and report data about collisions and resulting injuries and fatalities. When combined with traffic data and road inventory data, this information helps identify traffic and safety asset needs.
- Traffic Monitoring Systems (TMS) – used to store and report traffic data, required for federal reporting and used for a wide variety of purposes, including TAM processes for asset deterioration modeling, treatment selection and prioritization.
- Engineering Design Systems (EDS) – used to create design drawings or models including design details for different assets. As agencies adopt 3D object-based design modeling practices, there are opportunities to share information about assets between design models and other asset data systems used across the life cycle.
- Enterprise Geographic Information Systems (GIS) – used to manage spatial information, including asset location. Assets may be represented as point, linear or polygon features; location may be specified based on coordinates and/or based on a linear referencing system (LRS). Asset features maintained within GIS may be linked to asset information within other systems.
- Imagery Databases (ID) – used to store highway video imagery and mobile LiDAR data that can be used for manual or semi-automated extraction of asset inventory.
- Data Warehouses/Business Intelligence Systems (DW/BI) – used to integrate data from source systems for reporting and analysis. These may be tailored for TAM decision support.
- Other – there may be other specialized decision support tools that produce analysis results – for example, tools for life cycle cost analysis, cross-asset optimization, or project prioritization.
Table 7.1 provides an overview of different systems with the types of information they typically contain. Note that this may vary within each agency.
Table 7.1 – TAM Data and Systems Overview (example)
|System|Asset Inventory, Condition, and Performance|Contextual|Asset Work Information|Revenue and Funding Allocations|Analysis Results|
|---|---|---|---|---|---|
|Asset Management Systems|✓|✓|✓||✓|
|Maintenance Management Systems|||✓|||
|Program and Project Management Systems|||✓|✓||
|Road Inventory Systems/HPMS|✓|✓||||
|Traffic Monitoring Systems||✓||||
|Engineering Design Systems|✓|||||
|Enterprise GIS Databases|✓|✓||||
Common components included in computer-based asset management information systems are shown in Figure 7.1. Network inventory, network definition (e.g., location), and asset condition information serve as the primary components in a database, which may or may not be external to the management system. Agency-configured models are used to predict changes in asset condition over time and to determine what treatments are appropriate as the assets age and deteriorate. These models may be developed and updated based on historical condition and cost data.
When developing a computer-based model, an objective (performance, condition, financial, or risk) must be defined so the model can evaluate candidate strategies against these criteria and select optimal ones. Metrics such as benefit/cost ratio, risk, condition, and treatment cost are often used.
A typical pavement management system performs some type of benefit/cost analysis that determines the performance benefits (typically in terms of improved condition) and the costs associated with each possible combination of treatment and application timing. By selecting the projects and treatments with the highest benefit/cost ratios, an agency can demonstrate that it is maximizing the return on its investment.
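As a sketch of this screening step, assume each candidate (a treatment applied to a section in a given year) has already been assigned a benefit (for example, the area between the do-nothing and post-treatment condition curves) and a cost; the system then ranks candidates by benefit/cost ratio. The candidates and numbers below are illustrative:

```python
# Candidate treatment-and-timing options with precomputed benefit and cost
# (both in arbitrary, consistent units; all values illustrative).
candidates = [
    {"treatment": "Thin overlay in year 2", "benefit": 180.0, "cost": 60.0},
    {"treatment": "Thin overlay in year 5", "benefit": 120.0, "cost": 55.0},
    {"treatment": "Reconstruction in year 2", "benefit": 400.0, "cost": 350.0},
]

# Rank by benefit/cost ratio and select the best-returning candidate.
for c in candidates:
    c["bc_ratio"] = c["benefit"] / c["cost"]

best = max(candidates, key=lambda c: c["bc_ratio"])
print("highest benefit/cost:", best["treatment"], round(best["bc_ratio"], 2))
```

In a real system this ranking runs across every section in the network, with selections constrained by the available budget rather than taken one at a time.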
Bridge management systems more typically rely on optimization, performing either a single-objective analysis, such as minimizing life cycle costs or maximizing condition, or a multi-objective analysis that considers factors such as condition, life cycle cost, risk of failure, and mobility. Project- and/or network-level benefit/cost analyses are used in a bridge management system to explore all feasible treatment options over an analysis period and determine the most cost-effective set of treatments with the highest benefits to the network.
Figure 7.2 shows an example of how the different systems listed in Table 7.1 might be integrated, adapted from the approach used by a U.S. state DOT.
Integrated views of asset information enable insights that lead to better decisions. Information produced by one part of the agency can support decision making across the agency.
Linking information across different systems enables agencies to quickly answer important questions that might have taken hours of staff time without integrated data. Integrating data opens up access to previously siloed data sets across the organization. It allows an agency to reduce duplicative effort, achieve efficiencies and derive greater value from its data. Some questions that rely on integrated data are:
Investments and Accomplishments
- What have we spent over the past ten years on route X in county Y (across all assets and including both maintenance and rehabilitation)?
- What percentage of deficient pavements will be addressed by our current capital and major maintenance programs?
Work Costing and Scoping
- What does it cost us to restripe a mile of pavement markings in each district?
- What locations identified along the linear referencing system (LRS) are planned for next year?
- Do the costs estimated by our pavement management system match what we are actually seeing in our projects?
- If we upgrade our guardrails whenever we do a paving project, how long will it take, and what will it cost to eliminate the current backlog?
- How can we best plan our projects to address multiple needs that may exist along a corridor?
- How many years does our standard mill and fill pavement treatment last for roads in different traffic volume categories?
Tradeoffs and Prioritization
- How should we prioritize our asset replacement/rehabilitation projects, considering not only life cycle management strategies but also stormwater management, safety, congestion, non-motorized, transit and ADA needs?
- How should we allocate our available funds across multiple asset types?
- What assets were on route X in county Y prior to the storm? What will it cost to replace them?
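Questions like the first one above ("what have we spent on route X in county Y?") become a single query once maintenance and capital work are linked to common route and county keys. A minimal sketch using an in-memory SQLite database with an invented schema and figures:

```python
# Sketch of the kind of query integrated data enables: total spending on
# a route in a county, across maintenance and capital work. The schema,
# route/county names, and dollar figures are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE work (route TEXT, county TEXT, year INT, category TEXT, cost REAL);
INSERT INTO work VALUES
  ('X', 'Y', 2015, 'maintenance',    120000),
  ('X', 'Y', 2018, 'rehabilitation', 950000),
  ('X', 'Z', 2018, 'maintenance',     80000),
  ('X', 'Y', 2021, 'maintenance',    140000);
""")

total = con.execute(
    "SELECT SUM(cost) FROM work "
    "WHERE route = ? AND county = ? AND year >= ?",
    ("X", "Y", 2012),
).fetchone()[0]
print(total)  # total spending on route X in county Y since 2012
```

Without integration, assembling the same answer might mean manually reconciling records from separate maintenance and project systems.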
An integrated approach to asset data collection, management and reporting not only makes it easier to answer these questions; it also can reduce costs. Opportunities for achieving efficiencies include:
- Using a single application to manage information about multiple assets.
- Using Data Warehouse/BI and GIS tools to provide reporting and mapping functions rather than investing effort to develop these capabilities within individual asset management systems.
- Gathering data on multiple assets through the same approach – using mobile technology, video imagery and/or LiDAR (see section 7.2)
- Sharing asset data across the life cycle – for example, automating methods for extracting asset data from design plans to update asset inventories (described further below).
Emerging technologies and new data sources are making an integrated approach to asset data management even more important. For instance, there are increasingly opportunities to use data collected from cell phones and connected vehicles that may cut across many asset categories. Also, there has been and will likely continue to be rapid advancement in machine learning techniques, such as for extracting asset data from video imagery or predicting optimal maintenance interventions given a wide array of data. Using these techniques typically requires establishing large, integrated data sets.
In addition, advances in computer-aided design and engineering software are making it possible to integrate asset data across the life cycle and achieve efficiencies and cost savings in maintaining asset inventories. See the discussion in section 7.1.4 below.
Ohio DOT has separate pavement and structures management systems, but integrates both asset and project information within its Transportation Information Management System (TIMS). A separate Transportation Asset Management Decision Support Tool (TAM-DST) allows users to combine data from TIMS with other state-maintained data sets to perform analysis and reporting. The application lets users consume large quantities of data quickly to make better planning decisions. See practice examples in Section 2.2.4 and Section 6.2.1 for more information on TIMS.
Planning for TAM Information Integration
There are different levels of integration. In the short term, agencies can integrate the information they already have. In the longer term, agencies can modify and consolidate their information systems. Integrating data for TAM should be approached systematically to ensure agencies achieve a solution that meets their needs and is ultimately sustainable.
Step 1: Establish Requirements
What is the purpose of the integration? To create a publicly available map showing asset conditions and projects for both internal and public use? To create a BI environment for answering a range of questions about asset performance and cost? To integrate asset data across different systems used for planning, design, construction, and maintenance?
Based on the identified needs, determine what data will be integrated and at what frequency. Consider whether this will require historical data, current data, future projections, or a combination.
Early collaboration between business units and information technology units is important to establish a shared understanding of both business needs as well as technical requirements and constraints. A strong business-IT partnership is essential for successful information integration initiatives.
Step 2: Identify and Evaluate Data Sources
Identify the available data sources to meet requirements. Determine where the data reside, and in what form – such as engineering design systems, relational databases, spreadsheets, document repositories, etc. Assess the current level of data quality to make sure that the source is ready for integration, based on discussions with the data steward or through examining the data. For design files/models a key quality consideration is whether established standards have been consistently applied. It is also important to determine the level of spatial and temporal granularity – that is, what each record represents (such as a pavement condition observation for a 0.1 mile section in April 2019, or a paving project on a 1.5 mile section due to open to traffic sometime in 2020).
Step 3: Analyze Linkages
Identify how different data sources will be linked. Spatial linkages are a good place to begin. If GPS coordinates are used, make sure that the Coordinate Reference System (CRS) used is documented, along with positional accuracy. If a Linear Reference was used, determine what method was used to establish the measure along the route, and what version of the agency’s LRS was used to establish route identifiers and reference points. Find out if the linear reference has been updated to reflect changes in the LRS since the data were last collected (if applicable).
Identify other types of (non-spatial) linkages that may be needed to join different data sources – for example, project numbers, account codes, work order numbers, etc. Agencies may want to profile the data for these elements to understand variations in coding and formats.
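Profiling these non-spatial link fields often reveals variant codings that must be normalized before a join will work. A small sketch, with invented project number formats and figures, of canonicalizing keys before matching records from two sources:

```python
# Joining on project numbers often requires normalizing variant codings
# (e.g. "PRJ-00123" in one system vs "123" in another). The formats,
# project description, and cost below are hypothetical.
import re

def normalize_project_no(raw: str) -> str:
    """Strip non-digit characters and leading zeros to a canonical key."""
    digits = re.sub(r"\D", "", raw)
    return digits.lstrip("0") or "0"

# Two sources coding the same project differently:
finance = {normalize_project_no("PRJ-00123"): 450000}
design = {"123": "Route X resurfacing"}

key = normalize_project_no("123")
print(design[key], finance[key])
```

Profiling the data first (listing distinct formats per source) confirms whether a simple rule like this covers all the variants, or whether a lookup table is needed.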
Step 4: Design Data Flows and Select Technology Solutions
Based on the requirements, available data, the linkage analysis and the tools and resources available within the agency, design how the data will flow from sources to target systems, and select the technology solutions to be used for performing the integration itself. The target system might be a general purpose enterprise geodatabase, an enterprise asset management database, a BI tool reporting data source, data warehouse, or a data lake. Data Extract-Transform-Load tools are available from data warehouse vendors; simple integration tasks may be accomplished through scripting. There are also a variety of specialized tools available for transforming and combining spatial data, and for extracting data from CAD/3D models.
Step 5: Design and Implement Integration Methods
Develop the technical approach for transforming link fields so that they are consistent across databases and if applicable, joining the different data sets and combining common data elements from the different sources. This may involve spatial processing (such as dynamic segmentation), aggregation, coding conversions, and other transformations.
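Dynamic segmentation, mentioned above, overlays event tables that reference the same route with different milepost ranges. The sketch below intersects a condition table with a project table to produce new segments carrying attributes from both sources; route names and values are invented:

```python
# Dynamic segmentation sketch: intersect two linear event tables keyed on
# the same route to find overlapping milepost ranges. The route, milepost
# values, and attributes are hypothetical.

# (route, from_milepost, to_milepost, attribute)
condition = [("RT-1", 0.0, 2.0, "good"), ("RT-1", 2.0, 5.0, "poor")]
projects = [("RT-1", 1.5, 3.5, "2024 resurfacing")]

def overlay(events_a, events_b):
    """Return (route, from_mp, to_mp, attr_a, attr_b) for overlapping spans."""
    out = []
    for ra, fa, ta, va in events_a:
        for rb, fb, tb, vb in events_b:
            if ra == rb and fa < tb and fb < ta:
                out.append((ra, max(fa, fb), min(ta, tb), va, vb))
    return out

for row in overlay(condition, projects):
    print(row)
```

GIS platforms provide production-grade versions of this operation, but the core logic is the same: matching routes, then intersecting measure ranges.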
Short term integration strategies include:
- Creating GIS data layers and making them accessible in available web and desktop-based GIS software. This strategy requires that each data source uses compatible spatial referencing, or can be converted to a common referencing system.
- Creating a database or view combining data from various source systems, and using available BI/Reporting tools to create reports and data visualizations. This strategy requires identifying common “dimensions” across source systems and/or normalizing data so that it can be summarized. For example, if the agency wants to report asset quantities by district or county by year, it will be necessary to make sure that each source has these data elements and that the data can be converted to a consistent set of values.
- Exposing data from authoritative sources as services via Application Programming Interfaces (APIs).
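The "common dimensions" point above can be illustrated with a small normalization step: mapping each source system's district codes to a shared set of values so quantities can be summarized together. Codes and quantities are invented:

```python
# Normalizing a shared "district" dimension across two source systems so
# asset quantities can be summarized consistently. Codes and lane-mile
# quantities are illustrative only.

district_map = {"D01": "District 1", "DIST-1": "District 1",
                "D02": "District 2", "DIST-2": "District 2"}

maintenance = [("D01", 120), ("D02", 80)]   # (district code, lane-miles)
capital = [("DIST-1", 40), ("DIST-2", 25)]

totals = {}
for code, qty in maintenance + capital:
    district = district_map[code]
    totals[district] = totals.get(district, 0) + qty
print(totals)
```

In a BI environment this mapping would typically live in a conformed dimension table rather than in code, but the reconciliation step is the same.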
In the longer term, agencies can consider re-architecting or consolidating their systems so that they work better together. A logical way to approach this is to document the “as is” situation and then map out a “to be” architecture. This will allow the agency to chart a path from the current state to the desired future state. It will also provide a framework for capturing requirements for any new systems that are brought into the agency.
Integrated asset management systems are not a new concept and there are several commercial systems that support information management and work planning for multiple asset types. However, some agencies are challenged to integrate information about major assets (pavements and bridges) with information about various other ancillary assets – given that approaches to planning and budgeting for major assets are more sophisticated and require a greater level of detailed information and analysis. Also, it can be a challenge to integrate information about operations and maintenance with capital projects given differences in how work is categorized, performed, and tracked.
KDOT’s architecture was based on a value chain model that represents the agency’s business components and relationships. It included a set of “context diagrams” showing information flow across systems and actors for major subject areas including highway asset systems, long range planning, pre-construction, construction and maintenance. While an architecture does require significant effort to create and maintain, it provides a more global and stable view of business processes and information needs than what would be produced through a piecemeal, incremental approach to system upgrades and replacements. This view can be used to plan the path from the existing set of systems to a more efficient and integrated set.
Kansas DOT Value Chain Framework
Source: Adapted from Kansas Department of Transportation. 2003. Enterprise IT Architecture
Integrating Asset Information Across the Life Cycle
As assets are designed, created, maintained, restored, and replaced, different systems are typically used to keep records of asset characteristics, conditions and work. Ideally, information created at one stage of the asset life cycle is made available for use at the next stage. Techniques, tools and processes are available to manage data for an asset over its entire life cycle from construction or acquisition to disposal.
Integrating information across the transportation infrastructure life cycle is an area of significant interest in the transportation industry. Several terms have been used to describe the collection of processes, standards and technologies for accomplishing such integration – including Civil Integrated Management (CIM) and Building Information Modeling (BIM) for Infrastructure. In 2019 ISO issued its first BIM standard, ISO Standard 19650. This builds on an earlier standard published by the British Standards Institute (BSI).
Traditionally, information created at one phase of the life cycle is archived and not made available to downstream processes. There are substantial opportunities for cost savings by using a shared, electronic model of the infrastructure, defining information needs at each life cycle phase, and establishing procedures for information handoffs across the life cycle. For example, information about assets included in a construction project can be compiled during design and linked to the model representations of the assets. This information can be confirmed and corrected during construction and made available to asset management systems when the project is completed and turned over to maintenance and operation.
Such integration can reduce duplicative data collection efforts, and speed the time required to make decisions and perform work. Implementing these techniques requires much more than adoption of technology supporting 2D and 3D models. A commitment to common standards and processes is needed. Recognizing that this scale of change takes time, maturity models and levels of implementation have been defined to guide agencies in developing roadmaps for enhancing life cycle information integration over time. See the references at the end of this chapter for further information.
Figure 7.3 Integrated Workflow Model for Sharing Information Across the Life Cycle Components
Transportation Research Board. 2016. Civil Integrated Management (CIM) for Departments of Transportation, Volume 1: Guidebook. https://www.nap.edu/read/23697/chapter/5#16
Crossrail is a major design-build project to construct a new railway line across central London (UK). It includes 42 km of track and 10 new underground stations. Project construction began in 2009. The project is being delivered by Crossrail Limited (CRL), currently a wholly owned subsidiary of Transport for London (TfL). Once the project is complete it will be operated by TfL as the Elizabeth Line. The Crossrail project provides a good example of the application of several BIM elements. Early on, CRL established the following objective:
To set a world-class standard in creating and managing data for constructing, operating and maintaining railways by:
- Exploiting the use of BIM by Crossrail, contractors and suppliers
- Adoption of Crossrail information into future infrastructure management (IM) and operator systems
CRL established a Common Data Environment (CDE) with integrated information about the project and the assets it includes. This environment included CAD models, separate linked databases containing asset details, GIS data, and specialized applications for scheduling, risk management and cost management. Data warehousing techniques were used to combine and display integrated information. Considerable work went into defining asset data requirements and setting up standard, well documented data structures and workflows to provide an orderly flow of information from design through construction, and on to maintenance and operation. It was essential to create a common information architecture given that work on each of Crossrail’s nine stations was conducted by different teams, each consisting of multiple contractors. Each station comprised over 15,000 individual assets.
Key elements of the approach included:
- A common asset information database with standard templates for deliverables. This database serves as the “master data source from which playlists of information can be created.”
- An asset breakdown structure (ABS) that relates facilities (e.g. stations) to functional units (e.g. retaining walls) to individual assets (e.g. steel piles).
- Asset naming, identification and labeling standards that distinguish functional duty requirements (e.g. a pump is needed here) from specific equipment in place fulfilling these requirements.
- Asset data dictionary definition documents (AD4s) that lay out the specific attributes to be associated with different types of assets, based on the ABS.
- Sourcing of the asset data from design and as-built information.
- A Project Information Handover Procedure specifying the methods of data and information handover for maintenance and operations once the construction has been completed.
- Use of a common projected coordinate system for CAD and GIS data
- Use of a federated data model in which information was maintained within separate special purpose systems, with a common master data model enabling sharing and interpretation of data from the different sources. The master model included elements such as time periods, budget and schedule versions, organizations, data owners, contractors, milestones and key events.
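An asset breakdown structure of the kind described above can be pictured as a simple hierarchy: facilities contain functional units, which contain individual assets. This is a minimal sketch in the spirit of the ABS, with invented names and tags:

```python
# Sketch of an asset breakdown structure (ABS): facilities contain
# functional units, which contain individual assets. All names and asset
# tags are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Asset:
    tag: str
    asset_type: str

@dataclass
class FunctionalUnit:
    name: str
    assets: list = field(default_factory=list)

@dataclass
class Facility:
    name: str
    units: list = field(default_factory=list)

    def asset_count(self) -> int:
        return sum(len(u.assets) for u in self.units)

station = Facility("Example Station", [
    FunctionalUnit("Retaining walls", [Asset("RW-001", "steel pile"),
                                       Asset("RW-002", "steel pile")]),
    FunctionalUnit("Drainage", [Asset("PU-001", "pump")]),
])
print(station.asset_count())
```

In a real CDE this hierarchy would be stored in the asset information database and paired with a data dictionary defining the attributes required at each level.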
BIM Lifecycle Information Management
Source: Adapted from Crossrail. 2016. Building A Spatial Data Infrastructure For Crossrail. https://learninglegacy.crossrail.co.uk/documents/building-a-spatial-infrastructure-for-crossrail/
Deciding What Data to Collect
Many organizations have recognized that data should be viewed as an asset. Before acquiring new data, it is important to establish a clear statement of how the data will be used and what value it is expected to provide.
Deciding what data to collect involves identifying information needs, estimating the full costs of obtaining and managing new data and keeping it up to date, and then determining whether the cost is justified. Just as agencies don’t have unlimited resources to repair and replace their assets, there are also limitations on resources for data collection and management.
A 2007 World Bank Study summarized three guiding principles for deciding what data to collect:
- Collect only the data you need;
- Collect data to the lowest level of detail sufficient to make appropriate decisions; and
- Collect data only when they are needed.
Chapter 6 can be used to help identify the information needed to track the state of the assets and investments to maintain and improve them. The basic questions one needs to answer to identify needed data are:
- What decisions do we need to make and what questions do we need to answer that require asset data? Typically, an organization needs to be able to answer questions including but not limited to its asset inventory, the conditions and performance of the inventory, and how resources are being spent on its assets. Also, an organization needs to determine what work is needed and how much that work will cost.
- What specific data items are required or desired? Next, one must identify the data required to meet the established information needs. There may be other data items that are not strictly required, but that may be useful if collected in conjunction with the required data. For instance, to answer questions and make decisions regarding pavement, an organization would typically want an inventory of existing pavement, details on paving materials used, and details on current conditions. Additional information on treatment history or substructure conditions might not be strictly required, but if available could enhance the decision-making process.
It is also important to incorporate standard data elements for location and asset identification into requirements, ensuring consistency with other asset data in the agency.
- What value will each data item provide? It is important to distinguish “nice to have” items from those that will clearly add significant value. The cost of collecting and maintaining a data element should be compared with the potential cost savings from improved decisions to be made based on the element. Cost savings may be due to asset life extension, improved safety, reduced travel time, or internal agency efficiencies. In addition, proxy measures for information value can be considered such as the number and type of anticipated users, and the number and type of agency business processes to be impacted.
- What level of detail is required in the data? Level of detail is an issue for all assets, but particularly for linear assets such as pavement, where one may decide to capture data at any level of detail. For instance, to comply with Federal reporting requirements for pavement condition, a state must collect distress data at 1/10 mile intervals for one lane of a road (typically the outside lane in the predominant direction). For other applications it may be necessary to collect data for additional lanes, or at some other interval.
- What level of accuracy is needed? The degree of accuracy in the data may have a significant impact on the data collection cost and required update frequency. Ultimately the degree of accuracy required in the data is a function of how the data are used. For instance, for estimating the clearances under the bridge for the purpose of performing a bridge inspection it may be sufficient to estimate the clearance at lowest point to the nearest inch using video imagery. However, more accurate data may be required when routing an oversize vehicle or planning work for a bridge or a roadway underneath it. If a high degree of accuracy is not required it may be feasible to use sampling strategies to estimate overall conditions from data collected on a subset of assets.
- How often should data be updated? Is the data collection a one-time effort, or will the data need to be updated over time? If data will need to be updated should the updates occur annually, over a period of multiple years, or as work is performed on an asset?
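Where high accuracy is not required, the sampling strategy mentioned above can estimate network-wide condition at a fraction of the cost of a full inventory. A minimal sketch, with hypothetical sample figures, of estimating the share of assets in poor condition along with a rough 95% margin of error:

```python
# Estimating network condition from a random sample: share of a sign
# inventory in poor condition, with an approximate 95% margin of error
# (normal approximation). The sample counts are hypothetical.
import math

sample_size = 400
poor_in_sample = 60  # hypothetical field observations

p_hat = poor_in_sample / sample_size
margin = 1.96 * math.sqrt(p_hat * (1 - p_hat) / sample_size)
print(round(p_hat, 3), round(margin, 3))
```

Working this logic backwards (choosing a sample size that achieves a target margin) is how agencies size sampling programs against accuracy requirements.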
Table 7.2 below illustrates examples of data collection strategies that might address different information needs.
Table 7.2 – Example Data Collection Strategies

| Example Asset(s) | Type of Information | Example Decisions | Example Data Collection Strategies |
|---|---|---|---|
| Pavement Markings | Total asset quantity by type, district, and corridor or subnetwork | Budgeting for assets maintained cyclically | Estimation based on sampling; full inventory every 3-5 years with interim updates based on new asset installation |
| Roadside Signs | Inventory of individual assets – location and type | Work planning and scheduling for assets maintained cyclically | Full inventory every 3-5 years with interim updates based on new asset installation |
| Guardrail | Inventory + general condition (e.g. pass/fail or good-fair-poor) | Work planning and scheduling for assets maintained based on condition | Inventory and condition assessment every 2-3 years; inventory and continuous monitoring (e.g. from maintenance crews or automated detection) |
| Bridges | Inventory + detailed condition | Treatment optimization for major, long life cycle assets | Inventory and condition assessment every 1-2 years + continuous monitoring (e.g. strain gages on bridges) |
Once a general approach has been established, more detailed planning for what data elements to collect is needed. Prior to selecting data elements, identify the intended users and uses for the data, keeping in mind that there may be several different uses for a given data set. Identify some specific scenarios describing people who will use the information, and then validate these scenarios by involving internal stakeholders.
One common pitfall in identifying information needs is failing to distinguish requirements for network level and project level data. While advances in data collection technology make it feasible to collect highly detailed and accurate information, it is not generally cost-effective to gather and maintain the level of information required for project design for an entire network of assets.
A second pitfall is failing to consider the ongoing costs of updating data. The data update cycle can have a dramatic impact on data maintenance costs. Update cycles should be based both on business needs for data currency and how frequently information is likely to change. For example, asset inventory data is relatively static, but condition data may change on a year-to-year basis.
A third common pitfall is taking an asset-by-asset approach rather than a systems approach in planning for both asset data collection as well as downstream management of asset information.
Even when there is a strong business case for data collection, it is sometimes necessary to prioritize what data are collected given budget and staffing constraints. Some agencies do this by establishing different “tiers” of assets. For example:
- Tier 1: Assets with high replacement values and substantial potential cost savings from life cycle management (such as pavements and bridges)
- Tier 2: Assets that must be inventoried and assessed to meet legal obligations (such as ADA ramps, stormwater management features)
- Tier 3: Assets with high to moderate likelihood and consequences of failure (such as traffic signals, unstable slopes, high mast lighting and sign structures)
- Tier 4: Other assets that would benefit from a managed approach to budgeting and work planning (such as roadside signs, pipes and drains)
While updating data can be expensive, various strategies are available for combining data collection activities to reduce the incremental cost of collecting additional data. For instance, one approach to collecting data on traffic signal systems is to update the data when personnel perform routine maintenance work. Also, in some cases data can be extracted from a video log captured as part of the pavement data collection process.
Given limited resources for data collection, it may be helpful to formally assess the return on investment from data collection or prioritize competing data collection initiatives. A formal assessment may be of particular value when considering whether the additional benefits from collecting additional data using a new approach justify the data collection cost. NCHRP Report 866 details the steps for calculating the return on investment (ROI) from asset management system and process improvements, including asset data collection initiatives.
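One common form of such an assessment is a net present value screen: comparing the up-front cost of a data collection initiative against a discounted stream of expected benefits. The sketch below is a simplified illustration in the spirit of the NCHRP Report 866 approach; the dollar figures and discount rate are invented:

```python
# Simplified ROI screen for a data collection initiative: net present
# value of an up-front cost followed by annual benefits. The figures and
# 4% discount rate are illustrative assumptions, not from NCHRP 866.

def npv(cash_flows, rate):
    """Net present value of year-indexed cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

initial_cost = -250000   # year 0: stand up the new collection program
annual_benefit = 80000   # avoided costs and better decisions, per year
flows = [initial_cost] + [annual_benefit] * 5

value = npv(flows, rate=0.04)
print(round(value))  # a positive NPV supports the investment
```

The same calculation can be run for competing initiatives to rank them, or with pessimistic benefit estimates to test how robust the case is.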
How to Collect Data
As technology continues to advance there are more methods available for collecting data related to assets. It is important for agencies to understand the technology and options available for data collection. Depending on the asset type or data needed, different data collection approaches may be preferable. This section provides information on making that decision.
There are many different approaches to collecting asset and related data. Often a mix of approaches is used, including visual inspection, semi-automated and automated approaches. The technologies for data collection are advancing rapidly, allowing for increased use of semi-automated and automated approaches for collecting more accurate data at a lower cost. Examples of recent innovations include:
- Improvements in machine vision that allow extracting some forms of asset inventory data from video or LiDAR.
- Use of unmanned aerial vehicles (UAV, also called drones) for allowing bridge inspectors to obtain video of hard-to-reach areas of a bridge.
- Improvements in non-destructive evaluation (NDE), allowing for greater use of techniques such as ground penetrating radar (GPR) for pavement and bridge decks and instrumenting bridges to monitor performance over time.
- Improvements in hand-held devices allowing for increased field use, reducing cost and time of manual data collection.
Several of these technologies provide opportunities to save money by collecting data for multiple assets within a single collection effort. Table 7.3 provides a summary of potential data collection approaches for common roadway asset classes.
Table 7.3 – Example Data Collection Approaches
| Asset Class | Data Collection Method | Data Collected | Notes |
|---|---|---|---|
| Pavement | Visual inspection | Present Serviceability Index (PSI) | Often used in urban environments or for small networks where automated collection is impractical; can be supplemented by UAVs |
| Pavement | Automated data collection vehicle with laser scanning system | Roughness, cracking, rutting | Includes a range of 2D video and 3D laser-based systems. Many systems store video images and can capture additional measures, such as cross slope, gradient and curvature |
| Pavement | Light Detection and Ranging (LiDAR)/Terrestrial Laser Scanning (TLS) | Roughness, cracking, rutting | Provides a high resolution continuous pavement survey. Often inventory data for other assets can be extracted from the data set |
| Pavement | Falling weight deflectometer | Strength/deflection | |
| Pavement | Locked wheel tester/spin up tester | Skid resistance | |
| Pavement | Ground Penetrating Radar (GPR) | Layer thicknesses, detection of voids and crack depth | |
| Pavement | Coring | Layer thicknesses, detection of voids and crack depth | |
| Pavement | Smart phones | Potholes, roughness | Includes systems for reporting potholes and measuring roughness through crowdsourcing |
| Structures and Bridges | Sensors | Inventory, condition ratings | Strain and displacement gauges; wired or wireless |
| Structures and Bridges | Unmanned Aerial Vehicles (UAVs) | Condition of non-bridge structures (e.g. retaining walls) | |
| Structures and Bridges | LiDAR | Vertical clearance | |
| Structures and Bridges | Visual inspection | Inventory, condition ratings | Can be supplemented using UAV and other technologies |
| Structures and Bridges | Acoustical (e.g., impact echo) | Delamination, corrosion | |
| Structures and Bridges | Infrared/thermal imaging | Delamination, corrosion | |
| Structures and Bridges | GPR | Concrete deck condition | |
| Structures and Bridges | Half cell potential test | Concrete deck condition | |
| Traffic Signs | Videolog | Inventory, condition ratings | Automated or semi-automated techniques available for classification |
| Traffic Signs | Mobile LiDAR | Inventory, condition ratings | |
| Traffic Signs | Field inspection – mobile application | Inventory, condition ratings | |
Once data are collected, it is essential to put in place regular processes for updating the data. This can be accomplished through periodic data collection cycles, or through updating as part of asset project development and maintenance management processes.
Unmanned Aerial Vehicles (UAVs) offer several advantages for asset data collection. They can fly into confined spaces such as entrances to sewers and culverts to collect data and images. They can collect high resolution images, thermal images and LiDAR. LiDAR can be used to produce three dimensional images that allow for accurate measurements. Thermal images can be used to detect subsurface concrete deterioration.
Michigan DOT analyzed the benefits of using UAVs for bridge inspection, and concluded that using a UAV for a deck inspection of a highway bridge reduces personnel costs from $4,600 to $250. A traditional inspection would take a full day and require two inspectors, plus two traffic control staff to close two lanes of traffic. The same inspection using a UAV takes two hours and requires only a pilot and a spotter. An additional savings of $14,600 in user delay cost was estimated based on delays associated with shutting down one lane of a four lane, two-way highway bridge in a metropolitan area for a bridge inspection.
The Tennessee DOT uses an automated data collection van to collect pavement condition surveys each year in support of its pavement management system. In addition to the pavement sensors, the van has high definition cameras and LiDAR sensors that scan the roadway and create a 3D model of the environment. As the surveys are conducted, inventory information for approximately 20 highway assets is extracted from the photolog and LiDAR information. The inventory from the previous data collection cycle is compared to the data collected during the current cycle to identify any changes and keep the data up to date. Tennessee DOT summarizes this inventory data at the county level for planning and budgeting; however, the agency is working toward the ability to report maintenance work at the asset level in the future. Federal Highway Administration (FHWA). Pending publication 2019. Handbook for Including Ancillary Assets in Transportation Asset Management Programs. FHWA-HIF-19-068. Federal Highway Administration, Washington, D.C.
Preparing for Data Collection
In order to get the most out of the data collection process, it is important for agencies to be thoughtful in the steps leading up to the actual collection of data. Four important steps prepare an agency for data collection: coordinating with stakeholders, specifying exactly what data will be collected, contracting for services where collection is outsourced, and training staff to collect the data.
Once an organization has determined what data to collect and how best to collect it, the next step is to prepare for data collection.
Step 1. Coordinate
One important step prior to collecting data is to coordinate with other stakeholders in the organization concerning the data collection effort. Such coordination may identify opportunities to combine data collection activities and reduce costs. Alternatively, other stakeholders may identify related data that should be collected to address other needs, or a different business unit may have already collected data that impacts the data collection plan.
Step 2. Specify
In this step one must identify exactly what data will be collected, the means used to collect the data, and who will collect the data. If data collection is being outsourced, at this point it is necessary to establish contract specifications for data collection.
Also as part of this step one should establish the approach for quality assurance (QA)/quality control (QC). A QA/QC plan specifies the desired accuracy of the data to be collected, and describes the measures used to ensure data meet the specified level of accuracy, review data quality as data are acquired, and address any data quality issues that arise. If data are collected using automated means, the plan should specify the approach for calibrating any measurement devices used for data collection. If data are collected through visual inspection, the plan should detail training requirements.
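The automated portion of a QC review can often be scripted. The following is a minimal sketch of record-level checks a QA/QC plan might specify; the field names and rating range are illustrative, not drawn from any standard.

```python
# Minimal sketch of automated QC checks on collected condition data.
# Field names and the 0-9 rating range are illustrative assumptions.

def qc_check_record(record, valid_rating_range=(0, 9)):
    """Return a list of QC issues found in one asset condition record."""
    issues = []
    if record.get("asset_id") in (None, ""):
        issues.append("missing asset_id")
    rating = record.get("condition_rating")
    if rating is None:
        issues.append("missing condition_rating")
    elif not (valid_rating_range[0] <= rating <= valid_rating_range[1]):
        issues.append(f"condition_rating {rating} outside {valid_rating_range}")
    return issues

records = [
    {"asset_id": "BR-1042", "condition_rating": 7},   # passes
    {"asset_id": "", "condition_rating": 12},         # two issues
]
for r in records:
    print(r.get("asset_id") or "<blank>", qc_check_record(r))
```

Checks like these would run as data are acquired, flagging issues for resolution before the data are accepted.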
Note that QA/QC is an area of particular concern for pavement condition data collection, given the expense involved in collecting these data and the increased reliance on automated data collection techniques. The Federal performance management requirements described previously require State DOTs to establish a QA/QC plan for pavement data collection.
Step 3. Contract
This step involves determining whether to outsource data collection and contracting for services if applicable. Agencies typically outsource to tap into a vendor's specialized equipment and experience with a particular data collection technique, or to accomplish a major collection effort within a compressed timeframe that would not be possible using internal staff resources. Some agencies may implement a hybrid approach, hiring a contractor while using internal staff (or a separate independent contractor) for supervisory or QA functions.
Step 4. Train
The last step prior to collecting data is to train the staff involved in data collection and review on how data collection should be performed, as well as on their specific roles and responsibilities. Training is important for any data collection effort, but is particularly important where the effort relies on visual inspection (e.g., for bridge inspection). In these cases, the training requirements for inspectors should be carefully established and implemented. Even where there are no formal requirements for inspectors, it can be highly valuable to assemble inspectors prior to the start of data collection to review the data to be collected, walk through the data collection process, and perform inspections in a test scenario to ensure consistent interpretation of condition assessment language and other areas where differences in human judgment may impact how data are collected.
Once these steps have been performed the next step is to collect data, following the approach established in Step 2 for data collection and QA/QC.
Utah DOT started capturing LiDAR data for multiple assets in 2011. Several different business units within the agency provided funding for the effort, which has included collection of inventory data for bridges, walls, signs, signals, barriers, power poles, striping, curb cuts, drainage, shoulders and ATMS devices – as well as pavement condition and roadway geometrics. UDOT has leveraged this integrated pool of asset data for several different applications, including one which creates a draft cost estimate for asset installation for project scoping, based on existing inventory.
There are many different ways to share information about assets, condition, performance, needs, and work. Agencies can select multiple distribution channels to serve both internal and external users.
As with the design of reports and visualizations, designing a data sharing strategy should begin with an understanding of the different audiences for data and their needs. A variety of data sharing options can be employed. Table 7.5 outlines some of these options and suggests questions to consider in selecting an appropriate one.
It is helpful to establish guiding principles for data sharing in order to achieve a consistent agency approach that provides maximum benefits in a cost-effective manner. Possible principles include:
- By default, data should be shared unless it is sensitive, protected by law, or sharing it would pose unacceptable risks or cost burdens
- Self-service methods of data sharing should be used when there is a relatively large pool of data users and data limitations can be readily communicated via standard metadata
- Avoid proliferation of single purpose data sharing applications by adopting standard platforms where multiple data sets can be shared
- When it is necessary to share the same data set through multiple channels, the source data should be stored in a single location or a single data refresh process should be used to reflect updates
- The process of preparing data for sharing, reporting and visualization should be governed to ensure quality, ensure adequate documentation, and avoid inconsistency
Table 7.5 Data Sharing Options
|Data Sharing Option||Most appropriate for…||Considerations|
|On request||Internal or external data users||Use for uncommon, specialized requests requiring moderate to extensive effort to fulfill, or where there is high potential for information misinterpretation or misuse. For common information needs, use other methods to reduce staff time spent on fulfilling information requests.|
|Direct access to specialized asset management system (e.g. for pavement, bridges, culverts, etc.)||Asset and maintenance specialists in the central office and field offices||Helpful features include: ability to provide view-only privileges and ability to provide filtered views of information (e.g. restrict to a single district)|
|Direct access to enterprise asset management system (with information about multiple assets)||Agency staff; partner agency staff (e.g. MPOs, localities)||For partner agency access, the ability to provide access outside of the agency firewall is needed.|
|Enterprise GIS with spatial open data portal||Internal or external data users||It is best to design separate maps geared to specific user types. May want to separate internal and external portals or restrict some specialized maps for internal use.|
|General open data portal||Internal or external data users||Consider using available federal and state-level open data portals. May want to separate internal and external portals or restrict some specialized maps for internal use.|
|Data feeds/data services/Application Programming Interfaces (APIs)||Internal or external data users||Most suitable for real-time data sets, data sets that are frequently updated, and complex data sets where flexible querying options are needed.|
|Data warehouse/data mart||Agency staff||Use to create a cleansed and standardized data source for reporting/business intelligence. Particularly helpful when historical/time series data are required and direct access to data from source systems is problematic due to data quality, consistency, or performance concerns. Tabular data within the data warehouse can be joined with spatial data, as needed, within the enterprise GIS.|
|Data lake||Agency data analysts/data scientists||Use to provide access to a heterogeneous collection of data including “big data” and unstructured data for research, modeling and analysis.|
|Content management system||Agency staff and partners (e.g. contractors)||Use to provide access to a curated collection of content including engineering design drawings, asset maintenance manuals, contracts, etc.|
|Common data environment (CDE)||Agency staff and partners (e.g. contractors)||Use to provide a shared information repository for a construction project. CDEs typically include document management, collaboration and workflow features. CDE is one of the key elements of BIM practice defined by the UK’s Construction Industry Council.|
Washington, DC has established five levels of data classification. By default, data is considered to be open and sharable.
- Level 0. Open (the default classification)
- Level 1. Public, Not Proactively Released (due to potential litigation risk or administrative burden)
- Level 2. For District Government Use (exempt from the Freedom of Information Act but not confidential and of value within the agency)
- Level 3. Confidential (sensitive or restricted from disclosure)
- Level 4. Restricted Confidential (unauthorized disclosure can result in major damage or injury)
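A classification scheme like the District's can be encoded directly in software so that sharing decisions are applied consistently. The sketch below is hypothetical; the level names follow the list above, and the sharing rule is an illustrative assumption.

```python
from enum import IntEnum

class DataLevel(IntEnum):
    """The five classification levels listed above (Washington, DC model)."""
    OPEN = 0                      # the default classification
    PUBLIC_NOT_RELEASED = 1       # public, but not proactively released
    DISTRICT_USE = 2              # for government use; FOIA-exempt
    CONFIDENTIAL = 3              # sensitive or restricted from disclosure
    RESTRICTED_CONFIDENTIAL = 4   # disclosure could cause major damage

def is_publicly_sharable(level: DataLevel) -> bool:
    # Levels 0 and 1 are public records; Level 1 is simply not
    # published proactively. Levels 2+ are withheld from the public.
    return level <= DataLevel.PUBLIC_NOT_RELEASED

print(is_publicly_sharable(DataLevel.OPEN))          # True
print(is_publicly_sharable(DataLevel.CONFIDENTIAL))  # False
```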
Vermont Agency of Transportation (VTrans)
VTrans shares their data with the public through the VTransparency Public Information Portal. The goal of the portal is to “turn data into useful information for our customers” and to “create tools for getting answers to some of the questions we get most often”. The VTransparency Portal features different tools for viewing specific data. These tools include:
- Projects Map
- Road Conditions
- Plow Finder
- Weather Cams
- Maintenance Districts
- Crash Fatality Report
- Crash Query Tool
- Find a Project
- Daily Traffic
- Highway Closures
- Bridge Inspections
- Pavement Conditions
- Pavement Performance
- Maintenance Work
- Rail Asset Inventory
- Rail Bridge Inspections
- Rail Clearance
- Rail X-ing Inspections
The VTransparency Portal also links to the Vermont Open GeoData Portal. This provides GIS map layers related to the various tools for people interested in doing their own analysis of VTrans data. VTrans holds to the principle of making data available by default unless it is sensitive. The agency values transparency with the public and welcomes feedback on the tools they’ve developed. The VTransparency Portal can be accessed at https://vtrans.vermont.gov/vtransparency
Preparing Data for Sharing, Reporting and Visualization
Establishing a standard process to prepare data for sharing, reporting and visualization helps ensure that data are publication-ready: quality-checked, tested and documented.
A standard data preparation process should be used before moving data to any official reporting source – whether it is a data warehouse, a geodatabase, or a file uploaded to an open data portal.
A data preparation process might use the following checklist:
- Is the data derived from a designated authoritative source system?
- Have data quality checks been applied?
- Has metadata for the data set been prepared, including an explanation of the data source and the date of last update?
- Is an individual or business unit identified for data users to contact for further information?
- Is an individual or business unit identified for reporting database or system managers to contact regarding any issues that arise?
- Has metadata for the data elements included been prepared (data dictionary)?
- Has the metadata been reviewed for completeness and quality?
- Has a data owner or steward signed off on the data publication?
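The checklist above lends itself to automation as a gate before data are moved to an official reporting source. The following sketch is one hypothetical way to implement it; the field names are invented for illustration.

```python
# Hypothetical encoding of the publication-readiness checklist above,
# applied to a dataset's metadata before it moves to a reporting source.

REQUIRED_FIELDS = [
    "authoritative_source",   # derived from a designated source system
    "quality_checked",        # data quality checks applied
    "metadata",               # data source, date of last update
    "user_contact",           # contact for data users
    "admin_contact",          # contact for system managers
    "data_dictionary",        # element-level metadata
    "metadata_reviewed",      # completeness/quality review done
    "steward_signoff",        # data owner or steward approval
]

def publication_ready(dataset: dict) -> list:
    """Return the checklist items that are missing or not satisfied."""
    return [f for f in REQUIRED_FIELDS if not dataset.get(f)]

draft = {"authoritative_source": "PMS", "quality_checked": True}
print(publication_ready(draft))  # six items still outstanding
```

An empty result list would indicate the data set is ready for publication.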
“Data-driven decision making is an approach to business governance or operations which values decisions supported with verifiable data. The success of the data-driven approach is reliant upon the quality of the data gathered and the effectiveness of its analysis and interpretation.”
Fundamental Concepts and Principles
Data governance and management practices are essential for achieving reliable, consistent, integrated and accessible data that is of value for decision-making. Several definitions, concepts and principles are important to understand before embarking on a data governance initiative.
Data governance and data management are interrelated but distinct practices.
Data management includes activities such as data quality management, data documentation, metadata management, security and access controls, data integration, and data archiving.
Data governance is a policy making and oversight function for data management. Implementing data governance involves forming and chartering decision making bodies, defining roles and responsibilities, establishing policies that set expectations for behavior, and setting up standard processes for things like approving data standards, resolving data issues, and acquiring new types of data. Data governance is generally implemented in a hierarchical fashion, with an executive body at the top, a data council or board in the middle, and several more focused groups oriented around specific systems, business processes, organizational units or functions.
Data stewardship is closely related to data management and governance. It refers to established responsibilities and accountabilities for managing data. In general parlance, a steward is someone who is entrusted with the responsibility for taking care of someone else’s property. Similarly, a data steward is someone who takes care of data on behalf of their agency. Different types of stewardship roles can be defined and formalized within an agency data governance policy. Data stewardship can be viewed as the way to operationalize data governance policies, processes and standards.
Data governance can be implemented to:
- Improve quality and consistency of data
- Ensure coordination across different business units
- Maximize efficiency in data collection and management processes
- Enable data integration and shared solutions to make the most of available IT resources
- Ensure there is a solid business case for new data collection
- Ensure that data will be maintained once it is collected
Agencies may be motivated to establish a formal data governance function as they try to move from a siloed approach to collecting and managing data to one that is more coordinated and centralized.
For example, implementing a reporting system that takes data from multiple sources within the agency creates the need for standardization, documentation, and agreed-upon update cycles. It is important to get agreement on standard data definitions, formats and code lists from different business units to achieve consistency. It is also important to clarify who is responsible for fixing errors and the process for error correction in the event that errors occur.
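The code list problem described above can be made concrete with a small sketch: a governance-approved standard list, plus a mapping table that translates one business unit's local codes. The codes here are invented for illustration.

```python
# Sketch: enforcing an agreed-upon standard code list when merging data
# from multiple business units. All codes below are illustrative.

STANDARD_SURFACE_CODES = {"ASP": "Asphalt", "CON": "Concrete", "GRV": "Gravel"}

# A district system might use its own local codes; a mapping table agreed
# through data governance translates them to the standard list.
LOCAL_TO_STANDARD = {"A": "ASP", "C": "CON", "G": "GRV"}

def standardize(code: str) -> str:
    """Translate a local code to the standard list, rejecting unknowns."""
    std = LOCAL_TO_STANDARD.get(code, code)
    if std not in STANDARD_SURFACE_CODES:
        # An unrecognized code is a data issue to route to the responsible
        # business unit under the agreed error-correction process.
        raise ValueError(f"unrecognized surface code: {code!r}")
    return std

print(standardize("A"))    # 'ASP' (local code translated)
print(standardize("CON"))  # 'CON' (already standard)
```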
Data governance is a means to an end. It is important to clearly define and communicate why an agency needs to strengthen data governance: what is happening now that the agency may want to avoid (such as data duplication)? What is not happening now that the agency may want to achieve (such as standardized data)? The effort involved in putting data governance in place should not be underestimated, since it involves changes in how decisions are made and changes in behavior. A full scale agency data governance model can take years to mature. However, data governance can be rolled out incrementally to focus on short term objectives. It is a good idea to adopt a set of principles to provide the foundation for data governance policies and practices. The AASHTO Data Principles (see callout box) can be used as a model.
Florida Department of Transportation (FDOT) launched a statewide initiative to better manage and integrate agency data. This effort combines the resources, goals, and objectives of Florida’s Technology and Operation Divisions into the initiative known as ROADS, which stands for:
R—Reliable, accurate, authoritative, accessible data
O—Organized data that produces actionable information
A—Accurate governance-produced data
D—Data and technology integration
S—Shared agency data to perform cross-functional analysis
The agency has created processes, procedures, and guidelines so that all data (financial, safety, project, program, assets, etc.) are organized and accessible. Florida’s steering committee, known as RET (ROADS Executive Team), is led by the agency’s Chief of Transportation Technology and Civil Integrated Management Officer. The committee, which includes district secretaries, financial and planning executives, and operational directors, is charged with governance leadership and instituting processes that will change the culture of the agency by converting data to knowledge.
ROADS is being implemented incrementally, through a series of 6-month initiatives. One initiative related to asset management is to standardize inventory attributes for 120 different classes of infrastructure assets across the agency’s approximately 170 enterprise software applications. Part of this effort is to determine the specific authoritative source data to include in a new data warehouse, which will provide a single authoritative site for sharing accurate data.
Through the ROADS initiatives, Florida DOT has created a strategic direction for data integration covering data stewards, division responsibilities, asset inventory, business system integration, and an implementation roadmap. By coordinating its efforts, the agency is able to maximize the value of its data while streamlining processes for data collection, management, and dissemination.
Florida DOT Enterprise Information Management
Source: Florida DOT. 2019
Data Governance Practices Supporting TAM
Data governance practices can be implemented to support development of a valuable, reliable base of integrated information for TAM decision making.
A first step in data governance is to identify key decision points to be governed. These may include:
- Adopting common data definitions or standard code lists
- Adopting location referencing standards
- Adopting standard tools for field data collection
- Collecting new asset data to be included within an integrated asset management system
- Archiving or deleting existing data
- Modifying data elements for an existing TAM data source
- Adding new data layers to an enterprise GIS repository
- Adding new data marts to a data warehouse
- Adding new reports or controls to a BI environment
- Responding to an external request for data
It is best to take an incremental approach to setting up governance processes, starting with a few high impact areas that are aligned with what the agency is trying to achieve. For each of the selected decisions to be governed, think both about the criteria or guidelines to be followed as well as all the people who should be consulted or involved in making the decision.
- Criteria and Guidelines: Developing guidelines for key decisions is a good way to institutionalize practices that reflect the agency’s goals for data. For example, some agencies have established “readiness checklists” that need to be completed before data can be added to an enterprise repository. These ensure (among other things) that a data owner or point of contact has been identified, that necessary metadata is provided, that a refresh cycle has been specified, and that the authoritative source system of record has been identified.
- Decision Making Process: Consider who should be involved in each of these decisions – who is responsible for making technical recommendations, who should be consulted, who has approval authority, and who needs to be informed about the decision. Define a process for resolving issues and conflicts, and a process for granting exceptions to established standards.
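The role assignments described above follow the familiar responsible/accountable/consulted/informed (RACI) pattern, which can be recorded in a simple structure for each governed decision. The entries below are placeholders, not drawn from any particular agency.

```python
# Hypothetical RACI-style assignment for one governed decision,
# following the roles described above. All names are placeholders.

raci = {
    "decision": "Adopt standard code list for asset inventory",
    "responsible": "Data standards working group",   # makes technical recommendation
    "accountable": "Data governance board",          # has approval authority
    "consulted": ["District asset managers", "GIS unit"],
    "informed": ["All business units using the data"],
}

def summarize(entry: dict) -> str:
    """One-line summary of who recommends and who approves a decision."""
    return (f"{entry['decision']}: recommended by {entry['responsible']}, "
            f"approved by {entry['accountable']}")

print(summarize(raci))
```

Maintaining a table of such entries makes it easy to see, for any governed decision, who must act before it is final.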
Agency data governance bodies can be responsible for adopting both guidelines and process flows impacting decisions that impact multiple business functions. If there are no existing governance bodies or if decisions to be governed are specific to TAM, a separate TAM data governance group can be established.
Keep in mind that the function of governance bodies is to make decisions. Use technical advisory groups, working groups or communities of interest to do the collaborative work required to develop standards or make recommendations about changes to data and systems.
Ohio DOT has established a standard process for adding a new asset to their inventory. As illustrated in the flowchart below, the process has three stages – (1) Asset Overview, where the request is submitted, evaluated, and approved, (2) Requirements, in which business and technical requirements for collecting and managing the new data are documented, and (3) Application Development, where the technology solution is developed either in-house (using standard tools), via contract (for custom development) or through acquisition of a commercial off-the-shelf (COTS) package.
As part of the TAM Audit Group workflow shown in the figure, ODOT has introduced over 693,000 active ancillary assets into their inventory.
Ohio DOT TAM Audit Group Workflow Diagram
Source: Ohio DOT. 2019
Assessing Data Management and Governance Maturity
Data management and governance implementation can be viewed as a long term process of maturation. Several models and assessment tools are available to help agencies identify their current state, set goals for where they want to be, and create plans for moving up the maturity scale.
There are several different assessment tools tailored to DOT data programs that can be used or adapted as needed. In addition, several DOTs have created their own tools. Most of these tools are based on a maturity model.
A typical maturity model could include the following levels:
- Level 1-Initial
- Level 2-Repeatable processes
- Level 3-Defined and documented processes
- Level 4-Measured and managed processes
- Level 5-Optimizing processes (continuous improvement)
For TAM information and systems, maturity levels can be assigned to different aspects of data management and governance. Assessments can also be conducted at different levels of the organization – from the agency-wide level, to the level of individual information systems (or even data elements).
Table 7.6 shows the data management and information system-related assessment elements from the TAM Gap Analysis Tool, developed under NCHRP Project 08-90. Figure 7.3 illustrates the data assessment guidance created under NCHRP 08-92. This process is suitable for application either at the agency-wide level, for an individual data program, or for a business process. It goes into greater depth than the TAM Gap Analysis Tool.
Table 7.6 – TAM Gap Analysis Tool Assessment Elements
|Element||Sub-element||Sample Assessment Criteria|
|Data Management||Asset Inventory||
|Data Management||Asset Condition and Performance||
|Information Systems||System Technology and Integration||
Iowa DOT conducted a detailed data maturity assessment for over 180 data systems. Assessments were based on a standardized questionnaire administered to data stewards and custodians. The questions covered data quality, availability of metadata, whether a data retention plan was in place, the degree to which data collection was automated, and several other factors. Charts were produced showing maturity scores for each system, with roll-ups at the division level. This tool helps the agency track their progress over time and identify specific data improvements to pursue.
Sample Data Assessment Summary Radar Chart
Source: Iowa DOT. 2019
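A division-level roll-up like the one in the Iowa DOT example can be computed directly from per-system scores. The sketch below assumes 1-5 maturity scores per the levels listed earlier; the system names and scores are invented.

```python
# Sketch: rolling up per-system maturity scores (1-5, per the maturity
# levels above) to a division-level average. Data below is illustrative.

from statistics import mean

system_scores = {
    "Pavement management": {"division": "Maintenance", "maturity": 4},
    "Bridge management":   {"division": "Maintenance", "maturity": 3},
    "Sign inventory":      {"division": "Traffic",     "maturity": 2},
}

def division_rollup(scores: dict) -> dict:
    """Average the maturity scores of all systems within each division."""
    by_division = {}
    for info in scores.values():
        by_division.setdefault(info["division"], []).append(info["maturity"])
    return {d: round(mean(v), 1) for d, v in by_division.items()}

print(division_rollup(system_scores))
```

Tracking these roll-ups across assessment cycles shows whether data improvements are moving each division up the maturity scale.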
How To Guides
Develop a Risk Register
Manage Change and Prepare for a System Replacement
Determine What Data Is Needed to Support Life Cycle Management
Implement a Multi-Objective Decision Analysis (MODA) Approach
Characteristics of Strong Performance Measures for Managing the Condition of Ancillary Assets
Monitoring External Considerations in Risk
Risk Management Process
Data Items to Standardize for TAM
Asset Data Collection Readiness Checklist