How to Implement an Energy Pipeline Data Management Solution


The data management and record-keeping requirements for pipeline operators continue to grow, and they can be daunting for integrity management professionals. Over the past couple of decades, companies in the oil and gas industry have made great strides in improving pipeline safety through the development of integrity management (IM) programs. Even so, newly constructed or replacement pipelines are put into service every year with records that do not meet the Pipeline and Hazardous Materials Safety Administration’s (PHMSA) new safety regulations, commonly known as the Mega Rule, which require records to be traceable, verifiable, and complete (TVC) with respect to the integrity verification (IV) process.

Scott Komarek, a pipeline project manager at Bartlett & West, a pipeline engineering and technology firm, explained, “On more than one occasion I’ve seen that operators’ pipeline safety or integrity management groups are not on the same page as their new capital groups.” Komarek added that these two groups sometimes operate completely independently of each other. “In some cases, the integrity management groups may not know about a new pipeline being constructed until the construction records land on their desk. They then scramble to validate records, update their GIS system and get the pipeline into their IM program,” he said.

This creates a major risk of missing or conflicting information, and once the pipeline is in the ground, reconciling or validating the data becomes difficult and costly.

Integrate IM into other field efforts

For an IM program and IV process to be successful, collaboration and sharing of data across service groups is critical. Forward-thinking operators have adopted best practices for capturing critical information about newly constructed pipelines by performing “as-built” or “in-the-ditch” surveys as the pipeline is being constructed. Yet performing an as-built survey alone is not enough to meet the requirements of the new rule. There needs to be a mechanism to review, check and reconcile the survey data against other simultaneous records, such as pipe tallies, mill test reports (MTRs), nondestructive testing (NDT) reports and inspection reports.
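As an illustration of the kind of cross-check such a mechanism performs, the sketch below merges per-joint records from a pipe tally, MTR, NDT report and as-built survey and flags gaps or conflicts while the crew is still in the ditch. It is a minimal, hypothetical example: the field names, joint IDs and tolerance are assumptions for illustration, not any operator’s actual schema or a specific vendor’s software.

```python
# Minimal reconciliation sketch. The record sources (pipe tally, MTR, NDT,
# as-built survey) come from the article; the field names and tolerance
# below are illustrative assumptions only.

def reconcile(tally: dict, mtr: dict, ndt: dict, survey: dict) -> list[str]:
    """Cross-check per-joint records keyed by joint ID and flag any gaps or
    conflicts so they can be resolved before the pipe is buried."""
    issues = []
    for joint_id in sorted(set(tally) | set(mtr) | set(ndt) | set(survey)):
        if joint_id not in survey:
            issues.append(f"{joint_id}: no as-built survey point recorded")
        if joint_id not in mtr:
            issues.append(f"{joint_id}: heat number not traceable to an MTR")
        if joint_id in ndt and not ndt[joint_id].get("accepted", False):
            issues.append(f"{joint_id}: weld not yet accepted by NDT")
        t_wall = tally.get(joint_id, {}).get("wall_thickness")
        m_wall = mtr.get(joint_id, {}).get("wall_thickness")
        if t_wall is not None and m_wall is not None and abs(t_wall - m_wall) > 0.001:
            issues.append(f"{joint_id}: wall thickness conflict between tally and MTR")
    return issues


# Example: joint "J-104" appears in the tally, MTR and NDT report but is
# missing from the as-built survey, so it is flagged before lowering-in.
issues = reconcile(
    tally={"J-103": {"wall_thickness": 0.375}, "J-104": {"wall_thickness": 0.375}},
    mtr={"J-103": {"wall_thickness": 0.375}, "J-104": {"wall_thickness": 0.375}},
    ndt={"J-103": {"accepted": True}, "J-104": {"accepted": True}},
    survey={"J-103": {"northing": 153200.1, "easting": 407551.8}},
)
print("\n".join(issues))  # -> J-104: no as-built survey point recorded
```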

Leverage technology and programs

To meet the Mega Rule requirements imposed on operators, Bartlett & West developed a holistic approach to managing construction documentation, which it calls a Centralized Data and Quality Management Program. The program is a simple workflow that couples streamlined field procedures with emerging technologies so that the IV process is completed and the TVC requirements are satisfied in real time, before the pipe is buried. Another benefit is that all stakeholders have access to all construction records in real time, which helps manage expectations and communicate accurate, reliable progress and forecasts.
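To make the real-time TVC checkpoint concrete, here is a hedged sketch of a “gate” that only clears a joint for burial when its record is traceable (tied to a source document), verifiable (independently checked in the field) and complete (every required attribute captured). The required fields and status logic are assumptions for illustration, not the actual Centralized Data and Quality Management Program.

```python
# Hypothetical TVC gate. The required fields and check criteria are
# illustrative assumptions, not a regulatory or vendor specification.

REQUIRED_FIELDS = {
    "joint_id", "heat_number", "wall_thickness", "grade",
    "weld_id", "ndt_result", "survey_point", "source_document",
}

def tvc_status(record: dict) -> dict:
    """Evaluate a single joint record against simple TVC criteria."""
    missing = REQUIRED_FIELDS - record.keys()
    return {
        "traceable": bool(record.get("source_document")),     # tied to a source document (e.g., an MTR page)
        "verifiable": record.get("verified_by") is not None,  # independently checked by a second party
        "complete": not missing,                               # every required attribute captured
        "missing_fields": sorted(missing),
    }

def clear_for_burial(record: dict) -> bool:
    """Release a joint for lowering-in only when all three TVC checks pass."""
    status = tvc_status(record)
    return status["traceable"] and status["verifiable"] and status["complete"]


# Example record that passes the gate.
record = {
    "joint_id": "J-104", "heat_number": "H7781", "wall_thickness": 0.375,
    "grade": "X52", "weld_id": "W-104", "ndt_result": "accepted",
    "survey_point": (153210.4, 407560.2), "source_document": "MTR-2024-118, p. 3",
    "verified_by": "field QA inspector",
}
print(clear_for_burial(record))  # True only when all TVC checks pass
```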

Komarek said, “Our approach is simple. We work with the operator to clearly understand the data deliverables at the end of the project, and collaborate with the other service groups to centrally organize and effectively manage the construction documentation. The Centralized Data and Quality Management Program enables you to develop a repeatable method that drives consistency, reliability, completeness and accuracy of the construction records for your organization from project to project.”

Ultimately, pipeline operators must collect, verify and maintain more and more records to manage risk under PHMSA’s Mega Rule. By implementing a pipeline data management solution, operators can meet these requirements more efficiently, reduce costs and produce more reliable data.

 
