
Maintaining data quality for Positive Train Control


Part 2

Part 1 of this series covered the implementation of Positive Train Control (PTC) and its evolution into a digital twin. It also identified the next step for that digital twin: the integration of other business data. A railroad's PTC data, and the digital twin as a whole, should be living, breathing data, updated in real time to reflect the current condition of the system. Ensuring that both the data created for PTC and the data received from the business are high quality can be a complex problem.

PTC Data Requirements 

PTC has a robust, well-defined set of criteria: mathematical expressions of precision and accuracy for spatial data, and definition tables listing each required attribute and its purpose. In the early days of PTC implementation, many in the industry set internal standards that far exceeded the needs of PTC. This was partly an effort to "future-proof" their work, but also to meet the needs of other parts of the business, whose information is rarely as uniform or regimented.
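To make the spatial criteria concrete, here is a minimal Python sketch of a horizontal-accuracy check. The 0.5-meter tolerance and the function names are assumptions for illustration, not values from any PTC specification; real tolerances come from a railroad's internal standards and the governing federal requirements.

import math

# Hypothetical tolerance for illustration only; actual figures come from the
# railroad's own standards and the governing federal rules.
HORIZONTAL_TOLERANCE_M = 0.5

def horizontal_error_m(surveyed: tuple[float, float],
                       reference: tuple[float, float]) -> float:
    """Planar distance, in meters, between a surveyed point and its reference
    position; both points must be in a projected coordinate system."""
    dx = surveyed[0] - reference[0]
    dy = surveyed[1] - reference[1]
    return math.hypot(dx, dy)

def meets_accuracy(surveyed: tuple[float, float],
                   reference: tuple[float, float],
                   tolerance: float = HORIZONTAL_TOLERANCE_M) -> bool:
    """True when the surveyed position falls within the accuracy tolerance."""
    return horizontal_error_m(surveyed, reference) <= tolerance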

Real-world Example 

For example, a railroad signal is a required feature in PTC. Someone or something needs to collect the feature's location, which way it faces, which side of the track it's on, and the information that allows the train to query the signal's state (is the light red?). But the business needs to know far more about that signal. Discrete information must be captured about the mast, head and aspects, along with the signal's wiring, when it was installed, when it was last maintained and what it's protecting. Someone might even overlay virtual features on the signal, within the digital twin, to facilitate needs in the digital space. "Route calibration points" used in conjunction with a linear referencing system are a prime example of a virtual feature that might need to occupy the same space as the signal in your digital twin. The business-critical question to answer is, "How do you ensure ALL of that information was entered correctly, or even entered at all?"
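To make that split concrete, here is a minimal Python sketch of such a signal record. Every field name is hypothetical, chosen to mirror the attributes described above rather than any railroad's actual schema; the missing_business_attributes helper hints at how the closing question can start to be answered programmatically.

from dataclasses import dataclass, field
from datetime import date
from enum import Enum

class TrackSide(Enum):
    LEFT = "left"
    RIGHT = "right"

@dataclass
class Signal:
    # Attributes PTC itself requires (names are illustrative)
    signal_id: str
    latitude: float
    longitude: float
    facing_bearing_deg: float     # direction the signal faces, 0-360
    track_side: TrackSide
    state_query_address: str      # lets the train ask "is the light red?"

    # Additional attributes the wider business needs on the same feature
    mast_type: str | None = None
    head_count: int | None = None
    aspects: list[str] = field(default_factory=list)
    wiring_diagram_ref: str | None = None
    installed: date | None = None
    last_maintained: date | None = None
    protects: str | None = None   # e.g. the switch or crossing it governs

    def missing_business_attributes(self) -> list[str]:
        """Names of business attributes that were never populated."""
        optional = ("mast_type", "head_count", "wiring_diagram_ref",
                    "installed", "last_maintained", "protects")
        return [name for name in optional if getattr(self, name) is None]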

A Data Change-Management Solution 

Railroads change more frequently than most people would like. Washouts move track. New switches get cut in. Signs get knocked over in a storm. New capabilities like the Internet of Things (IoT) and advances in telemetry make streams of near-real-time data possible. Existing sensors, like high-water, dragging-equipment and hot-box detectors, all need monitoring. The only way to keep a handle on data at this scale and frequency is automation.
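As a small illustration of that automation, the Python sketch below turns a stream of detector messages into review flags against the digital twin. The message fields and the 170-degree alarm limit are assumptions for the example, not values from any real telemetry system.

from dataclasses import dataclass
from datetime import datetime

# Illustrative alarm limit in degrees Fahrenheit; real thresholds are set
# by each railroad.
HOT_BOX_ALARM_F = 170.0

@dataclass
class DetectorReading:
    detector_id: str
    kind: str             # "hot_box", "dragging_equipment", "high_water", ...
    value: float
    measured_at: datetime

def needs_review(reading: DetectorReading) -> bool:
    """Decide whether a reading should open a review task against the twin."""
    if reading.kind == "hot_box":
        return reading.value >= HOT_BOX_ALARM_F
    # Treat the other detector types as binary: 1.0 means tripped.
    return reading.value >= 1.0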

Bartlett & West has employed many different tools for quality assurance and quality control. Off-the-shelf tools from partners like Esri are used to automate data entry and ensure its quality. Custom-developed tools for editing PTC data also help manage the complex requirements for spatial accuracy and attribution. Machine learning and artificial intelligence can improve the recovery of assets from LiDAR or photos.
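A minimal sketch of what rule-based entry validation can look like, in the same spirit as the attribute rules offered by off-the-shelf GIS tools (the field names and domains here are invented for the example):

# Required fields and coded-value domains, invented for this example.
REQUIRED_FIELDS = {"signal_id", "latitude", "longitude", "track_side"}
DOMAINS = {"track_side": {"left", "right"}}

def validate_record(record: dict) -> list[str]:
    """Return human-readable problems; an empty list means the record passes."""
    problems = []
    for name in sorted(REQUIRED_FIELDS):
        if record.get(name) in (None, ""):
            problems.append(f"missing required field: {name}")
    for name, allowed in DOMAINS.items():
        value = record.get(name)
        if value not in (None, "") and value not in allowed:
            problems.append(f"{name}={value!r} not in domain {sorted(allowed)}")
    return problems

# Example: validate_record({"signal_id": "S-101", "track_side": "center"})
# reports two missing coordinates and an out-of-domain track_side.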

To control quality after the fact, we have a similar array of automated tools. Back-end database capabilities in Oracle Spatial or Microsoft SQL Server environments compute values and ensure completeness. Powerful tools like Safe Software's FME scan, complete or transform large datasets. Scripting in a variety of languages helps us search for errors, omissions and inconsistencies.
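As a small example of that kind of script, the Python sketch below scans an exported asset table for omissions and duplicates. The file layout and column names are assumptions for illustration, not an actual export schema.

import csv
from collections import Counter

def scan_asset_table(path: str) -> Counter:
    """Tally omissions and duplicate IDs in an exported CSV of signal assets."""
    issues: Counter = Counter()
    seen_ids: set[str] = set()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            signal_id = (row.get("signal_id") or "").strip()
            if not signal_id:
                issues["missing signal_id"] += 1
            elif signal_id in seen_ids:
                issues["duplicate signal_id"] += 1
            else:
                seen_ids.add(signal_id)
            if not row.get("last_maintained"):
                issues["missing maintenance date"] += 1
    return issues

# Example usage:
# for issue, count in scan_asset_table("signals.csv").items():
#     print(count, issue)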

Looking ahead, we hope to employ solutions like machine learning to hunt for problems before they occur. All of this serves one goal: making our rail clients' digital twins more complete.
