Access to accurate, reliable data in the oil and gas industry enables producers to understand and manage their production more effectively. But what does this new data look like? And how should it be analysed?
Wellhead analysis is undergoing a revolution. The industry is moving away from discrete, distributed data points that capture only a snapshot of performance, and from aggregated data that hides granular change, towards sophisticated data analysis techniques.
At the wellhead this is represented by the growing collection of sophisticated real-time multiphase measurement data used to evaluate and optimise production, rather than the 24-hour averages used for monthly reporting. However, when it comes to multiphase measurement systems, the way data is captured and curated is key to adding value. State-of-the-art data systems deliver better-informed decisions, significant cost savings, and health and safety benefits, all while making the industry more attractive to a new workforce.
Even in remote environments, communications technology and data management techniques allow a detailed understanding of well behaviour. The old model, in which data was aggregated and its delivery delayed as it passed from site readings to manual spreadsheet entry, is being rapidly superseded. As that change takes place it is logical to rethink the processes we use to store, interpret, and maximise the value of the data.
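The point about aggregation hiding granular change can be seen in a few lines. The sketch below is purely illustrative, using a synthetic one-day record of minute-level flow readings (not real well data): a two-hour production dip barely moves the 24-hour average, yet is obvious in the raw samples.

```python
# Illustrative only: synthetic flow-rate samples showing how a 24-hour
# average can hide a transient production dip that minute-level data reveals.

import statistics

# Hypothetical one-day record: one flow reading per minute (1440 samples).
samples = [1000.0] * 1440

# A two-hour dip between 06:00 and 08:00 drops output sharply.
for minute in range(360, 480):
    samples[minute] = 400.0

daily_average = statistics.mean(samples)   # 950.0 — looks almost healthy
worst_minute = min(samples)                # 400.0 — the event the average hides

print(f"24-hour average: {daily_average:.0f}")
print(f"worst minute:    {worst_minute:.0f}")
```

A monthly report built on the daily average would record a 5% shortfall; the minute-level data shows a 60% loss for two hours, which is what an engineer actually needs to diagnose.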
At M-Flow we’re building a transparent data model that fully discloses the way in which data is logged, processed, and presented. By retaining raw measurements, we allow reprocessing to extract further value and to understand sources of uncertainty. The same transparency also allows the data to be used and presented at different levels of granularity for different management and operational objectives: well optimisation and monthly reporting systems can be developed on the same basic data set. If asset owners want to see a top-line summary of well production, our web portal GUI will provide that. Equally, if engineers want to revisit data interpretation with different assumptions, our transparent data processing pathways allow them to intervene at the heart of the calculation methodologies without being locked out of a black box.
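The idea of one raw data set serving many consumers can be sketched in code. This is a minimal illustration of the principle, not M-Flow's actual implementation: raw readings are stored immutably, and every derived view, whether a top-line summary or a fine-grained series, is computed from them on demand, so reprocessing with different assumptions is always possible. The `RawArchive` class and its water-cut example data are hypothetical.

```python
# Minimal sketch (assumed design, not M-Flow's system): retain raw readings,
# derive any granularity or aggregation from them without altering the source.

from statistics import mean
from typing import Callable

class RawArchive:
    """Immutably stores raw (timestamp_s, value) readings, e.g. water cut."""

    def __init__(self) -> None:
        self._readings: list[tuple[int, float]] = []

    def log(self, timestamp_s: int, value: float) -> None:
        self._readings.append((timestamp_s, value))

    def view(self, bucket_s: int,
             reducer: Callable[[list[float]], float] = mean) -> dict[int, float]:
        """Derive a view at any granularity; raw data is never modified."""
        buckets: dict[int, list[float]] = {}
        for t, v in self._readings:
            buckets.setdefault(t // bucket_s * bucket_s, []).append(v)
        return {start: reducer(vals) for start, vals in buckets.items()}

archive = RawArchive()
for t in range(0, 3600, 60):          # one reading per minute for an hour
    archive.log(t, 0.30 + (0.10 if t >= 1800 else 0.0))

hourly = archive.view(3600)           # top-line summary for reporting
per_10min = archive.view(600)         # finer view for well optimisation
```

Because the reducer is a parameter, an engineer can swap the mean for a different statistic and rerun the same raw archive, which is the "intervene without being locked out of a black box" property in miniature.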
Distributed data benefits and prerequisites
M-Flow meters support remote log-in, so wellhead measurement can be calibrated, monitored, and diagnosed from the office. This reduces costs and improves health and safety by removing the need to send workers into the field to gather data at the wellhead.
But the prerequisite for this is that the meters can be installed and left running in the harshest environments without needing ongoing physical maintenance or recalibration. M-Flow’s unique composite, non-intrusive construction is the enabler for this step change.
As the environmental impact and HSE footprint for employees go down, the next generation of skilled oil and gas workers, focussed on digital skills and located away from the oil field, can be engaged. Booming areas with skilled-labour shortages and often high manpower costs, such as West Texas or North Dakota, can staff operations to maximise value.
The future of monitoring is 24/7
Moving forward, we can expect exciting areas such as Virtual Flow Metering and predictive models of oil fields to be enhanced by the flood of data. In tomorrow’s oil and gas industry artificial intelligence and machine learning will be the norm in measurement technology. However, you can only apply AI algorithms if you’ve got the base data. Accurate low-cost multiphase measurement enabling 24/7 permanent monitoring is the first step towards achieving this vision. This technological shift is only just beginning to scale, and at M-Flow we’re excited by the challenge it represents.