The development of remote operational centers has improved data quality through an increased focus on data acquisition and real-time use of data. Data quality means different things to the various participants in the oilfield: for the technical staff delivering and receiving data, the main focus is streaming and the continuous flow of data, while scientists focus on the accuracy of each data point. Most surface sensors measure milliamps and rely on calibration for accuracy. Reservoir measurements have the same issue: porosity and permeability, the main reservoir properties, are not measured directly but are derived from other sources. In automation processes, data streaming needs to be flawless. Surveillance of data-streaming quality and accuracy is performed through remote operations. In most areas, the network response time is too long to stream data from the well to a remote location and then stream a command or solution back for full closed-loop control.

During acquisition, aggregation, distribution, and finally visualization, there is room for changes to the data point itself. These changes arise from uncertainty, stacking, filtration, unpacking, transmission, and any other data-handling process used for data sharing. In the digital oilfield, data are evaluated in two different settings: in real time and after the event. Interpretation is done either remotely or at the well site. There is room for improvement in all areas, depending on the objectives of the process:

- Automation in the operational phase
- Interpretation based on a model update
- Automated quality control

This paper illustrates the differences and similarities between real-time operations and processes applied to the data later, and shows how combining local and remote operations enhances data quality. We follow up with improvement suggestions in all areas on the path to the digital oilfield.
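The point that surface sensors measure milliamps and depend on calibration can be illustrated with a minimal scaling sketch. This assumes the common 4-20 mA current-loop convention; the engineering span values are illustrative placeholders, not figures from the paper, and in practice they come from each transmitter's calibration certificate.

```python
def ma_to_engineering(ma, lo_ma=4.0, hi_ma=20.0, span_lo=0.0, span_hi=250.0):
    """Linearly map a 4-20 mA loop current onto an engineering span.

    The default span (0-250, e.g. bar) is a hypothetical example;
    real endpoints are set during sensor calibration.
    """
    if not lo_ma <= ma <= hi_ma:
        # Out-of-range current usually indicates a fault in the loop,
        # which is itself a data-quality signal worth flagging.
        raise ValueError(f"current {ma} mA outside calibrated range")
    return span_lo + (ma - lo_ma) / (hi_ma - lo_ma) * (span_hi - span_lo)

# Mid-scale current maps to the middle of the engineering span.
print(ma_to_engineering(12.0))  # 125.0
```

Any bias in the calibration endpoints propagates directly into every derived value, which is one reason accuracy at the single-data-point level is a separate concern from uninterrupted streaming.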