Project: DataPipe
DataPipe will develop a platform, toolkit and pipeline for the intelligent, rule-based selection, management, analysis, publishing and display of heterogeneous multimodal data in the oil and gas sector. It will create a flexible system to provide web-based visualisation and decision support based on the analysis of extremely large datasets. The platform will be extensible to big data mining, analysis and display in a wide range of industrial and commercial sectors.

The oil and gas exploration & production sector faces a huge data management and analysis problem. A single seismic survey can generate tens or even hundreds of petabytes. Seismic data may be complemented by aerial or satellite photography and gravitational measurements. The data is noisy and has to be cleaned and aligned before a three-dimensional model can be synthesised and visualised. The process is computationally demanding; interpretation leans heavily on human skill and experience.

A large seismic survey can cost many tens of millions of dollars. If an area looks promising, a test borehole will be drilled, generating large volumes of data with positional, radioactivity, temperature, porosity, resistivity and other measures that enhance the geological model. Improvements in analytic and drilling techniques, as well as shifts in the global economy, can change decisions, so survey data have a long shelf life: a deposit that was once uneconomic may require a new analysis and interpretation thirty years later. There are problems of maintaining, tracking and accessing very large data volumes for decades; of finding and mining old data from different silos; and of reading and interpreting different media formats and file types.

Data on this scale may require high-performance computing. Cloud services are starting to offer the performance, but questions remain about data input, accessibility, interfacing and security.
The Cloud also facilitates combining data from multiple sources and optimising the trade-off between speed and cost. Companies want analytics on demand, in a web interface, on a range of devices. Hardware, software and networks have matured to the point where data interactions can be provided in the field, using smartphones, tablets and laptops. However, tailoring data to the device and screen type involves more than changing the layout: an operative using a laptop may have different information needs from one using a smartphone or tablet. Current methods of managing, analysing and displaying oil and gas exploration data are extremely labour-intensive and require specialist case-by-case customisation. DataPipe will apply intelligent technologies to automate the data workflow, with decision support for workers at different levels. Customers will range from smaller exploration companies to global corporations.

The project objectives are:

1. Analyse and model the processes of collecting, cleaning, aligning, modelling and visualising data from heterogeneous multimodal oil and gas exploration data stores, with different APIs, data and file types.
2. Research and develop rules engines for (a) intelligently addressing and selecting the data required from different industrial data spaces and (b) selecting the appropriate processing and analytics.
3. Model the decision-making process and develop an expert system to help operators interpret the data analytics and make decisions about further exploration and operations.
4. Develop software tools and processes based on the concept of the ‘job ticket’ to manage the pipeline for processing and electronic publication of the data, and to access services such as asset management, data cleaning, temporal and spatial alignment of datasets, data integration, model building, visualisation and analysis.
5. Develop a ‘DataAgent’ processing platform to manage and execute DataPipe operations, combining job tickets and rules engines, optimising data flows and balancing the load between local devices and Cloud processing.
6. Develop means of publishing and displaying the information, with decision support tools, in formats tailored to user needs and the different characteristics of web-enabled computers, tablets and smartphones.
7. Test and evaluate the platform, tools and pipeline with a realistically large and heterogeneous corpus of oil and gas exploration data.
8. Define exploitation plans, including a roadmap for the developments required to extend the DataPipe platform to other industrial and commercial sectors.

The Consortium comprises four research-performing SMEs. Dalim (D) has over 25 years’ experience of creating workflow software for the print and media industries, and is a pioneer of JDF. Ovation (UK) is a leading specialist in lifecycle data management for the oil and gas industries. Root6 (UK) specialises in pipeline management and format conversion. Actimage (F) provides intelligent information systems for big data, combining skills in data management, analytics and cross-platform interfaces.
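To illustrate the ‘job ticket’ concept from Objective 4, the sketch below shows how a ticket could declare the datasets to fetch and the ordered pipeline services to apply, with a DataAgent-style dispatcher executing the steps. This is a minimal illustration only; all names (JobTicket, SERVICES, execute) are hypothetical and do not describe the actual DataPipe implementation.

```python
from dataclasses import dataclass, field

@dataclass
class JobTicket:
    """Hypothetical job ticket: a declarative record of work to perform."""
    ticket_id: str
    datasets: list                              # identifiers of source datasets
    steps: list = field(default_factory=list)   # ordered pipeline service names

# Illustrative registry of pipeline services a DataAgent could dispatch to
# (stand-ins for services such as data cleaning and alignment).
SERVICES = {
    "clean":   lambda data: [d for d in data if d is not None],  # drop bad readings
    "align":   lambda data: sorted(data),                        # order by position
    "publish": lambda data: {"published": data},                 # package for display
}

def execute(ticket: JobTicket, data):
    """Run the ticket's steps over the input data, in order."""
    for step in ticket.steps:
        data = SERVICES[step](data)
    return data

ticket = JobTicket("T-001", ["survey-A"], ["clean", "align", "publish"])
result = execute(ticket, [3, None, 1, 2])
print(result)  # {'published': [1, 2, 3]}
```

The point of the ticket is that the workflow is data, not code: a rules engine (Objective 2) could generate or modify the `steps` list, and the dispatcher could route individual steps to local devices or the Cloud (Objective 5) without changing the services themselves.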
Acronym | DataPipe (Reference Number: 8503)
Duration | 01/01/2014 - 31/12/2015
Project Topic | DataPipe will develop a platform, toolkit and pipeline for the intelligent, rule-based selection, management, analysis, publishing and display of heterogeneous multimodal data in the oil and gas sector. It will create a Cloud-enabled system with web-based visualisation and decision support.
Network | Eurostars
Call | Eurostars Cut-Off 10
Project partner |