Selected abstracts from the upcoming SPE APOGCE are presented below. Register for the conference here: http://www.spe.org/events/en/2016/conference/16apog/registration.html
Managing Abandonment in Australia. SPE-182416
Australia is an inexperienced player in field abandonment, with early efforts proving complex and costly, and with last-minute approvals driven by regulatory and social concerns. Given the uncertainty in the regulations, industry, NOPSEMA and the Government need to resolve what can be left on the sea floor, NORM-related disposal, onshore handling and the transfer of liability. Their duty is to protect the environment, provide the platform for safe operations and maximise value for stakeholders, while also avoiding the taxpayer bearing increased costs or rebates from inefficient programmes.
On the cost side, many companies are looking to upgrade abandonment capabilities and take advantage of new technologies and approaches. However, capabilities and experience are in short supply and the regulations are unclear. Even so, not all operators will be equal in their diligence and provisions, and we are therefore likely to see missed opportunities as an industry. Case studies from the Gulf Coast, the North Sea and Australia offer many lessons learnt and successes that we should be applying, from both environmental and cost angles; for example, most abandonments struggle with unclear regulations, well and facility deterioration, and barrier issues. Given these challenges and risks, leadership from all stakeholders should come together to better define and select innovative decommissioning concepts.
Between 2016 and 2030, the overall impact of decommissioning is estimated at USD 5 billion to 7 billion, with the government exposed to up to 60% of this cost through taxes, or more if it is left holding the liability. E&P operators are expected to spend between USD 2 billion and 3 billion on decommissioning costs and overruns. From a ‘risk of unknowns’ perspective, there is a residual risk of unknown liability in the event of bankruptcies or walkouts; such circumstances could leave the Government to pick up the entire cost of major restorations or of excessive project overruns by operators.
E&P operators can execute quick wins through rig optimisation, contracting and procurement, delivering a >10% reduction in expenditure. Building internal capabilities through standardisation, lean execution best practices, a mature-asset strategy and a systematic cost-estimation methodology could contribute another 20% to cost reduction. Establishing new business models by aggregating industry demand, outsourcing to specialist/salvage companies, investing in technology or sharing resources with other operators could contribute a further 20%. Government action, such as reducing the number of regulatory bodies to enable efficient decision making, could reduce costs further. In sum, operators could reduce the cost impact of decommissioning by anywhere between 10% and 50%.
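As a rough illustration of how the quoted levers stack up to the 10-50% range: the abstract does not say whether the reductions are additive or compound on an already-reduced base, so the sketch below shows both readings. The baseline spend (midpoint of the USD 2-3 billion operator estimate) and the stacking rules are assumptions, not figures from the paper.

```python
# Illustrative only: lever percentages are the abstract's headline figures;
# the baseline spend and the two stacking rules are assumptions.
BASELINE_USD_BN = 2.5  # assumed midpoint of the USD 2-3 bn operator spend

levers = {
    "quick wins (rig, contracting, procurement)": 0.10,
    "internal capabilities (standardisation, lean, cost estimation)": 0.20,
    "new business models (aggregation, outsourcing, resource sharing)": 0.20,
}

# Additive reading: percentages simply sum, matching the quoted 10-50% range.
additive = sum(levers.values())

# Compounding reading: each lever applies to the already-reduced cost.
remaining = 1.0
for cut in levers.values():
    remaining *= 1.0 - cut
compounding = 1.0 - remaining

print(f"additive saving:    {additive:.0%}  (~USD {BASELINE_USD_BN * additive:.2f} bn)")
print(f"compounding saving: {compounding:.0%}  (~USD {BASELINE_USD_BN * compounding:.2f} bn)")
```

The additive reading reaches the 50% upper bound quoted in the abstract; the compounding reading lands a few points lower, which is why the stacking assumption matters when rolling lever estimates into a provision.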
Government actions could include incentivising operators to further optimise decommissioning costs, maximising mature asset extraction to increase tax revenue and promoting reuse of assets/facilities. Government and companies will require a legal or commercial workaround to incentivise mature field and decommissioning operators (e.g. acquiring a legal entity, incumbents retaining a stake or adjusting the tax rules).
All stakeholders will need their own strategies, but without collaboration at multiple levels and new solutions to technical, people and regulatory issues, they will bear increased costs and liabilities and leave behind a sub-optimal situation for years to come.
Justifying Appraisal in a Low Oil Price Environment: A Probabilistic Workflow for Development Planning and Value of Information. SPE-182410
Stuart Walters, Gavin Ward, Bruce Wigston and Shyam Talluri, Chevron Australia Pty Ltd
Appraisal adds value to potential developments by changing key development decisions (well count, subsea infrastructure requirements, development sequence, etc.), and this value can be quantified using value of information (VOI). The value of perfect information is readily evaluated but, unfortunately, all real-world data are imperfect. Quantifying the value of this imperfect information requires assessment of either (i) the likelihood of the appraisal activity correctly resolving the value of an uncertainty, or (ii) the impact of the activity on the post-appraisal uncertainty range, both of which can be problematic. Traditional value-of-imperfect-information analyses tend to focus on resolving only a single uncertainty and become difficult to apply as the number of uncertainties addressed by a single appraisal activity increases.
This paper describes a fit-for-purpose probabilistic approach to enable the rapid evaluation of perfect and imperfect value of information for a range of appraisal alternatives. The workflow is demonstrated through its application to a recent deepwater appraisal well that included an extended well test selected as the preferred activity from amongst a range of alternatives (including conducting no further appraisal).
The workflow uses a Monte Carlo spreadsheet tool to generate gas in place (GIIP) and estimated ultimate recovery (EUR) estimates for individual reservoir elements, which are then aggregated to field level estimates. A large number of individual trial values are captured and interrogated in conjunction with a set of heuristics to allow the rapid generation of probabilistic development plans (without needing to rely on a small set of deterministic realisations). Distributions and dependencies defined in the spreadsheet can be readily altered, enabling robust evaluation of the impact on EUR and preferred development plan for each appraisal alternative and outcome (low/mid/high). The EUR and development plan are then used in an economic model to quantify the value added by each appraisal activity. The highest value appraisal activity, in this case the appraisal well with an extended well test, was executed and a post-appraisal lookback was completed to review the value of information analysis once the appraisal results were available.
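The core of such a Monte Carlo VOI calculation, stripped of the heuristics and economics described above, can be sketched as follows. This is a minimal illustration, not the paper's spreadsheet tool: the EUR distributions, the two development plans and their economics are invented for the example, and only the value of perfect information (the upper bound on what any appraisal activity is worth) is computed, as the expected gain from choosing the best plan per outcome rather than committing to one plan up front.

```python
import random

# Assumed (illustrative) field uncertainty: two hypothetical reservoir
# elements sampled per trial and aggregated to a field-level EUR.
def sample_field_eur(rng):
    element_a = rng.lognormvariate(5.0, 0.4)   # invented element distribution
    element_b = rng.lognormvariate(4.5, 0.6)   # invented element distribution
    return element_a + element_b

# Two hypothetical development plans: cheap but capacity-limited vs.
# expensive with higher capacity (unit price 2.0 per unit of EUR recovered).
def npv(eur, plan):
    if plan == "small":
        return 2.0 * min(eur, 250.0) - 300.0
    return 2.0 * min(eur, 500.0) - 550.0       # plan == "large"

rng = random.Random(42)
trials = [sample_field_eur(rng) for _ in range(20_000)]

# Without appraisal: commit to the single plan with the best expected NPV.
ev_without = max(
    sum(npv(eur, plan) for eur in trials) / len(trials)
    for plan in ("small", "large")
)

# With perfect information: pick the best plan for each realised outcome.
ev_perfect = sum(max(npv(eur, "small"), npv(eur, "large")) for eur in trials) / len(trials)

voi_perfect = ev_perfect - ev_without  # ceiling on any appraisal activity's value
print(f"Value of perfect information: {voi_perfect:.1f}")
```

Valuing *imperfect* information then requires a model of how each appraisal outcome (low/mid/high) narrows the post-appraisal distribution before the plan is chosen, which is the part the paper's heuristics and lookback address.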
Subsea Multiphase Flowmeter: Performance Tests in Multiphase Flow Loop. SPE-182378
Ö. Haldun Ünalmis, Vishal V. Raul, Vijay Ramakrishnan, Weatherford
Performance of a new three-phase (3-P) flow measurement system is presented using multiphase flow loop data. The system consists of two currently available products: an optics-based flowmeter and an infrared-absorption-based water-cut meter. This approach of combining two robust, field-proven technologies to determine the measurement capability and performance of the combined system under realistic flow conditions is demonstrated for the first time. The new 3-P flow measurement system represents a viable alternative for subsea multiphase flow measurement and can also be used on offshore platforms and in onshore multizone applications.
The flowmeter system relies on three main measurements: bulk velocity and sound speed, measured by the optics-based flowmeter, and water cut, measured by the water-cut meter. The velocity measurement is robust, being based on turbulent flow, and is not affected by upstream flow conditions. The water-cut measurement is based on near-infrared absorption by water and oil molecules and is therefore immune to water salinity and to the presence of gas (such as free gas, gas in solution, and oil foaming). Total flow rate is calculated from the bulk velocity measurement; the liquid holdup and density of the mixture are obtained by introducing the mixture sound speed and the water-cut measurement into a flow model.
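The abstract does not specify the flow model, but a standard relation between mixture sound speed, phase properties and gas fraction in homogeneous mixtures is Wood's equation. The sketch below illustrates that generic approach only, not Weatherford's model: the fluid properties, the water cut and the restriction to the low-GVF branch (where the Wood curve is monotonic) are all assumptions made for the example.

```python
def mixture_density(alpha_g, rho_liq, rho_gas):
    """Volume-weighted density of a gas/liquid mixture."""
    return alpha_g * rho_gas + (1.0 - alpha_g) * rho_liq

def wood_sound_speed(alpha_g, rho_liq, c_liq, rho_gas, c_gas):
    """Wood's equation: 1/(rho_m c_m^2) = sum_i alpha_i / (rho_i c_i^2)."""
    rho_m = mixture_density(alpha_g, rho_liq, rho_gas)
    compliance = alpha_g / (rho_gas * c_gas**2) + (1.0 - alpha_g) / (rho_liq * c_liq**2)
    return (1.0 / (rho_m * compliance)) ** 0.5

def gvf_from_sound_speed(c_meas, rho_liq, c_liq, rho_gas, c_gas):
    """Invert Wood's equation for gas volume fraction by bisection.

    Restricted to alpha_g in [0, 0.5], where mixture sound speed
    decreases monotonically as gas fraction increases."""
    lo, hi = 0.0, 0.5
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if wood_sound_speed(mid, rho_liq, c_liq, rho_gas, c_gas) > c_meas:
            lo = mid   # predicted sound speed too high -> not enough gas
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Illustrative fluid properties (SI units), not from the paper:
water_cut = 0.3
rho_liq = water_cut * 1000.0 + (1.0 - water_cut) * 850.0   # water/oil blend
c_liq, rho_gas, c_gas = 1300.0, 50.0, 400.0

# Forward-model a known 10% GVF, then recover it from the "measured" speed.
c_meas = wood_sound_speed(0.10, rho_liq, c_liq, rho_gas, c_gas)
alpha = gvf_from_sound_speed(c_meas, rho_liq, c_liq, rho_gas, c_gas)
rho_m = mixture_density(alpha, rho_liq, rho_gas)
print(f"recovered GVF = {alpha:.3f}, mixture density = {rho_m:.1f} kg/m^3")
```

The water-cut measurement enters through the liquid density (the water/oil blend), which is why an NIR water cut that is insensitive to salinity and gas makes the holdup and density outputs more robust.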
The results of the multiphase flow loop test demonstrated that the new flow measurement system is capable of resolving total volumetric flow rates as well as phase volumetric flow rates in a broad gas-volume-fraction (GVF) band. Furthermore, mixture density can be successfully calculated from the flow model and, as a result, the mass flow rates can also be determined. The test data also confirm that the water-cut measurement is not affected by foaming issues and associated density variations. The test results are discussed in detail in the paper.
The new flow measurement system offers several advantages. The optics-based flowmeter that goes into the well provides flow measurement for the life of the well with no significant drift in signal. The flowmeter can be installed in any orientation and does not require recalibration. Its nonintrusive and fullbore features mean no permanent pressure loss, and high resilience to erosion and corrosion. The nonnuclear water-cut meter measures water cut in the broad GVF spectrum and is not affected by challenging flow conditions, such as slug flow.
A Case Study for Deriving and Calibrating Net Reservoir in Thinly Bedded Siliciclastic Formations: Brigadier Formation, Offshore Australia. SPE-182361
Paul Pillai, ISOS Petroleum, Brian Douglas, Chevron Australia, Hendrayadi Prabawa, Schlumberger
The Brigadier Formation is a thinly bedded reservoir that contains approximately 40% of the in-place gas resource in the Wheatstone field, which underpins the 2-train Wheatstone Liquefied Natural Gas (LNG) project in Western Australia. The development drilling campaign has recently been completed, with three Wheatstone development wells targeting the Brigadier Formation in the northern part of the field. Accurate and timely determination of net reservoir thickness is crucial not only for evaluating field volumetrics and performance, but also to support time-sensitive drilling decisions. In the case of two of the Brigadier wells, the decision to accept a development well location or move to a contingent well location was required within 72 hours of penetrating the reservoir with a pilot hole. Additionally, TD (total depth) decisions in the Brigadier Formation were made on the basis of real-time kH (cumulative permeability-thickness) evaluation from Logging-While-Drilling (LWD) logs and were essential for balancing well deliverability requirements against the risk of early water breakthrough by optimising standoff from the aquifer.
A fit-for-purpose approach to calibrating and evaluating net reservoir in the thinly bedded Brigadier Formation will be discussed. Several methods of net reservoir determination have been tested in the Wheatstone field. Standard-resolution methods, such as density-neutron cross-over and photoelectric factor, and high-resolution methods, such as electrical image logs (the water-based Formation Micro-Imager (FMI) and the oil-based New Generation Imager (NGI)) and LWD alpha-processed density, are compared against core-based sand counts to derive the most reliable and fit-for-purpose method of net reservoir determination. Mud type, conveyance method and borehole condition also impact the results of net reservoir evaluation.
The results from a combined density-neutron-photoelectric factor method were found to compare very well with core net reservoir and image-log-derived net reservoir across mud types, reservoir fluids and hole angles. The method is locally calibrated and was blind-tested successfully across different wells. The nuclear tools are logged in every well in the field, so this method of calculating net reservoir can be applied consistently across all wells. An added advantage is that the evaluation of net reservoir is independent of porosity; hence, net reservoir will not need to change with different generations of petrophysical evaluation.
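As a schematic of the crossover idea only: in a clean gas-bearing sand, density porosity reads high and neutron porosity reads low, so density exceeding neutron ("crossover") flags net, with a photoelectric-factor (PEF) cutoff screening out shale and cemented intervals. The cutoff value and log samples below are invented for illustration; the paper's method is calibrated locally against core sand counts, which this sketch does not attempt.

```python
# Density-neutron crossover net-reservoir flag (schematic, invented cutoffs).
PEF_CUTOFF = 3.0  # illustrative only; clean quartz ~1.8, shale typically higher

def net_flag(density_phi, neutron_phi, pef):
    """1 where density porosity exceeds neutron porosity (gas crossover)
    and PEF indicates clean sand; 0 otherwise."""
    return [
        1 if (d > n and p < PEF_CUTOFF) else 0
        for d, n, p in zip(density_phi, neutron_phi, pef)
    ]

# Hypothetical log samples: two gas-sand points, one shale, one wet sand.
density_phi = [0.24, 0.26, 0.12, 0.20]
neutron_phi = [0.15, 0.14, 0.30, 0.25]
pef         = [1.9,  2.0,  3.4,  2.1]

flags = net_flag(density_phi, neutron_phi, pef)
net_to_gross = sum(flags) / len(flags)
print(flags, net_to_gross)   # → [1, 1, 0, 0] 0.5
```

Because the flag depends only on the raw nuclear curves and not on a computed porosity, it has the property the abstract highlights: the net reservoir call does not change when the petrophysical porosity model is later revised.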