Non-operated JVs

Jan. 2, 2018
Does your asset team have the right profile?

James Bamford, Joshua Kwicinski, and Martin Mogstad, Water Street Partners, Washington, D.C.

How non-operating partners in joint ventures (JVs) structure their asset teams – size, composition, reporting, location, and other factors – will have an enormous bearing on the company's ability to influence the Operator. In our experience, high-performing non-operated asset teams can easily have a 5%-20% impact on asset performance and serve as a foundational backstop to material risks. And since non-operated ventures routinely represent 10%-50% of company production, revenues, or capital investment volumes across petroleum, mining, power, and other natural resource companies, opportunities for impact abound.

Unfortunately, an analysis shows few companies have a well-defined or consistent way to think about designing their non-operated asset teams. As a result, companies don't position themselves to optimize influence in this critical asset class. At the same time, asset teams are less able to adequately defend themselves against corporate pressures – such as asset team headcount reductions, or re-organizing asset-level functions into centralized corporate groups – which often further undermine influence.

As such, companies need a better way to think about non-operated asset team structure and performance. Based on years of experience assisting hundreds of asset teams, we've developed Standards for Non-Operated Asset Management Excellence, designed to help companies assess their JV structures, practices, and contractual terms – and to compare themselves to a set of objective standards. In doing so, companies can diminish the gap-to-potential and pinpoint improvement areas. Specifically, the broader assessment framework includes 15 Standards related to how an individual company organizes itself internally to manage a non-operated venture (Table 1). Each Standard includes specific tests to evaluate excellence, which then can be translated into a score, and benchmarked relative to other non-operated asset teams.


Of the 15 Standards, one relates to the profile of the non-operated team – its size, functional mix, seniority, time dedication, location, reporting relationships, etc. Team profile can make all the difference in the effectiveness of a non-operator. A small, technically-oriented team with relatively senior resources and strong communication and influencing skills usually outperforms a team that is twice as large, but half as capable and spread across locations.

When we test non-operated asset teams against this Standard, we look at eight different components to reveal the full picture (Table 2). In the most general sense, these components each answer a different question:

• Size – Does the number of FTEs (full-time equivalents) allocated to the asset reflect its current value and risk profile, and the company's ability to influence – or is it a product of historical needs or other less relevant factors?

• Functional mix – Given the Operator's relative capabilities, the profile of upcoming projects in the asset, and the company's own strengths, does the asset team have the right mix of functional support to do things that would generate influence and add value?

• Seniority – Does the team's seniority profile match up with the Operator's team profile, our influencing objectives, and the technical complexity of the asset (e.g., if the Operator is senior-heavy, are we sending junior people to try to influence them)?

• Time dedication – Do members of the asset team dedicate enough time to do their work well, to understand the bigger picture surrounding the asset, and to feel real accountability as part of a team – or is everyone involved while nobody is accountable?

• Continuity – Do the members of the asset team stay involved with the asset long enough to nurture effective relationships with peers in the Operator, and understand the deeper nuances of the asset – or do they rotate so often that the Operator feels whiplash, and nobody has a holistic picture of the asset?

• Location – Are team members located close enough to collaborate consistently and effectively as a team? Are they also located close enough to the Operator to spend real time there, rather than trying to influence from another continent via scratchy video conferencing?

• Orientation – Across the year, do key asset team members (e.g., asset manager, technical leads) actually spend real time engaging with the Operator, and not purely focused on internal meetings and information reporting?

• Reporting relationship – Does the asset manager actually have a team reporting to them? Is the asset manager fully accountable for setting goals and objectives for team members who spend more than half their time on the asset, even if they belong to another function?

When applied against our Standards, the answers to these questions can be surprising. One mining major found it difficult to exert any influence in Operating Committee Meetings (OCMs) as a non-operator. The cause? It tended to appoint OCM representatives from junior commercial staff, while the Operator and other non-operators sent senior experts capable of complex technical discussions. Or consider an IOC where teams generally spent no more than 10% of their time actually working with the Operator's team, which is below the industry benchmark and far below the 25% minimum that we test for. This IOC had a voracious set of internal meetings and reporting requirements that sucked the life out of the team, leaving little time for impactful engagement and influencing of the Operator.

An evaluation of these eight components is a powerful way to prompt discussions about the health and needs of an existing asset team, say, during an annual strategic review of the asset. Or it might be used to steer the design of an asset team when structuring a new non-operated asset from scratch. In practice, this evaluation is more art than science, taking into account not only our Standards, but also the nature of the Operator, the legal agreement, and the broader context of the asset.

Creating a highly effective non-operated asset team requires turning the dials in a number of dimensions, making sure that resources allocated to the asset team are put in a position to succeed. While nothing is guaranteed, asset teams staffed by people whose expertise matches the technical opportunities of the asset have a much better chance of being world-class non-operators.

About the authors

James Bamford is a co-founder of Water Street Partners based in Washington, DC, where he serves a global client base across industries on joint venture issues. He has supported more than 200 JV transactions and restructurings during his career and has worked extensively on JV governance, organizational, and commercial matters. Prior to Water Street, he co-led the JV practice at McKinsey & Company.

Joshua Kwicinski is a senior director at Water Street Partners and has led multiple engagements on JV governance with IOCs, NOCs, and other natural resources firms. He also serves a global client base in aerospace and defense, biotechnology, hi-tech manufacturing, and other industries.

Martin Mogstad is a senior consultant with Water Street Partners, and has led multiple engagements globally within the oil and gas industry on JV portfolio management, non-operated asset team excellence, and JV governance. He joined Water Street Partners from Schlumberger.


T1: Assessment Against Standards of Excellence

Overall non-operated JV score

Individual standards:

People and accountabilities
• Accountable executive
• Asset manager
• Asset team profile
• JV management team profile
• Training and development
• Individual objectives and incentives

Risk management and assurances
• Audit, assurance, and compliance protocols
• Independent risk assessment

Strategy and plans
• Independent opportunity assessment and strategy
• Annual influencing plan and campaign
• Annual strategic review

Partner management
• Holistic partner management
• Partner communication
• Rights and obligation management
• Information protocols

How scoring works

Per standard:

• Each standard comprises sub-components that are evaluated and scored against expectations:
■ Strong (7.5-10)
■ Fair (5.0-7.4)
■ Weak (2.5-4.9)
■ Critical (0.0-2.4)
• The weighted sum of sub-components within each Standard equals up to 10 points
• No penalty if unable to score a sub-component, or if a whole Standard is deemed not to apply to the JV
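The per-Standard scoring logic described in the table legend can be sketched in a few lines. This is an illustrative reading of the rules – weighted sub-component scores summed to a 0-10 Standard score, banded into Strong/Fair/Weak/Critical, with unscored sub-components dropped without penalty – and the weights and values below are hypothetical, not taken from the actual methodology.

```python
def standard_score(subscores):
    """Weighted 0-10 score for one Standard.

    subscores: list of (score_0_to_10, weight) pairs. A score of None
    marks a sub-component that could not be scored; per the rules, it
    is dropped and the remaining weights are renormalized (no penalty).
    Returns None if the whole Standard is deemed not to apply.
    """
    scored = [(s, w) for s, w in subscores if s is not None]
    if not scored:
        return None
    total_weight = sum(w for _, w in scored)
    return sum(s * w for s, w in scored) / total_weight


def band(score):
    """Map a 0-10 Standard score to the band in the table legend."""
    if score >= 7.5:
        return "Strong"
    if score >= 5.0:
        return "Fair"
    if score >= 2.5:
        return "Weak"
    return "Critical"


# Hypothetical example: three sub-components, one of which is unscored.
subs = [(8.0, 0.5), (6.0, 0.3), (None, 0.2)]
score = standard_score(subs)          # (8*0.5 + 6*0.3) / 0.8 = 7.25
print(round(score, 2), band(score))   # prints: 7.25 Fair
```

Note the renormalization step: dropping an unscored sub-component and dividing by the remaining weight is what makes "no penalty" concrete, rather than treating a missing score as zero.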

T2: Standard Assessment – Asset Team Profile

Standard 3 – asset team profile

People and accountabilities:

3.1 Size

3.2 Functional mix
• Limited reservoir engineering team, despite Operator gap and overall company strength in area
• Functional mix not re-considered since project shifted from development phase to mature operations

3.3 Seniority

3.4 Time dedication
• Team spends median 25% of time on Operator engagement (i.e., meeting formally or informally with Operator team members)
• Notable effort to limit internal busywork and increase engagement with technical committee members

3.5 Continuity

3.6 Location

3.7 Orientation

3.8 Reporting relationship
How scoring works

Component score:
■ Fully meets (1.00)
■ Mostly meets (0.67)
■ Somewhat meets (0.33)
■ Does not meet (0.00)

Source: Water Street Partners
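The component rating scale in Table 2 can be rolled up into a Standard score in the same spirit as Table 1. The sketch below assumes, purely for illustration, that the eight components of Standard 3 are equally weighted and that the four-point ratings scale linearly to a 0-10 score; the ratings assigned to each component are hypothetical.

```python
# Four-point component rating scale from the Table 2 legend.
RATING = {
    "Fully meets": 1.00,
    "Mostly meets": 0.67,
    "Somewhat meets": 0.33,
    "Does not meet": 0.00,
}

# Hypothetical ratings for the eight components of Standard 3.
components = {
    "3.1 Size": "Mostly meets",
    "3.2 Functional mix": "Somewhat meets",
    "3.3 Seniority": "Fully meets",
    "3.4 Time dedication": "Fully meets",
    "3.5 Continuity": "Mostly meets",
    "3.6 Location": "Fully meets",
    "3.7 Orientation": "Somewhat meets",
    "3.8 Reporting relationship": "Does not meet",
}

# Equal-weight average of component ratings, scaled to 0-10
# (an assumption; actual weights may differ by asset).
avg = sum(RATING[r] for r in components.values()) / len(components)
standard_3_score = 10 * avg
print(round(standard_3_score, 2))  # prints: 6.25
```

A roll-up like this makes a team's profile comparable across assets: the same eight questions, scored the same way, can be benchmarked against other non-operated asset teams as the article describes.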