Published on 09/10/2025 | Written by Heather Wright

Two sides of AI in the boardroom…
Tell your board where, and how, AI is being used in your organisation, stat – it’ll enable them to sleep easier at night and ensure they get governance right.
That’s the message from Susan Cuthbert, the Institute of Directors NZ’s principal advisor, governance leadership, who says directors who don’t know where AI is being used in their organisation should be worried.
“That’s valuable information. It will help your board sleep easier at night.”
Cuthbert’s comments come as local organisations appear to be adopting AI at pace – but both local and global figures suggest many boards don’t have visibility into how it’s being deployed, creating a governance blind spot.
A PwC report found only a third of US directors surveyed said their boards had incorporated AI and genAI into their oversight roles.
Locally, reports show 82-87 percent of Kiwi businesses are now using AI tools. (In Australia, the Department of Industry, Science and Resources says AI adoption varies dramatically by company size, with larger organisations leading the way: 82 percent of organisations with 200-500 employees were using AI, and uptake drops as business size shrinks, to just 40 percent of those with 5-19 employees.)
But Cuthbert says a poll at an Institute of Directors conference in September found just 40 percent of respondents saying staff were experimenting with AI tools – suggesting many boards aren’t fully aware of the AI use in their organisations. Around 25 percent of respondents said they had guardrails in place.
“We know there’s a whole lot of shadow activity going on and that raises a lot of potential risk.”
Cuthbert says boards must move quickly to understand where AI is being used, what data it relies on and who is accountable. It’s up to IT teams to provide that information, dissect the issues so the board understands where it needs to focus, and ensure it is communicated at the right level.
Unlike other technologies such as cloud, AI presents a distinct governance challenge. It’s fast-moving, disruptive and capable of reshaping markets, cultures and decision-making, and carries ‘many’ associated risks, she says.
“It is different from other tech, but at the same time the principles to govern AI are generally in similar areas to oversight of other risks.”
The Institute of Directors has developed guidance material which includes nine principles for governing AI. It starts with boards taking action from a strategic perspective, recognising that AI shouldn’t be bought by management without the overall strategy first being considered.
“The whole issue of AI has to be brought up to the board table and considered strategically. Then it’s also a question of boards balancing understanding the opportunities that exist within the organisation but also being really real about risks and addressing those.”
Crucially for IT teams, boards need to understand what data the organisation holds, its quality and how it is used in AI systems, and set clear expectations for reporting that provides quality data and feedback to the board so it can make decisions on an ongoing basis.
“This isn’t something you just set up and leave. It requires constant vigilance and teasing out the risks that sit within the organisation in different areas, and they will develop as algorithms develop.”
Finally, boards must maintain stakeholder trust by ensuring AI outputs are reliable and ethical, and that any changes support a positive workplace environment.
“The board’s role is to make sure that it’s really clear what their position is, what their risk appetite is, what their strategic direction is and how AI fits into that.”
While early AI governance focuses on risk mitigation, boards are also beginning to explore value creation. However, Cuthbert cautions the journey must start with safety.
“The first thing an organisation needs to do is address the guardrails. Then they can certainly – possibly even at the same time, depending on the nature of the organisation – move into exploring opportunities.”
She urges CIOs to meet boards where they are on the maturity curve.
“Boards are on a journey. Support that by dissecting the issues and providing structure.”
Above all, communicate clearly. “Make sure the board knows where AI is being used and how,” she says, noting that’s the starting point for good governance.
Smarter meetings
Boards, too, are embracing AI for their own use, though Cuthbert stresses that is a very separate area from the governance discussion.
“If you talk to a board about using AI, you have to be really clear about whether you’re talking about the governance role and getting oversight across the organisation, or about the board itself using AI,” she warns IT leaders.
CIOs and IT leaders should help boards understand both areas, but ensure they tailor their communication accordingly.
Boards are starting to use AI themselves, beginning with small tasks such as meeting preparation and minute taking.
But Cuthbert cautions that even the seemingly simple application of AI for minute taking comes with risks.
“Doing quality minutes is actually quite a skill to tease out what the real issue is and what the real decision was from a complex conversation, particularly over [several] meetings. You can’t just throw AI in there and expect it to be very accurate.”
Those minutes can potentially end up in court, so having one source of truth is critical, and recordings must be disposed of.
Confidentiality is another concern. “You can’t just upload your board pack to an LLM. That would be horrifying.”
Despite that, Cuthbert sees clear potential for directors using AI to summarise reports, explore scenarios and even improve their own governance practices.
“Can AI help boards make better decisions? Have better conversations? That’s exciting. If we get better directors and better governance, that’s good for New Zealand organisations,” she says.
For those presenting to boards, Cuthbert says there is ‘really huge’ potential to improve the quality of all papers.
“And I think management are generally on to this and there’s probably wide use of AI to do that. But it’s not just a matter of pushing a button and getting a quality paper out the other end. It’s important to have that human in the loop and maintain critical thinking, making sure the right information goes to the board,” she says.
Of board use of AI, she says: “It’s about having better human conversations. How can we be more human around the board table and utilise AI to help us do that?”
Just don’t count on AI sitting at the board table as any kind of decision maker any time soon.
“I think that’s quite a way off and quite futuristic!”