Is AI a governance blind spot for nonprofit boards?
I’ve spent many years around board tables, across charities, nonprofits, state bodies and corporates. Trends come and go, reforms rise and stall, and every few years there’s a new “next big thing.” I can still vividly recall Y2K. But I’ll tell you this: what’s happening now with AI and digital systems is unlike anything I’ve seen before.
And the part that concerns me most? It’s not the pace of the change. It’s the silence around it.
Too many boards are playing the “wait and see” game and sitting on the sidelines. Not because they don’t care, but because they don’t realise it’s their business to be involved. The tech gets handed to the executive team and the board quietly steps back. I hear it all the time: “Isn’t that an operational thing?”
In my experience (and I say this as someone who has seen the consequences first-hand), when boards take their eye off the digital ball, it’s not long before risks go unnoticed, decisions go unscrutinised, and opportunities pass us by.
It’s more than just I.T.
I’m not an I.T. practitioner. I don’t code and I don’t speak in algorithms. But I do know governance. And I know that when a system is rapidly shaping how decisions are made, how resources are allocated, or how services are delivered — that’s a board matter.
We’re seeing tools now that automate everything from fundraising to risk assessments to frontline support. Some of it is genuinely impressive, and the pace of progress is stunning, almost impossible to keep up with. An organisation I worked with recently introduced an AI chatbot to handle employee wellness questions. It operates 24/7 and relieves pressure on stretched personnel. A reasonable move, on the face of it.
But here’s the rub: no one asked if the chatbot might stray into clinical advice, or how it would recognise when to escalate to a human professional. That’s not just a technical oversight — that’s a governance blind spot.
AI isn’t coming — it’s here
AI isn’t some distant innovation on the horizon. It’s already running in the background of systems you’re probably using. It’s processing grant applications and flagging financial irregularities. Quietly, without fuss, it’s changing how our organisations function.
Consider predictive tools that identify which service users are “most at risk” and prioritise support accordingly. In principle, great. But what happens when the data the AI relies on is biased or incomplete? Who checks that a vulnerable person hasn’t been overlooked because they didn’t “fit the pattern”?
These aren’t just technical questions. They go to the heart of board duties and values: fairness, inclusion, duty of care. And they deserve board-level attention.
The governance obligation is the same: understand the material risks and ensure oversight is real, not theoretical.
What boards can and should do
I don’t believe in scaremongering. But I do believe in preparedness. It’s time to treat AI with the same seriousness we apply to finance, safeguarding and legal compliance. Here is what I suggest:
1. Put it on the agenda
If AI isn’t a regular topic at board meetings, it needs to be. AI and digital risk should be a standing item with clear objectives: how tools are used, what decisions they influence, and where risks and opportunities are changing.
2. Get familiar, not expert
Directors and trustees don’t need to become AI and digital experts, but they do need a working understanding. Ask for a plain‑English briefing on where AI shows up in your organisation, what data it uses, the intended benefits, known risks, and mitigation plans. Even a basic briefing can make a world of difference.
3. Apply the values lens
Every decision about technology should be filtered through your mission and values. Does this tool serve our purpose? Does it help us reach people more effectively and fairly?
4. Insist on ethics
One board I advised recently adopted a simple “digital ethics checklist” covering data use, consent, transparency and accountability. It’s not high-tech, but it is high impact, and we are now working with many of our clients to embed similar checklists.
5. Keep reviewing
Technology is evolving fast. What was safe and useful last year might not be fit for purpose today. Make sure there’s a system for monitoring and reviewing digital tools, and that the board sees those reviews.
Governance expectations across regions
The specific frameworks vary, but the duties are consistent: oversee risk, ensure accountability, protect stakeholders, and steward public trust.
- In Ireland, the Charities Governance Code expects boards to manage risk and oversee systems effectively, digital systems included.
- In the UK, the Charity Commission’s guidance CC3: The Essential Trustee places ultimate responsibility for governance, risk and compliance with trustees.
- In the U.S., expectations show up through IRS Form 990 governance disclosures, state Attorneys General oversight of charities, and sector guidance like Independent Sector’s Principles for Good Governance and Ethical Practice.
Wherever you are based, your board can also draw on widely adopted risk and assurance frameworks to structure oversight, such as the NIST AI Risk Management Framework and ISO guidance on governance (e.g., ISO 37004), alongside national cyber guidance (e.g., NCSC in the UK). You don’t need to implement them wholesale; use them to inform your questions, checklists and assurance model.
AI is a board issue
We’re at a turning point. AI isn’t tomorrow’s innovation, it’s today’s operating reality. The signs are plain to see, the tools are woven into our work, and the risks, while subtle at first, are becoming visible.
If there’s one phrase I hope to hear less often in board evaluations, it’s “No one told us this was a board issue.” Because by now, we have been told.
This is not the time for boards to stand back. It’s a moment for board members to step forward, not as technical experts, but as stewards of purpose, integrity and public trust. And frankly, speaking as the chair of a charity board, I think that’s what the role of trustee has always been about.
Patrick Downes, Managing Partner at Lionheart Corporate Governance, is an internationally recognised governance expert with over 25 years of experience advising boards and executive leadership teams across Europe and North America. His career spans the public, private, and non-profit sectors, where he has led high-profile board evaluations, governance reforms, and strategic advisory projects. A published author, his expertise in risk management, ESG and regulatory compliance is widely acknowledged, and he holds fellowships of the Chartered Institute of Marketing, the Royal Society and the Institute of Management Consultants and Advisors. His work is centred on improving board performance, strengthening risk oversight, and building resilient governance frameworks that address the complexities of today’s regulatory landscapes.