In every council chamber and health board boardroom across Wales, the pressure is mounting. From the Senedd to Westminster, we are being bombarded with AI plans, steering groups, and pilots.

The narrative is seductive: AI is a “magic box” that will single-handedly solve our efficiency woes and bridge the ever-widening gap between rising demand and shrinking budgets.

But we’ve seen this before. In 2010, the “magic box” was the Cloud. In 2015, it was Big Data. Each time, the hype cycle promised a silver bullet, and each time we were left wondering why the promised savings never materialised.

The hard truth for Welsh public sector leaders is that you will not save significant time or money simply by “turning on” Microsoft Copilot or procuring a shiny new AI tool.

As we said in our report, what’s needed isn’t just better tools; it’s better ways of working. We need to stop looking for magic solutions.

We’ve outlined five practical actions that every public sector leader in Wales should be taking this year to move past the noise and deliver real value for the people they serve.

1. Demystify the “magic box”

The first thing we must do is demystify the terminology. AI is not a single, revolutionary technology; it’s a toolbox. Some tools, like Machine Learning, have been quietly maturing for years, helping NHS clinicians spot cancers earlier and speed up diagnoses. Others, like Generative AI and “Agents,” are newer and bring different risks.

As a leader, your job isn’t to be an expert, but you must be able to distinguish between hype and utility. If an internal team or a vendor tells you a tool will “solve” a complex policy issue, ask them which tool in the box they are using and why. If they can’t explain the mechanics, they’re selling you magic, not a solution.

2. Invest in training

To navigate this complex landscape, we must invest in our people. This does not mean generic, tick-box “AI skills training”. Low-value courses that exist to show activity, rather than build understanding, change little and deliver even less.

Invest in proper, practical, hands-on training for your leadership teams and your staff.

Leaders need enough hands-on fluency to understand how these tools behave in practice: how to write effective prompts, how to test and verify AI-generated information, and where the limits and risks lie.

If an executive team is not confident using these tools themselves, they cannot credibly govern their use, set sensible boundaries, or lead responsible adoption.

3. Identify real pain points and inefficiencies

AI is most effective when it is applied to specific, well-understood friction points.

It should not be treated as a solution in search of a problem.

Start with the people you serve and the constraints they face. The first question must be “what problem are we trying to solve?” — not “where could we use AI?”

We need to hire user researchers and service designers to map end-to-end service journeys and understand real needs and goals. Their job is to find the “rot” in current processes - the repetition, the unnecessary handoffs, and the “failure demand” (the extra demand placed on a service when we fail to do something right for a citizen the first time).

This is where we can lean on the concept of “Boring Tiny Tools.” We often waste time trying to build massive national platforms when what we actually need are small, automated solutions for repeatable manual tasks.

Once you know where the pain is, you can decide if AI is the right tool to fix it. As we’ve said before: starting with people, not solutions, is how we design services that actually work.

4. Build safe places to experiment

The fear of “getting it wrong” in the public sector often leads to paralysis. To counter this, you must build safe places to experiment.

The best way to understand technology is to get your hands dirty in a controlled environment. These tools carry risks and potential unintended consequences, which is precisely why experimentation needs to happen somewhere safe.

As a leader, your role is to lower the barriers to entry by:

  • Creating technical “sandboxes” where data is protected but experimentation is encouraged
  • Making it easy and fast to buy a small number of licences to test new tools iteratively
  • Adopting a “test, learn, and adapt” approach that runs small experiments with rapid feedback loops
  • Building internal technical capability that can connect tools together to build those Boring Tiny Tools
  • Celebrating when it’s necessary to stop something because it’s no longer working, as i.AI did with its Redbox tool for civil servants

This reduces costs and uncertainty while quickly releasing value to the public.

5. Work in the open

Finally, we must resist the urge to build in silos. The challenges facing a council in Gwynedd are likely very similar to those in Monmouthshire. What Cwm Taf Morgannwg University Health Board is learning is probably useful to Betsi Cadwaladr University Health Board.

Working in the open - sharing regular updates and early prototypes, and being honest about what isn’t working - is essential. When we share our experiences, thinking and processes, we raise the collective bar for the entire Welsh public sector. It prevents us from making the same expensive mistakes twice and allows us to scale successes across the nation.

AI is not a miracle cure. It is a set of tools that, when applied to well-understood problems by trained people, can do remarkable things.

By focusing on these five actions, we can move beyond the “empty promise” of quick fixes and start building the modern, human-centred public services that Wales deserves.