Tech rules: the importance of having an AI policy

31 Mar 25

Organisations need a policy for how AI gets used by staff – and it doesn’t take much specialist knowledge to formulate one.


Following an article he wrote in Public Money & Management, PF speaks to CIPFA’s head of technical and standards, David Lyford-Tilley, about setting parameters for AI use.

Q Is AI a threat or an opportunity?

Definitely a mixture of both. People use ‘AI’ to mean lots of different things. It is, to some extent, a marketing term for technology, some of which has been around for ages.

It has a huge ability to do things that were previously believed impossible to automate, so AI tools can offer new efficiencies. But these tools have strengths and weaknesses, and those aren’t always advertised.

Q What kinds of pitfalls do people need to be aware of?

Much of the trouble people get into involves improper use, or trying to use AI for things that it’s not good at.

Generative AI tools are really very good at generating realistic, human-like text in response to prompts, but we have to be conscious of how they have been trained. Most of the ones that are available for free on the internet have been trained on things that are freely available on the internet.

AI might be good for summarising minutes from a meeting, for example, but it’s not necessarily going to have specialist technical knowledge if you want it to do something complicated. It probably won’t give you the best advice on an accounting treatment, and might suggest you follow rules that expired two years ago, for example.

For those process questions, I wouldn’t expect a great answer from a publicly available tool. You might have expensive or in-house AI tools that could cope, but it’s important to understand the difference.


Q How do you view AI?

I sometimes liken it to a junior colleague: think of it as quick, excitable and keen to get stuff done, but also be aware that it won’t have the understanding or the context that you’d expect from an experienced colleague.

You’ll definitely be wanting to check its work, because it won’t be mistake-free. It has the inputs, but maybe not a lot beyond that, whereas you have the context necessary to correct things if needed. It’s important to still have a human in the loop.

Q Why do organisations need to set a policy on AI use?

Having a policy sets ground rules for what people should be thinking about when using these tools. The policy is itself a tool for communicating some of the strengths and weaknesses that I’ve been talking about.

If you don’t have a policy, then, by default, your policy is ‘do whatever you like’. These tools are very popular, and people know they are out there, so they are going to try them out. Having something that covers the basics is really important.

Q Do you need to be an AI expert to do this?

It doesn’t have to be massively complicated. CIPFA’s own policy is being finalised, and has about three pages of content.

It’s having the discussion that matters, and that requires not so much specialist technical knowledge as thinking things through and communicating the outcome to the organisation.


Public Money & Management is an academic and professional journal covering public sector finance, policy and management. It is free to CIPFA members.

Image credit | Shutterstock
David Lyford-Tilley is CIPFA’s head of technical and standards.
