Health NZ staff told to stop using ChatGPT to write clinical notes


[Image: three hospital beds flanked by drawn curtains, with a light green wash over the image. Photo: RNZ]

Health NZ (HNZ) says staff have been caught using free AI tools like ChatGPT and Gemini to write clinical notes, a move it says could result in formal disciplinary action.

A memo seen by RNZ was sent this week from a senior manager to all Mental Health and Addiction Services staff in the Rotorua Lakes district, reminding them not to use tools like ChatGPT, Claude or Gemini in their work.

“It has come to my attention that there has been instances where it appears that AI (artificial intelligence) drafting tools have been used to prepare clinical notes,” it says.

“The use of free AI tools (e.g. ChatGPT, Claude, Gemini) for clinical purposes is strictly prohibited due to data security, privacy and accountability concerns. You are also not allowed to use AI tools to draft notes and then transcribing it to handwritten or typed notes, even if you anonymise the patient information.”

Doing so could result in formal disciplinary action, it said.

According to the HNZ-wide AI policy, any AI tools must be registered with the Health NZ National Artificial Intelligence and Algorithm Expert Advisory Group (NAIAEAG) – this would include Heidi, an AI scribe tool being rolled out across EDs.

Sonny Taite, HNZ director of digital innovation and AI, said free AI tools presented risks to data security, privacy and accountability, and “any possible exemptions are assessed case by case”.

“As with any new process in healthcare, we are working with our clinicians on new ways of working and this is an ongoing process.”

HNZ did not answer questions about how many instances there had been of staff using unapproved AI software, or whether anyone had been disciplined.

Staff turning to AI tools under ‘enormous pressure’ – union

Fleur Fitzsimons, national secretary for the Public Service Association, which represents many health and addiction service workers, said clinical staff were turning to AI tools because of the “enormous pressure” they were under.

A memo which opened by threatening formal disciplinary action was the wrong approach, she said.

“It’s a warning shot that will make staff afraid to ask questions or seek help.”

HNZ should be investing in proper training and approved tools, she said.

“Let’s not forget that HNZ has been cutting the very teams responsible for digital systems and IT support. If staff are improvising with free tools, HNZ needs to examine why that is the case, not simply threatening staff with a breach of the Code of Conduct.”

