When a specialist practice falls behind because referral paperwork is incomplete, or a pathology result sits in a queue while a clinician waits to act, nobody wins – least of all the patient. AI can help solve these problems. But people rightly want to know: who’s making sure it’s done safely?
On 1 December 2025, the Medical Software Industry Association (MSIA) and the Medical Technology Association of Australia (MTAA) launched Australia’s first Voluntary Code of Conduct for the use of AI in health software. Magentus, whose clinical systems and practice management software are used by thousands of medical professionals across Australia, was the first company to sign.
Magentus CTO Brenden Conolly explained: “We signed the Code because patients, clinicians, and governments need to know that AI is being used carefully and transparently. This is about committing to exactly that, and earning their trust.”
What the Code covers
The Code is a principles-based framework for any organisation that develops, deploys or supplies AI in a healthcare setting. It covers ten areas including accountability, risk management, data governance, testing and monitoring, transparency, and the right of individuals to challenge AI-driven decisions.
Signatories commit to assigning clear ownership of AI within their organisations, conducting clinical validation before deployment, maintaining ongoing monitoring afterwards, protecting patient data under privacy law, and ensuring end users can access meaningful information about how AI systems work.
The Code aligns with the Australian Government’s National AI Plan, announced on 2 December 2025, and is accompanied by a companion Accreditation Standard.
“We see responsible AI use as increasingly integral to providing safe, modern healthcare, and we want to set the benchmark from the beginning,” Conolly said.
What this means in practice
When we talk about AI in healthcare, it’s important to be clear about what the technology is and isn’t doing. It doesn’t make decisions. It doesn’t replace clinicians. What it does do is handle repetitive, time-consuming work that pulls clinicians away from patients.
Take a specialist practice running Gentu, Magentus’s cloud-based practice management platform. AI has the potential to convert consult conversations into patient note summaries, flag an incomplete referral before it causes a delay, catch missing claim information that would hold up billing, or help staff anticipate busy periods so they can plan ahead and keep patient care running smoothly. These aren’t clinical decisions; they’re routine administrative tasks performed more efficiently and reliably, so patients are seen on time and with full attention.
Beyond practice management, Magentus’s clinical systems support pathology networks processing tens of thousands of tests daily, oncology teams coordinating chemotherapy across multiple hospitals, and radiology departments managing growing diagnostic workloads. In each setting, well-governed AI can reduce duplicate testing, detect data mismatches, highlight unusual patterns worth a second look, and get the right information to the right clinician faster.
The result: shorter waits, fewer errors, and more time spent with patients rather than screens.
“AI isn’t about replacing expertise, it’s about removing the noise around it. A well-designed system gives clinicians back time for direct patient care,” Conolly said.
Clinicians stay in charge
Under the Code – and under Magentus’s own governance – clinicians remain in control. All AI outputs must be explainable and transparent. Every AI feature is tested, validated and monitored in real-world conditions. Patient privacy is non-negotiable.
Magentus works closely with government agencies, health networks, and medical practices to ensure AI tools reflect real clinical needs. The MSIA is now running webinars with these stakeholders to build engagement around the Code, and the Australian Government has signalled strong support for this kind of industry-led governance.
What comes next
The Code is version one, with a review due mid-2026. It doesn’t cover medical devices, which remain regulated by the TGA. But for the growing category of health software outside TGA oversight, in areas like practice management, health informatics, and clinical workflow tools, it creates a baseline that didn’t previously exist. It’s a concrete, public, and verifiable starting point, and Magentus remains committed to staying the course as the technology and its touchpoints mature.
Michele Blanshard, Magentus Managing Director of Practice Management & Oncology and MSIA Vice-President, said the opportunities AI offers healthcare are exciting, but the industry must take every step to ensure sound governance and transparency.
“Good healthcare relies on trust. We’re committed to building AI that protects that trust and helps deliver the best possible outcomes for every patient,” Blanshard said.