The future of surgical AI


Friday, 24 May, 2024


From enhancing diagnostics to transforming surgical planning and treatment, artificial intelligence (AI) is playing an increasingly important role in health care as services and providers around the world strive to use technology to improve patient outcomes and reduce time spent by health professionals on operational and administrative tasks.

The global AI in healthcare market is estimated to reach US$188bn by 2030, according to data gathering and visualisation firm Statista.

Closer to home, research firm insights10 predicts that the Australian AI in healthcare market will grow from $0.08bn in 2022 to $1.78bn by 2030, a compound annual growth rate (CAGR) of 46.72% over the forecast period.

If the University of Adelaide researchers are to be believed, Australia and New Zealand could become international leaders in the safe use of AI in surgery. “But first there needs to be guidelines in place to safeguard patients,” the researchers warned. In a recent report published in the Medical Journal of Australia, the researchers highlight opportunities, challenges and recommendations for AI in surgical services.

The Adelaide Score

“There is no doubt that AI has the potential to change surgical services for the better, improving diagnostic accuracy and efficiency. The Adelaide Score algorithm is the perfect example of this as we have shown that it can successfully predict discharge within a 12- to 24-hour period, potentially helping to improve patient management in hospitals,” said first author of the study and University of Adelaide researcher Dr Joshua Kovoor from the Adelaide Medical School.

The Adelaide Score study included general surgery patients at two tertiary hospitals over a two-year period. The tool could be useful for both treating teams and allied health staff within surgical systems, according to the researchers.

While the technology offers many benefits, there are also some limitations and “it should in no way replace hospital staff. It should always be used as an assistive tool, and its implementation needs to be carefully regulated,” Kovoor said.
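The article does not describe how the Adelaide Score is calculated, so the sketch below is purely illustrative of the general technique it represents: a binary classifier trained on routine inpatient observations to estimate the likelihood of discharge within the next 24 hours. The feature set, the synthetic data and the choice of logistic regression here are assumptions made for this example, not the published algorithm.

# Illustrative only: a hypothetical discharge-within-24-hours classifier.
# This is NOT the Adelaide Score; the features, data and model are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients = 1000

# Hypothetical routine observations recorded for each surgical inpatient.
X = np.column_stack([
    rng.normal(80, 15, n_patients),     # heart rate (bpm)
    rng.normal(37.0, 0.5, n_patients),  # temperature (degrees C)
    rng.normal(9, 3, n_patients),       # white cell count (x10^9/L)
    rng.normal(40, 30, n_patients),     # C-reactive protein (mg/L)
    rng.integers(0, 10, n_patients),    # days since operation
])

# Synthetic label: 1 if the patient was discharged within the next 24 hours.
# Real labels would come from hospital discharge records.
y = (rng.random(n_patients) < 0.4).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = LogisticRegression(max_iter=1000)
model.fit(X_train, y_train)

# Discrimination (AUROC) is the metric usually reported for such prediction tools.
probs = model.predict_proba(X_test)[:, 1]
print(f"AUROC on held-out patients: {roc_auc_score(y_test, probs):.2f}")

Trained on randomly generated data like this, the model will of course perform no better than chance; the point is only to show the shape of a discharge-prediction workflow. A real tool of this kind would be developed and validated on actual hospital records and, as the researchers stress, used to assist rather than replace clinical decision-making.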

Emerging applications

Dr Chris Varghese from Waipapa Taumata Rau, University of Auckland, New Zealand, echoed similar thoughts about the opportunities of AI in surgical services.

The technology has potential in improving every aspect of surgical care — before, during and after treatment, said Varghese and colleagues in an article published in Nature Medicine.

“Each time that we leave hospital, we are at increased risk of having complications from surgery,” Varghese said.

“AI has got a real potential to provide monitoring and safety-netting to ensure that we can mitigate and prevent some of these complications and enhance the recovery that you’re able to achieve at home.”

Current AI applications in surgery have been mostly limited to unimodal deep learning, with emerging applications including computer vision, training, diagnostics and postoperative monitoring, among others.

“AI is trying to learn what surgeons see, what the surgical instruments look like, what different organs look like. And the potential there is to identify abnormal anatomy and [determine] what the safest approach to an operation might be.

“Using virtual reality and augmented reality to plan ahead of surgeries can be really useful for cutting out cancers and more.”

An evidence-based approach

Senior author Professor Guy Maddern, the R.P. Jepson Professor of Surgery at the University of Adelaide and a hepatobiliary surgeon at the Queen Elizabeth Hospital, emphasised the need for a strict evidence-based approach that reflects international frameworks as well as local factors.

“AI also presents many challenges. Surgeons may have difficulty having confidence in AI-assisted recommendations due to the lack of reasoning behind the decisions,” Maddern said.

AI systems are frequently described as ‘black boxes’ due to the absence of reproducible reasoning underpinning their decision-making processes. This lack of transparency may make it difficult for surgeons to have confidence in AI-assisted recommendations, particularly in circumstances where there are major differences between specialist surgeon opinion and AI.

Surgical staff across Australia and New Zealand must become familiar with interpreting and transparently communicating the inputs and outputs of AI tools, the authors suggest. Staff should encourage patients to ask questions and express concerns, and provide information to aid patient-friendly explanations, they added.

“When using AI tools, surgical staff should comprehensively document their rationale for all clinical decision-making, particularly any deviations from AI recommendations. Similarly, it is imperative that surgical staff comprehend the ethical implications for patient privacy and confidentiality when AI is integrated within their service. As clinical AI systems are likely to require sensitive patient data, surgical staff must adequately inform patients, including risks such as misuse or data breaches.”

Privacy and ethics

As with most new technologies, with opportunities come limitations — especially issues related to data handling, privacy and ethics.

“AI is based on building models from lots and lots of data, and ensuring that the data we feed into these algorithms are unbiased and are not perpetuating existing inequities in our datasets and our research is essential,” Varghese said.

So, it’s important to ensure that what is fed into the models, and how they are trained, is robust and achieves the best outcomes for our patients, Varghese noted.

“Surgeons may treat a specific patient population due to the location of their institution or specialised professional interests, and algorithms trained on datasets derived at a population level may perform suboptimally at a local level. Surgical services should be aware of potential biases in algorithms and limitations of training data and regularly audit AI-driven systems after local deployment,” wrote the authors of the research led by the University of Adelaide.

“Current malpractice guidelines will also need to be revised to reflect the use of AI, along with policies around the handling of sensitive patient data.”

The team led by the University of Adelaide recommends the development of infrastructure to monitor and audit AI tools to ensure they are benefiting both the patients and the system. Patients and surgical staff also need to be educated on the benefits and limitations of this technology, the researchers said. Below are their key recommendations:

  • Understand and consider current opinions on AI held by the non-surgical healthcare communities and the wider society of Australia and New Zealand through serial evaluation.
  • Maintain a strict evidence-based approach when developing and implementing AI tools within Australian and New Zealand surgical services, one that adheres to internationally recognised frameworks while also considering local factors, regardless of the aspect of surgical care, and ensures that risks and benefits to patients and systems are rigorously evaluated.
  • Develop necessary infrastructure for strict post-implementation monitoring and audit of AI tools to ensure ongoing patient and system benefit, in alignment with the principles of the Royal Australasian College of Surgeons.
  • Ensure close and ongoing engagement with the regulatory bodies and laws of Australia and New Zealand.
  • Be aware of ethical risks associated with AI and take approaches that address these risks when implementing AI tools, such as ensuring data security.
  • Educate Australian and New Zealand surgical patients and staff on the use of AI, including its benefits and limitations.
  • Produce guidelines specifically relating to the use of AI by surgical services within Australia and New Zealand.
  • Promote broad collaboration between the surgical services of Australia and New Zealand to ensure safe AI use at a national scale.

Image credit: iStock.com/whyframestudio
