AI.Care 2023 kicks off with focus on global standards for innovation
The Australasian Institute of Digital Health’s (AIDH’s) AI.Care 2023 conference has kicked off in Melbourne with German physician Dr Eva Weicken delivering the opening plenary on setting global standards for health AI innovation.
Weicken is involved in the new WHO-ITU-WIPO Global Initiative on AI for Health (GI-AI4H) — an international initiative to promote the global adoption of standardised artificial intelligence solutions in health care, facilitate safe and appropriate use of AI, and ensure it fulfils its potential to support diagnosis and treatment.
Global adoption of standardised AI
“The initiative will ensure that AI solutions for health meet certain requirements, which can assist policymakers and regulators,” she said.
“It is our hope that these standards will also contribute to the democratisation of health care, providing new resources to regions that lack access to health care.”
The GI-AI4H, spearheaded by the United Nations organisations, the International Telecommunication Union (ITU), the World Health Organization (WHO) and the World Intellectual Property Organization (WIPO), was announced at the AI for Good Global Summit in July 2023. It builds on the momentum created by the ITU-WHO Focus Group on AI for Health (FG-AI4H).
The global initiative involves experts from academia, research, AI development, ethics and regulation, as well as policymakers and clinicians, including Australian Professor Sandeep Reddy FAIDH, who has written a textbook on AI in health care.
Weicken is the initiative’s co-chair of operations and co-chair of its Clinical Evaluation of AI for Health working group. The group developed a Clinical Evaluation of AI for Health framework, which provides guidance on best-practice evaluation of AI technologies in health, including a checklist for the clinical evaluation of AI systems.
Clinical evaluation
“Clinical evaluation is fundamental to the safe and effective use of AI health technologies,” Weicken said. “It enables clinicians, patients, regulators and other stakeholders to have the evidence they need to assess the safety, effectiveness and likely value of the technology and its performance in their setting.
“From a clinical perspective, it’s very important these tools are safe, effective, fair and useful. If you want to use AI, it must be certain that it will cause no harm. AI is like any other medical intervention, and it is crucial to weigh its benefits against its risks. Clinical evaluation is the way to effectively demonstrate the intended benefits.”
Weicken studied medicine at Ludwig-Maximilians-University in Munich and completed her residency in neurology, including intensive care and psychiatry rotations. After years of clinical practice, and with the growing presence of AI in medicine, she wanted to dive deeper into the field.
As Chief Medical Officer in the Department of Artificial Intelligence at the Fraunhofer Heinrich Hertz Institute for Telecommunications in Berlin, she focuses her research on the safe, fair and effective use of AI in health, an area that requires an interdisciplinary approach.
The department specialises in explainable AI and applied machine learning, efficient AI methods, and standardisation and quality assessment for AI in health and other domains.
Lack of internationally accepted standards
She said digital health technologies, in particular AI, were progressing rapidly and playing a transformative role in health care, for example through applications in diagnostics, therapy and clinical workflows. AI can assist in addressing the shortage of healthcare professionals, especially in remote regions.
“However, progress in data-driven health solutions is hampered by the lack of internationally accepted standards for quality assessment to ensure their safe, effective and equitable application,” Weicken said.
The Global Initiative on AI for Health continues the work of the ITU-WHO Focus Group on AI for Health (FG-AI4H), which was established in 2018 and concluded in September 2023. FG-AI4H set up working groups on ethics, regulation, clinical evaluation and data specification, which documented best practices and released open-source software for the independent assessment of medical AI solutions, in line with the UN’s Sustainable Development Goals.