AI: to boldly go…

AI is the next frontier facing doctors. Is it a friend or foe?

Eric Martin reports


The launch of OpenAI’s ChatGPT in November 2022 propelled the technology to the forefront of the collective consciousness. The statements issued earlier this year by experts and industry leaders – one signed by none other than Elon Musk – calling for development to be put on hold have ensured that AI is one of the hottest issues of our times.

In July, the RACGP issued a statement calling for greater government oversight of the Australian AI industry, with President Dr Nicole Higgins declaring that “Australia’s GPs need to be empowered to adopt AI that is responsibly developed and regulated, so we can improve our patients’ lives.”

Yet despite the hype, AI can still seem remote to many in the medical community, where the opportunities for practical applications that assist clinicians in daily diagnosis and treatment may be hard to quantify.

In fact, experts have criticised the slow approach that Australia has adopted, stating that the “national conversation on AI in health care has for now remained niche and low in priority.”

“With AI’s many opportunities and risks, one would think the national gaze would be firmly fixed on it,” Enrico Coiera, an NHMRC investigator from the Centre for Health Informatics at Macquarie University, said in an article published earlier this month in Perspectives.

“However, Australia lags most developed nations in its engagement with AI in health care and has done so for many years. The policy space is embryonic, with focus mostly on limited safety regulation of AI embedded in clinical devices and avoidance of general-purpose technologies such as ChatGPT. 

“There is currently no national framework for an AI-ready workforce, overall regulation of safety, industry development, or targeted research investment.” 

Mr Alex Jenkins

Medical Forum spoke to Director of the WA Data Science Innovation Hub, Mr Alex Jenkins, after last month’s 2023 Data & AI for Business Conference & Exhibition, held in Perth, to ask what’s happening with AI in the medical field.

People power

“Part of my role is to ensure that we promote and advocate for the people doing the work, so that the public have some visibility into this research. For example, there is a local medical company using AI for the diagnosis of coronary disease and they are doing some amazing work,” he said.

“There is also a medical imaging platform under development, which is being designed to operate inside a health facility to run AI and machine learning experiments.”

While there are an ever-increasing number of platforms being released, much of the development research is driven globally.

“There have been some eye-opening advances over the past few years, but probably the biggest impact in the field of life sciences so far has been AlphaFold, an AI that solves the protein-folding problem,” he said.

“Protein folding, or understanding how that sequence of amino acids is structured in 3D, based on all the intermolecular forces, has been an active and open problem for 40 years.

“It is an important aspect of biochemistry for drug discovery and pharmacology, and it used to take a single PhD student about three years to figure out the structure of one single protein – it now takes AlphaFold about 15 seconds to find it.

“Three years to 15 seconds. And to the best of my knowledge, we had less than 200,000 experimentally verified protein structures around 2016, which went to 2 million and then 200 million in the space of a couple of years – so you can see the significance. 

“This is a problem that we have been trying to solve with computers for over 30 years using traditional algorithms, and there is such a complex interplay of all the different molecular forces that we have never been able to crack until now.” 

Alex explained that AI was driven by a type of software called a neural network, loosely inspired by the human brain and the way neurons worked together. 

“It’s important to say it’s only inspired by the way that human cognition works because we don’t have a great picture of that,” he clarified. 

“And it is a much-simplified model of a neuron, but it is a parallel way of computing, and it requires an incredible amount of computing power, much more computing power than traditional software would require.

“You can think of traditional programming like baking a cake by following a recipe: you have a specific set of ingredients and a specific set of steps, and you must follow the steps, very linear, very predetermined, and then you have a cake at the end of it. 

Learn on the job

“With AI, we do not explicitly tell the algorithm what to do. It must learn the association or the representation of what we are doing from the data, from reward outcomes, or from correlations that we signal are important. 
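The recipe-versus-learning contrast can be sketched in a few lines of code. This is an illustrative toy only – the temperature readings, the 38°C rule and the single-“neuron” learner below are assumptions made for the example, not a clinical tool or anyone’s actual method.

```python
# Traditional programming: the rule is hand-written, like following a recipe.
def fever_rule(temp_c):
    return temp_c >= 38.0   # threshold fixed in advance by the programmer

# Learning from data: a single artificial "neuron" infers its own threshold
# from labelled examples (hypothetical readings; 1 = feverish, 0 = not).
data = [(36.5, 0), (37.0, 0), (37.4, 0), (38.2, 1), (39.0, 1), (40.1, 1)]

w, b, lr = 0.0, 0.0, 0.05             # weight, bias, learning rate
for _ in range(200):                  # repeated passes over the data
    for temp, label in data:
        x = temp - 37.0               # centre the input so learning is stable
        pred = 1 if w * x + b > 0 else 0
        err = label - pred            # the error signal drives the updates
        w += lr * err * x             # nudge the weight toward fewer errors
        b += lr * err

# The learned rule now agrees with every labelled example.
print(all((1 if w * (t - 37.0) + b > 0 else 0) == y for t, y in data))  # True
```

Nobody ever told the learner where the cut-off was; it recovered a threshold near 38°C purely from the labelled examples – the association learned “from the data” that Alex describes.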

“Some of the best outcomes in AI occur when we get a data scientist to cross-pollinate their input with that of a biochemist or an engineer, or a geologist. And if you look globally, that is where amazing things are happening in AI. 

“Health is a massive area for that development because it requires such a significant level of exploration – the real deep-domain specific problems that AI excels at.”

Alex said that the other application for AI, which was “extremely promising” in health, was synthetic data, and then promptly asked if I had seen one of the artificial intelligence models that create images from text.

“That might seem like a toy – it is all a bit of fun. But those same algorithms that create these photorealistic pictures from nothing can be used to create synthetic data, be it a fake MRI or fake tabular data, that is statistically representative of the original data set,” he said.

“You can immediately see the application in how we share health data, bypassing the current access hurdles while ensuring people’s privacy is protected. It is exceedingly difficult sometimes for researchers to access medical data, and rightly so, and this just sidesteps the whole issue by using what is effectively an anonymous data set.

Synthetic data

“If you look at the individual records within these data sets, they are all nonsense. But if you look at it as a whole, you’ll find the same statistical correlations as the original data set.”

Once the synthetic data was there, multiple teams around the world could work on the same problem, with no need for specific knowledge of a patient. 

“Just think about how much data is locked up inside a hospital system or medical imaging facilities, it would be astonishing if we could open this up for everyone to have a look at,” he said.

“These techniques have only really started to mature recently, and our organisation is going to run a hackathon with the WA Department of Health in October, to use these synthetic datasets. 

“We are going to give them to these bright young PhDs and professionals, to test whether their conclusions from the synthetic data mirror those from the real data.

“There are already researchers at King’s College London who have been using it to create synthetic MRIs of brains, which has exciting potential.”

He explained that the concept of a hackathon was to take a problem, then take a data set from that problem, and open it up to a wider community of professionals with fresh eyes. 

“Usually, they are software developers or people who have experience with data science. We just did a hackathon with fire and emergency services, and it is a wonderful way to get innovative ideas and new types of solutions to old problems.”

Alex explained that development in AI was often fast-tracked through the open-source community. 

“When it happens out in the open, it is astonishing how fast it happens. Problems just seem to drop week by week. It is a new way of doing science, it seems, and if we can open up health data in a comparable way, we expect research outcomes to accelerate,” he said.

“The other interesting concept under development is the idea of using an AI-based preclinical analysis to speed things up. Currently, people are required to fill in this enormous form about their personal details and medical history when they visit a new GP.

“Why not have an AI facilitate that process and then ask about their ailment, such as what they have done and how they are feeling, and then – without making a clinical judgment – the AI could summarise the patient’s symptoms and specific aspects of why the patient has presented, so that the doctor does not have to ask the simple questions?

“One way for doctors to think of the use of AI is to consider how a doctor goes through this enormous amount of training, then might work for 10 to 15 years before they are called a real expert in the field. 

“Now, if you can imagine, not just having 20 years of training in one field but 20,000 years, or even 100,000 years, of training across multiple fields – the range of ailments that could be screened is vast, including rare diseases.

AI backup

“It is like a second opinion from someone who is extraordinarily experienced across all medical domains – a clinical assistant that could accelerate what a clinician is able to achieve and the number of patients that they’re able to see over time.”

Alex said a surprise had emerged with the uptake of ChatGPT: people were finding it useful as a self-psychotherapy tool.

“That’s intriguing because they tell things to a computer with no fear of judgment from the ‘person’ on the other side of that conversation,” he said.

“And while there are no official recommendations for that, I think it is a fascinating area for development that we will see emerge, but it is obviously no substitute for human relationships.

“Which comes back to the point that, even though it seems to be useful for psychotherapy, much of the job of a GP is to work with patients and make a judgment about their situation. 

“Whether they’re telling the truth or if there’s some information that’s being withheld or maybe there’s a domestic situation: those are the things that an AI is not going to be able to pick up, but they are critically important parts of being a doctor.”