“IT’S like Rolls-Royce building a full simulation of an engine, with millions of parts. They build a model and can simulate all kinds of failure and tolerance rates,” says Professor Ian Simpson.
The Edinburgh University academic is providing an analogy for digital twin technology and explaining how artificial intelligence (AI) is revolutionising healthcare.
To clarify, a digital twin is a computer model that simulates an object or process in the physical world. With the help of AI, virtual models of human hearts can now be built using data from their real-world counterparts on genes, proteins, cells and whole-body systems.
“With digital twins in healthcare, we can build something like a heart, a heart model, and see what happens if, say, a heart is bigger, or a valve is faulty,” says Simpson, a professor of biomedical informatics and director of an AI centre. “All of these things are happening with simulations in computers and you can learn things about the system and see whether they are valuable in practice.”
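To make the idea concrete, here is a deliberately simple sketch of the approach – a toy model invented for illustration, not anything the Edinburgh team actually uses: give a virtual heart a handful of adjustable parameters, then re-run it with a “faulty valve” and watch the output change.

```python
# Toy "digital twin" sketch: a deliberately simplified heart model.
# The parameters and formula are invented for illustration only -
# real cardiac twins couple far richer physiological and imaging data.

def cardiac_output(chamber_volume_ml=120, ejection_fraction=0.6,
                   valve_leak_fraction=0.0, heart_rate_bpm=70):
    """Litres of blood pumped per minute for a given set of parameters."""
    stroke_volume = chamber_volume_ml * ejection_fraction * (1 - valve_leak_fraction)
    return stroke_volume * heart_rate_bpm / 1000  # ml per minute -> litres per minute

# "What happens if a valve is faulty?" - re-run the model with one parameter changed.
for leak in (0.0, 0.2, 0.4):
    print(f"valve leak {leak:.0%}: {cardiac_output(valve_leak_fraction=leak):.1f} L/min")
```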
For many people, AI is ChatGPT, the text generator credited with starting the recent AI boom. To others, it’s the realm of science fiction, with robots replacing humans. Unsettling for some. But AI is becoming normal, a part of everyday life, and scientists are increasingly using the technology with the aim of improving Scotland’s health.
It’s a veritable technological revolution. But most people won’t know that AI in Scotland can be traced to the top-secret home of Second World War codebreakers. Indeed, the history of Scottish AI stretches back more than 60 years.
It was in 1965 that a research group was established at Edinburgh University under the leadership of Donald Michie. He’d been a member of a code-breaking group at Bletchley Park, and had worked as a cryptographer alongside Alan Turing, the famous mathematician and computer scientist.
Described as the father of British research into AI, Michie devoted his life to the development of computers to perform complex, human-like tasks.
By 1970, Edinburgh University was one of the few centres in the world working on AI and today – more than 60 years after Michie’s research began – the institution hosts dozens of scientists using the technology.
“We’ve been very much focused on health, public disease and patient data for the last five years, and trying to learn new things about diseases and how they progress,” says Simpson.
“We already have 57 PhD students here doing research, and we’ll have another 60, so it’s pretty massive. In total, we’re looking at nearly £20m in investment. We collaborate inside Edinburgh and across Scotland. It’s not just scientists – we’ve got nurses and clinicians learning about AI here, which is quite unique.”
In some quarters, AI is viewed as a game-changer. Its advocates include former UK prime minister Tony Blair who argued last week that AI “is the only answer to Britain’s productivity challenge”. He said it could help to “turbo-charge growth” in the UK, while AI in healthcare would bring new treatments for everything from cardiovascular disease and cancer protection to obesity drugs. “We need this urgently,” Blair said.
Is AI a panacea for the NHS?
So, can AI fix the NHS? Are we witnessing a technological revolution in Scottish healthcare? Can AI solve problems such as staff shortages and reduce waiting times for diagnosis and treatments? What are the challenges?
As part of the Sunday National’s Solutions for Scotland project, in partnership with The Ferret, we’ve been speaking to experts within Scotland’s AI sector to gauge what progress has been made, and what the future holds.
A specialised area of computer science, AI is based on data and algorithms – sets of instructions used to perform tasks such as analysis or calculations. AI is defined by the Scottish Government as “technologies used to allow computers to perform tasks that would otherwise require human intelligence”.
AI systems are created by gathering large amounts of data and using it to “train” a computer program, or algorithm, to make decisions. AI algorithms can, for example, create “risk scores” to predict which patients might develop certain diseases.
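In practice, “training” and “risk scores” can look something like the following minimal sketch. The data, column meanings and model choice here are invented for illustration; real clinical models are built on far larger, carefully governed datasets and validated extensively.

```python
# Minimal sketch of a "risk score" model, using invented toy data.
from sklearn.linear_model import LogisticRegression
import numpy as np

# Each row: [age, systolic blood pressure]; each label: 1 = went on to develop the disease.
X = np.array([[45, 120], [62, 150], [70, 160], [38, 115], [66, 155], [50, 130]])
y = np.array([0, 1, 1, 0, 1, 0])

model = LogisticRegression(max_iter=1000).fit(X, y)   # "training" the algorithm on the data

new_patient = np.array([[58, 145]])
risk = model.predict_proba(new_patient)[0, 1]         # probability-style risk score
print(f"Predicted risk score: {risk:.2f}")
```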
Several AI applications are already used in clinical practice in NHS Scotland, including tools for assessing paediatric bone growth, delivering radiotherapy and recognising speech.
The Gemini project
GEMINI, a pilot project which developed and tested AI for breast screening, recently finished at Aberdeen University. Researchers hope it can be used safely in hospitals and help to address staff shortages and capacity issues.
Gemini is a collaboration between the University of Aberdeen, NHS Grampian and a private company called Kheiron Medical Technologies, which created AI software called Mia.
The pilot involved analysing 220,000 mammograms from more than 55,000 people to determine how well Mia could detect breast cancers. Currently, two experts examine each mammogram and decide whether someone should be invited back for additional investigations. Similar to a human expert, Mia can examine a screening mammogram and offer an opinion, which frees up time for staff.
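The logic of using AI as one of the readers can be sketched roughly as below. The arbitration rule is an assumption made for illustration; it is not Kheiron’s software or NHS Grampian’s actual protocol.

```python
# Illustrative double-reading workflow with AI acting as one of the two readers.
# The decision rule is a made-up example, not the Gemini project's clinical pathway.

def screening_decision(human_reader_recall: bool, ai_recall: bool) -> str:
    if human_reader_recall and ai_recall:
        return "recall for further investigation"
    if not human_reader_recall and not ai_recall:
        return "routine: invite again at next screening round"
    return "disagreement: refer to a second human reader for arbitration"

print(screening_decision(human_reader_recall=False, ai_recall=True))
```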
The Ferret spoke to Aberdeen University’s Dr Clarisse de Vries who worked on Gemini. “The overall problems we have include a shortage of radiologists and the increasing numbers of people needing healthcare,” she tells The Ferret.
“An ageing population means that there are more women who fall into the ages of 50 to 70. Overall, the NHS is under pressure. One of the ways to solve that is to get more radiologists, and more screen readers. The other alternative is AI, and there’s quite a lot of promise for artificial intelligence to take over some of the jobs.”
The Gemini team found that Mia would have suggested recalling 34.1% of women who went on to develop cancer in between screenings. Using current screening measures, these cancers would have remained undetected until the women developed symptoms.
“All of that was positive,” De Vries says. “The potential for AI to help support screening by detecting additional cancers and reducing workload is definitely there.”
However, there are problems to solve. These include AI’s inability to cope with changes to mammography imaging, which means the algorithm needs to be recreated if the clinical setting changes. Due to the amount of data that needs to be collected, this process can take several months.
Nevertheless, De Vries remains optimistic that science can overcome these technical challenges. She points out that AI is already used in Denmark for screening. “I don’t want to put a timeline to it but in 10 years, I think it might be implemented here in the UK too,” she adds.
AI and cancer detection
ELSEWHERE in Scotland, there are other AI healthcare projects ongoing. James Blackwood, an AI expert, has worked on 65 projects and advises health boards on how to use AI.
He cites a number of initiatives under way – bone-fracture and lung cancer detection, breast screening, MRI acceleration, reducing the number of stroke deaths and diagnosing heart failure within communities using AI. This is important, he explains, because if there are not enough sonographers in the community, “people need to be admitted into A&E to determine if they will have a heart attack”.
Blackwood is also involved with a project trying to diagnose skin cancer within 25 minutes of a lesion being pictured, and he mentions initiatives in Grampian and Glasgow which use AI to detect signs of lung cancer, citing evidence that the time to diagnosis can be reduced from up to 64 days to just 24.
“In terms of lung cancer, every week you get an earlier diagnosis equates to 1% improvement of survivability,” Blackwood says. “But we are also finding that about 600 more people in Scotland who have lung cancer have it detected by AI.
“AI detects the really small types of cancer that a radiologist may miss routinely. So that means 600 more people who are likely not to die every year – because of AI.”
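Read very loosely, those figures can be combined in a back-of-envelope way, assuming the “just 24” reported in the Grampian and Glasgow work refers to days and taking Blackwood’s one-per-cent-per-week rule of thumb at face value. This is an illustration, not a published estimate.

```python
# Back-of-envelope only: combines the quoted rule of thumb (1% better survivability
# per week of earlier diagnosis) with the reported reduction in time to diagnosis,
# read here as 64 days down to 24 days. Neither figure is a formal estimate.
days_saved = 64 - 24
weeks_saved = days_saved / 7
improvement_pct = weeks_saved * 1   # 1% per week, per the quote
print(f"~{weeks_saved:.1f} weeks earlier, ~{improvement_pct:.0f}% better survivability")
```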
Blackwood explains that “diagnostic AI” predominates at the moment – this operates with images and aims to give a clinical decision. But there’s also “non-diagnostic clinical AI” which, he explains, is trying to understand issues such as readmission patterns and discharge management – so services can be planned better to get a patient out of hospital quicker, and for better communication with social care and local authorities.
AI research takes time, though. Generally speaking, Blackwood says, the time from concept to procurement is between five and 10 years. He feels that “good progress” is being made in finding solutions to problems within the NHS, but argues that Scotland is “lagging behind many countries” due to a “lack of leadership”.
Blackwood used to work for the Scottish Government but left because he didn’t believe it was taking AI “particularly seriously in healthcare”. He claims there are top scientists really pushing for research, but they don’t have money. “If you cannot release cash, then your project cannot go ahead. There’s nothing really being funded from operational or capital expenditure. Without that, you can’t adopt anything.
READ MORE: Multi-million pound contract awarded for Rest and Be Thankful investigations
“We lack somebody standing up and saying it’s important that we adopt AI in healthcare,” he claims. “We also lack any plan for adopting AI and we lack any form of co-ordination,” he adds, arguing that the “federal structure for healthcare in Scotland” means each health board decides what it wants to do. “What this creates for us is three or four boards evaluating exactly the same type of AI, which is a total waste of resources.”
Currently, the Scottish Government says it “aims to use existing leadership structures to promote the use of AI, rather than appointing an NHS AI champion”. The Government said in 2021 it would provide £20 million to develop AI but it has “since re-prioritised resources elsewhere due to fiscal pressures”.
Blackwood says complexity is another challenge. He says people assume it is easy to introduce AI into the NHS but points out the technology changes somebody’s job role, and clinical protocol, and has to be integrated into the system. “It’s a major change management initiative and, of course, you have to go through the bureaucratic process trying to get it through governance.”
Concerns over AI and data security
THERE are concerns among patients about AI replacing humans, including fears of losing the emotional element of the doctor-patient relationship. Data protection is also a challenge, given that AI requires large amounts of data to work accurately, and there are worries over hacking and cybersecurity, privacy and accuracy.
In 2021, researchers at Imperial College London found patients’ concerns over AI outweighed their perception of its benefits. The study highlighted concerns over job losses among radiologists, and over who might take responsibility if AI gets it wrong.
A report by the Scottish Parliament Information Centre (Spice) last month acknowledged such issues. It said that “sometimes new technologies fail to deliver desired benefits”, and that the impact AI will have on the cost of healthcare is “uncertain”. The large quantity of data needed to train AI, together with the fact that most AI development involves private companies, “raises questions relating to data protection and privacy”, Spice also noted.
There have been examples of patient data being wrongly shared. In 2017, the Information Commissioner’s Office (ICO) found that the Royal Free Hospital, in London, had breached UK data protection law by giving patient data to the AI company DeepMind for the purpose of testing a kidney failure detection app.
Spice cited this data breach in its report but said the failure was one of compliance with data protection law rather than a problem with the use of AI itself. Spice said there is ongoing work in the public sector to improve data sharing and storage. “The intention is to build AI systems that would rely on already available data in closed environments. This reduces the risk of data breaches,” it added.
So, where does the data for AI come from and how is it protected? Albert King is chief data officer at NHS National Services Scotland (NSS), a public body providing data for health and social care in Scotland. The data, King says, is the “fundamental fuel that drives AI” and is collated from health boards and local government. NSS brings those datasets together in a platform, where they are owned and managed by NHS staff working in his team or at Public Health Scotland.
“It has been really clear that ministers will not be selling public data,” King says. “We also have a legislative regime that we have to comply with, and technological measures. We hold ourselves to the highest standard and any uses of data are scrutinised by the national public benefit and privacy panel that includes experts that ensure compliance. That is all recognised in the Scottish Government’s health and social care data strategy.”
King acknowledges there is “huge excitement” about the potential for AI but thinks there also needs to be a “measure of realism”. He says it is “not that straightforward sprinkling a little bit of that AI magic onto a problem” and points out, as James Blackwood did, that it takes time to move from innovation to adoption.
“I also think about investment in skills, not just in engineering and data skills but also in the ethics and the skills of the workforce,” King adds. “To make sure they have confidence using this technology, interpreting the feedback and results you get from these tools. Generally making sure we use those tools confidently, as well as being safe and secure.”
The future of AI
SO, what might the future look like in 50 years’ time, more than a century after codebreaker Donald Michie got the AI ball rolling?
King says “the spark is lit” and that “opportunities to automate AI and release it for clinicians, nurses and front-line workers are really, really substantial”. De Vries believes that AI used in screening is perhaps a decade away, while Blackwood thinks we might be able to achieve “systemic AI”.
“That’s where AI is orchestrated by humans, where you tell it you need it to do this and this,” he says. “With things like ChatGPT and new technologies, what we might see in 50 years is that you will have a radiologist AI, pathologist AI or GP AI. Then you have an AI that co-ordinates the three different types of AI and effectively it will do a lot of the testing and diagnostic work and it will be able to present that back without anyone ever telling them to do it. This would be transformative.”
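What Blackwood describes might be sketched, very loosely, as a coordinator dispatching work to specialist systems and collecting their findings. The “specialist AIs” below are stand-in stubs invented for illustration; no real radiology, pathology or GP systems sit behind them.

```python
# Toy sketch of "systemic AI": a coordinator dispatching work to specialist models.
# The specialist functions below are stubs standing in for real diagnostic systems.

def radiologist_ai(case):
    return {"imaging": "nodule flagged on chest X-ray"}        # stub

def pathologist_ai(case):
    return {"pathology": "biopsy sample: no malignant cells"}  # stub

def gp_ai(case):
    return {"history": "persistent cough, eight weeks"}        # stub

def coordinator(case):
    """Run each specialist, gather findings and present them for human review."""
    findings = {}
    for specialist in (radiologist_ai, pathologist_ai, gp_ai):
        findings.update(specialist(case))
    return findings

print(coordinator({"patient_id": "example-001"}))
```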
Edinburgh University certainly hopes to transform patient care, and – 60 years after codebreaker Michie’s work began – it remains in the vanguard. “It’s only in the last five to 10 years that the data you need has been available,” says Professor Simpson, who is “very excited” about the potential of digital twins and virtual organs. “It’s very early days for that – but it’s fantastic that it’s based here.”
Additional research by Leah Flint