If the cancer is located only in the breast, the 5-year survival rate of women with breast cancer is 99%. Sixty-two percent (62%) of people with breast cancer are diagnosed with this stage. If the cancer has spread to the regional lymph nodes, the 5-year survival rate is 85%. If the cancer has spread to a distant part of the body, the 5-year survival rate is 27%.

Editorial Board, “Breast Cancer: Statistics” at Cancer.Net
But the trouble is, breast cancer is very common and finding it early requires radiologists to stare at, possibly, hundreds of mammograms a day. No matter what training they’ve had, that’s hard work to do consistently well. Statistically, radiologists miss roughly one in five breast cancers.
Researchers found that the AI system reduced false positives by 5.7 percent for US women — a significant improvement, when you consider how distressing it would be to be told you have cancer when you actually do not. It also reduced false negatives by 9.4 percent, meaning it caught instances of cancer that would’ve otherwise gone undetected.

Sigal Samuel, “AI can now outperform doctors at detecting breast cancer. Here’s why it won’t replace them.” at Vox
Study co-author, Dr. Mozziyar Etemadi, summarized the results:
“While this is exciting, early-stage research, validation in future trials is needed to better understand how models like these can be effectively integrated into clinical practice,” said Northwestern study co-author Dr. Mozziyar Etemadi in a statement. “In some examples, the human outperforms the AI and in others, it’s the opposite. But the ultimate goal will be to find the best way to combine the two—the magic of the human brain isn’t going anywhere any time soon.”

Katie Collins, “Google Health’s AI can spot breast cancer missed by human eyes” at cnet
But doctors also catch things that the AI misses. The programmers might want to train the AI to deliver fewer false negatives, even at the cost of more false positives. Using the AI to highlight what may be cancer tissue leaves the critical decision and analysis to the radiologist. With a reduced workload, the radiologist can focus on the ambiguous cases, reducing the chance of missing cancers in the critical early stages when they are comparatively easy to treat.
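This trade-off usually comes down to where the decision threshold sits on the model’s output score. A minimal sketch in Python, with made-up scores and labels purely for illustration (none of this reflects the actual Google Health model):

```python
# Hypothetical model scores (estimated probability of cancer) and true labels.
scores = [0.05, 0.20, 0.35, 0.55, 0.60, 0.80, 0.95]
labels = [0,    0,    1,    0,    1,    1,    1]   # 1 = cancer present

def confusion(threshold):
    """Count false negatives and false positives at a given threshold."""
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < threshold)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= threshold)
    return fn, fp

# A strict threshold misses a cancer; a lenient one raises a false alarm
# instead — the error never disappears, it only moves between columns.
print(confusion(0.5))   # (1, 1): the 0.35 cancer slips through
print(confusion(0.3))   # (0, 1): no missed cancers, one extra flag to review
```

Lowering the threshold is exactly the “more false positives, fewer false negatives” tuning described above: every extra flag is cheap if a radiologist reviews it, while a missed cancer is not.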
Another challenge for doctors and other health care staff is coding medical diagnoses—assigning numbers so as to transmit correct information. Correct and informative coding affects everything from subsequent treatment to insurance payments to the medical statistics that guide public policy. But getting codes right is really hard: Many conditions exhibit overlapping symptoms and the coding standard—ICD-10—has over 155,000 different possible codes. In the US version, roughly 70,000 apply to diagnoses alone.
As the number of codes has increased, medical staff have increasingly relied on “code books” to guide them in decision-making. Even so, interpretative problems can cause miscoding. Here’s how AI can help:
When [Elicilene] Moseley[, an 11-year veteran medical coder in Florida,] first began to use AI-enhanced coding a couple of years ago, she was suspicious of it because she thought it might put her out of a job. Now, however, she believes that will never happen and human coders will always be necessary. Medical coding, she notes, is so complex—so many variables, so many specific circumstances—that she believes it will never be fully automated. She has effectively become a code supervisor and auditor, checking the codes the system assigns and verifying that its recommendations are appropriate for the specific case. In her view, all coders will eventually transition to roles of auditor and supervisor of the AI-enabled coding system. The AI system simply makes coders too productive to not use it.

Thomas H. Davenport and Steven Miller, “The Future Of Work Now—Medical Coding With AI” at Forbes
Moseley finds that AI helps her work faster but, like Watson, it can make choices that “make no sense.” For Moseley, the fast, good results come when she works together with the AI.
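The workflow Moseley describes—the system proposes, the human coder audits—can be sketched as a suggestion function over a code table. The three codes below are real ICD-10-CM codes, but the keyword-overlap ranking is a deliberately crude stand-in for whatever the commercial system actually does:

```python
import re

# A tiny, hypothetical slice of a code table
# (the real US diagnosis code set runs to roughly 70,000 entries).
CODES = {
    "J10.1": "influenza with respiratory manifestations",
    "J45.909": "unspecified asthma, uncomplicated",
    "I10": "essential (primary) hypertension",
}

def _tokens(text):
    """Lowercase a string and split it into alphabetic words."""
    return set(re.findall(r"[a-z]+", text.lower()))

def suggest_codes(note, top_k=3):
    """Rank codes by crude keyword overlap with the clinical note."""
    words = _tokens(note)
    scored = sorted(
        ((len(words & _tokens(desc)), code) for code, desc in CODES.items()),
        reverse=True,
    )
    return [code for score, code in scored if score > 0][:top_k]

# The coder, acting as auditor, reviews a short ranked list instead of
# searching tens of thousands of codes from scratch.
print(suggest_codes("patient presents with asthma, uncomplicated"))
# → ['J45.909', 'J10.1']
```

The point of the sketch is the division of labor, not the ranking method: the machine narrows 70,000 options to a handful, and the human makes the call—the same pattern as the radiologist reviewing AI-flagged mammograms.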
We design tools to extend our reach and help us where we’re weak. It’s good that medical technology researchers see AI that way rather than as a replacement for doctors.
If you enjoyed this piece, you may also enjoy these by Brendan Dixon:
AI should mean thinking smarter, not less. We should be all the more engaged when we use technology
Machines can’t teach us how to learn. A recent study used computer simulations to test the “small mistakes” rule in human learning.
Just a light frost? Or AI Winter? It’s nice to be right once in a while—check out the evidence for yourself
I am giving up cycling It’s just not worth it if a machine can beat me