Absolutely agreed on radiology. Earlier this year 3 different radiologists read my c-spine MRI and said I have slight bulging on C5-6 and nothing more. 5 different neurosurgeons* said sure, the C5-6 is bad, but the real issue is the C6-7 herniated disc impinging on my nerves. I actually asked 2 of those 3 radiologists to re-read the MRI and look for the C6-7 herniation, and they couldn't find anything. All of the surgeons picked it up immediately.
I was told by a surgeon that this happens because radiologists are generalists (looking for strong evidence of different types of issues all over the body) while surgeons are trained to know the specific issues that happen in few parts, even if they don't show up clearly in MRI/x-rays.
AI should be able to take data from all the specialists to make a better generalist than human-trained radiologists. Integrated AI system should immediately read an MRI/x-ray/ultrasound and spit out possible issues. I can imagine an x-ray or ultrasound video feed hooked to the cloud that shows in real-time possible diagnoses and highlights the areas of concern. Ultrasounds are safe and this could even be a consumer device. Just like 3D-ultrasounds and 23&Me are for 'entertainment' and not medical solutions, ultrasound-with-AI can be a good tool for at-home what-ifs. It could be a great prenatal monitoring device.
* I know a lot of surgeons personally. Didn't cause a trillion dollar insurance claim.
As a doctor (not a radiologist), I believe your example shows the opposite, that is, it shows how hard it will be to automate radiology.
Radiology requires a "theory of the body", so to speak. You can't just look at the image in isolation. You often need detailed knowledge of the patient's clinical situation, and some actual reasoning. My guess is that that's why the surgeons got it right in this case (they are more familiar with the complaints of the patient and with the "live" anatomy of that region).
This doesn't mean that radiology can't be automated. It just means that to be a good radiologist, you might need to be a general artificial intelligence, capable of graduating from medical school.
This is different from something like classifying moles into benign, malignant, and high-risk. That's something that can be determined from the pixels of a picture (even by human dermatologists, through experience or by following certain simple algorithms), and has no relationship to the rest of the patient. This means that automating mole classification is kinda like automating chess. Automating radiology looks more like automating the command chain for WW2.
On the other hand, pathology (looking at tissue samples through the microscope) seems much easier to automate. It relies heavily on pattern recognition and IMO (I'm not a pathologist either, although I've spent time in a pathology lab) it's less dependent on the clinical data of the patient. It's almost as if the doctor were looking at the image and nothing else, and the kinds of patterns doctors look for are something that might be automated. This is of course a simplification, and sometimes clinical judgement is important even in pathology.
None of this means that medicine can't be automated. I'm just trying to convey some of the difficulties you might have in automating radiology, as opposed to other areas of medicine.
And in any case, my criterion for difficulty of automating is "does it seem to require a general artificial intelligence or not?". If you have a general artificial intelligence completely indistinguishable from a human, then all bets are off.
That's great to hear! Did you have the disc replaced? I have similar pain in L5-S1 due to a disc protrusion affecting the nerves in that region (either via impingement or, more likely, inflammation). Unfortunately, the surgeon I consulted said that surgery is rarely performed that far down the spine unless there are really serious symptoms.
In the meantime, I keep monitoring these studies on mesenchymal stem cells for disc regeneration, hoping one of them makes it to clinical trials :(
I had disc replacement as well as fusion, since C5-6 will likely need surgery in the next few years. I got lucky because the C-spine is much easier to operate on than the lumbar spine: they go in from the front (through the neck) and don't need to touch the spinal cord. For lumbar surgery, they do.