I honestly take issue with using ChatGPT to write medical software. I don’t know what your exact process is like, but I hope you’re giving the code it generates extra scrutiny to make sure it really does what you put in the prompt. It kinda feels like the judge who used ChatGPT to decide whether to grant or deny bail.
> I honestly have issue with using ChatGPT to write medical software.
GP is talking nonsense. No developer is ever going to be able to say "not my fault, I used what ChatGPT gave me", because even without reading the OpenAI license I can all but guarantee that its highly paid lawyers made sure the terms and conditions discharge all liability onto the user.
GP appears to think that if he sells a lethally defective toaster, he can simply tell the buyer to make all claims against an unknown and impossible-to-reach supplier in China.
Products don't work like that, especially in life-critical industries (I worked in munitions, which has similar, if not more onerous, regulations).
I'm sure it happens all the time; all people and processes are fallible.
But that's also why documentation is so important in this space.
I spent 15+ years building software for pharmas that was subject to GxP validation, so I know the effort it takes to "do it right", but also that it's never infallible. The main point of validation is to capture evidence that you followed the process, not to prove that the process itself can't fail.