How will artificial intelligence in medicine affect personal injury law? Lederer Nojima explains.
AI in Medicine
Artificial intelligence will increasingly be used as a tool for medical diagnosis. The technology promises to deliver results more accurately and rapidly than a trained doctor working alone. That will be a boon for humankind: diseases will be caught earlier, before they progress too far, and earlier treatment means better outcomes.
However, as with any other technology, AI can fail, leading to harm to the patient. How does one sue for a personal injury if a patient is harmed by a malfunctioning AI diagnostic tool?
MegaGadget suggests that a personal injury suit resulting from the failure of an AI tool will more closely resemble a product liability action than a malpractice suit against a doctor or hospital. One cannot sue the AI tool directly because it is not a person, even though it has taken on functions once performed by a human doctor. However, one can, and likely will, sue the manufacturer of the AI tool if it returns a faulty diagnosis.
In the brave new world of AI medicine, an artificial intelligence diagnostic tool or clinical decision support system (CDSS) based on machine learning that returns a faulty diagnosis would be treated the same as, say, a pacemaker that malfunctioned and placed a patient's life in danger. In most cases, the doctor using the tool would not be at fault.
Learn More About Personal Injury Law and Medical AI
Of course, treating an AI diagnostic tool as a product is a simple way to approach the question of who is at fault when something goes wrong, but it does not cover every case. To stay with the pacemaker example: implants can be, and have been, installed incorrectly, causing personal injury and leaving the surgeon exposed to a traditional medical malpractice suit. An AI tool could likewise be used inappropriately. If it returns results that a human doctor knew or should have known were faulty, and the doctor relies on them anyway, the physician can still be held liable.
For more information, contact us.