Impact of AI on the Practice of Medicine: A Physician's Perspective
To distinguish ourselves from AI in clinical medicine, we must become better teachers, capable of explaining things on a level and in a way that patients understand and relate to
We are at an age when what used to be science fiction is starting to approach reality. When it comes to AI in medicine, I think of Star Trek, where the doctor holds a scanner over the patient and the computer can make a diagnosis and formulate a treatment plan after just a brief scan. The advent of AI will bring changes to numerous industries, and many people look at that with apprehension, wondering whether they will still have a job next year. The purpose of this essay is to offer my prediction of the impact of AI on the practice of medicine. I have been a practicing nephrologist for over 20 years, spending most of my time in direct face-to-face patient care along with some resident teaching.
The history of technology in medicine reminds me of the onset of the EMR. First it facilitated looking up lab and diagnostic test results; next came note-writing templates, then dictation software, and then telemedicine, especially during the pandemic. When it comes to documentation, the EMR and existing technology certainly put more information into physician notes, but they also make the notes so much longer and more burdensome to read that they actually communicate medically relevant information less effectively. Dictation software does reduce the time physicians need to translate their thoughts into a legible format, but its error rate makes the correction effort burdensome. Telemedicine improves access for patients and allows physicians to see more patients in a given time, yet a telemedicine visit cannot match a face-to-face visit when complex information needs to be explained and a difficult decision needs to be made. Technology does tend to improve efficiency and productivity, but it comes with drawbacks, and in no case has technology been able to replace a physician.

In terms of pay and career satisfaction, modern-day physicians consistently score lower than physicians 30 years ago. So, despite technological advancements improving physician productivity, why are modern-day physicians paid less and overall less happy than physicians 30 years ago? I suppose that insurance companies, CMS, and society in general are also adapting to using technology to control healthcare costs, and part of the savings comes from paying physicians less. Upon the adoption of the EMR, one of the main complaints from physicians was its demand for their attention, which took away the time and attention they had previously afforded to patients. Patients may also complain that the office visit is conducted with the physician 'hiding' behind a computer screen instead of listening to their concerns. So the EMR and technology in the medical field have the potential to diminish the satisfaction of the clinical experience for both patient and physician.

A critical concern people have about AI is whether it will take over their jobs and make their skill set obsolete. To answer that question, it is necessary to analyze the respective strengths of AI and of humans. The strengths of AI include (a) breadth of knowledge, (b) memory, (c) organization and curation of information, and (d) speed. A human's strengths are (a) ethics, (b) the ability to instill trust (an emotional bond or connection), (c) the ability to weigh conflicting information differently to gain a better understanding of a problem (insight), and (d) wisdom.

Let's analyze the quality of wisdom further. By definition, wisdom is knowledge gained by experience. Another definition of wisdom is the ability to foresee or predict consequences and ramifications based on one's previous experience. Can AI learn wisdom? I don't think so at its current stage, as one cannot gain wisdom simply by being a voracious reader. AI excels at tasks with a defined objective, but it is poor at evaluating the consequences of its recommendations from a different perspective (empathy). For instance, let's assign AI the task of acquiring a company with the sole objective of maximizing profit or investment return. It may very well recommend a leveraged buyout of the company, such as one carried out by a private equity group.
It won't care if such an action carries a 90 percent chance of eventually leading to the dissolution of the company and the loss of jobs for all its employees. A wise human may not carry out such an action because he may view it as unethical.

Now, let us apply the above understanding to the practice of medicine. AI may be very good at offering differential diagnoses based on textbook symptoms. AI may be very good at diagnosing a 20-year-old with abdominal pain and a certain set of laboratory values. But for an 80-year-old patient who comes in with abdominal pain and a certain set of laboratory values, complicated by comorbidities, AI may not do such a good job. That is because AI relies on associations with textbook symptoms and diagnostic test results, and an elderly patient with comorbidities presents neither textbook symptoms nor clear diagnostic test results. In the latter situation, a human doctor with extensive experience will do better. Making a diagnosis is one thing, but getting a patient to trust a doctor and follow his or her diagnostic and treatment plan is an entirely different matter. I think an experienced human doctor will do much better in that department than AI.

How could this understanding of the strengths and weaknesses of AI impact the education of our next generation of doctors? In the future, most intellectual professionals will learn to delegate mundane tasks to AI, freeing up time for us humans to do the things we tend to do better than machines. So what kinds of things do humans do better? In clinical medicine, a good doctor is one who routinely educates patients in ways that lead them to believe in, understand, and trust their physician. A good doctor gives understanding to patients in ways that free them from anxiety and uncertainty. To distinguish ourselves from AI in clinical medicine, we must become better teachers, capable of explaining things on a level and in a way that patients understand and relate to. That requires empathy, wisdom, insight, and other human qualities. If we can achieve that, then we do not have to be concerned about AI replacing our doctors.