Embracing Artificial Intelligence for Better Healthcare

by Chinmay Joshi, Genetics

Artificial intelligence (AI) is rapidly transforming healthcare, offering benefits to both medical professionals and patients. From informing clinical decision-making to optimizing treatment plans and reducing administrative burdens, AI has the potential to improve care delivery and ease physician burnout. While fears of automation displacing healthcare workers persist, the technology is more likely to support physicians by giving them more opportunities to interact directly with patients. Nurses and clinical support staff may face greater disruption, but their roles could evolve toward more patient interaction and a greater focus on holistic care. Patients’ trust in AI tools and proper integration with human oversight are essential for success. With thoughtful implementation and regulation, AI can augment rather than replace clinical care, supporting a more efficient, compassionate healthcare system that prioritizes patient well-being while addressing longstanding workforce challenges.

artificial intelligence, healthcare, automation, clinical decision support


As artificial intelligence (AI) continues to become more salient in the public eye, especially since the explosion of large language models such as OpenAI’s ChatGPT, its role in increasing automation in healthcare warrants careful consideration. Whereas the loss of employment is a major concern when automation is viewed broadly, its application to medicine should be viewed differently because of existing problems in medical care and the upside for patients, who are the ultimate priority. Automation need not kill jobs; it can relieve decades of pent-up pressure in a field plagued by dissatisfied professionals and worker shortages (Owens, 2024). As healthcare executives begin adopting AI and regulators implement guardrails for its use (Lamb, 2024; Schmidt et al., 2024), they should understand the full scope of its potential advantages while considering its impact on existing roles in medicine, the quality of patient care, and how patients perceive its implementation.

The Promise of AI in Clinical and Administrative Practice

Artificial intelligence has demonstrable benefits to both medical professionals and patients across its diverse applications in healthcare. The most eye-catching innovations are in diagnosis and treatment: machine learning algorithms have been shown to be effective at diagnosing breast and skin cancer, pneumonia, and appendicitis from radiography, predicting chronic conditions from genetic data, and optimizing treatments and clinical decision-making based on electronic health record data. Importantly, several of these algorithms matched or surpassed physician performance on similar tasks (Alowais et al., 2023). Beyond medical applications, AI can be, and already is being, used to address the large administrative workload associated with clinical practice. Record-keeping has proven a great burden to physicians, whose clerical responsibilities have grown with the widespread implementation of poorly designed electronic health record software, contributing to physician burnout (Joseph, 2021). Ambient AI scribing, which relies on large language models like the one powering ChatGPT, has already become popular among clinicians; it addresses the documentation of visit notes, a need that physicians ranked highly in a survey by the American Medical Association (2025), and allows them to address patients directly instead of looking at their screens as they document the visit. Documenting visits is one administrative process of many, from responding to patient messages to filing and processing insurance claims, where AI has the potential to ease workloads for healthcare providers.

The currently observable benefits of AI to clinical practice and administrative work will continue to grow, making the integration of AI into healthcare highly appealing. Three key concerns remain: how AI affects existing roles in medicine, how it affects patient care, and how patients perceive it. I consider each below.

The Impact of Automation on Physician and Non-physician Roles

Whether automation significantly encroaches on a healthcare professional’s work depends on that professional’s specific duties and involvement in directing patient care, making the job of a physician the least likely to be automated. Since doctors are ultimately responsible for diagnosing a patient and determining their treatment regimen, they retain many responsibilities even when assisted significantly by AI. Doctors must verify conclusions made by a diagnostic system and determine whether a particular “personalized” treatment regimen is medically appropriate for their patients. With currently available technology, physician-machine teams can at best assume what philosophers Danaher and Nyholm (2021) call directive collaboration, where the physician ultimately determines what physical, computational, or diagnostic “grunt work” the machine will do. One might worry that physicians would derive less satisfaction from their careers if harnessing the full capabilities of automation lowered their direct contribution to the patient’s treatment. However, since the time physicians spend on administrative tasks often equals or exceeds the time spent in direct, face-to-face patient interaction (Tai-Seale et al., 2017), automating these tasks with AI would allow physicians to focus on the interactions from which they derive the most professional satisfaction (Horowitz et al., 2003). On the clinical side, the time and energy saved by AI “assists,” such as confirming a suspicious lesion on medical imaging, can be applied to more fundamental questions of whether a particular treatment path is appropriate for a particular patient or how best to encourage adherence to treatment plans or lifestyle changes outside the clinic. From a utilitarian perspective, reducing the workload of each patient consultation could produce a greater number of positive patient outcomes in the same amount of time. Ultimately, even if the volume of work required from physicians decreases, the inherent value of their output remains the same: improved patient survival and quality of life.

In the case of nurses, medical assistants, and other members of a medical team, automation may encroach much further into their responsibilities. These individuals currently “implement” care under the directive collaboration framework, administering treatments and tending to patients based on instructions from physicians. Their roles are thus more subject to replacement by a combination of hardware and software. Administering medication, changing IVs, or even responding to a patient emergency are tasks that an automaton could conceivably perform upon instruction, given their more well-defined nature. With machine learning models that improve with experience, carrying out a doctor’s care plan might eventually be accomplished by machines especially attuned to observing and listening to doctors and patients. Grace, a humanoid robot developed during the COVID-19 pandemic, can already take temperatures and measure responsiveness through talk therapy (Tham, 2021). The tasks of a nurse are far more complex than communication and taking temperatures, but the potential for growth in this area of medical technology highlights how non-physicians face a greater risk from automation. Even so, because physicians using AI would still face finite time with each patient, nurses and clinical support staff could likewise redirect time currently spent on administrative and clinical work toward speaking with patients about their care plans, symptoms, and non-medical needs. This lowers the existential threat to these jobs but calls for an eventual redefinition of their roles to synergize with an AI-reliant clinical environment.

Automation and Bedside Manner

The importance of the human touch in caregiving cannot be overstated. The quality of a patient’s relationship with their physicians can impact their health over time (Olaisen et al., 2020), and being positive and reassuring with patients has been shown to contribute to positive patient outcomes (Di Blasi et al., 2001). One study showed that placebo treatments for irritable bowel syndrome significantly improved patient symptoms when the medical practitioner was warmer, more attentive, and more confident in their bedside manner (Kaptchuk et al., 2008). Another showed that patients who believed they had arthritis were more satisfied in their interactions with practitioners of alternative medicine than with general practitioners (Ernst et al., 1997). In a healthcare system where technology can handle a large proportion of current tasks, good patient interactions can be prioritized. Offloading the “knowledge management” and memorization required of physicians today can carve out space for training that specifically focuses on patient interaction, ultimately improving the care that patients receive from physicians and nurses (Johnston, 2018). Automation can elevate the humanity of the practice of medicine by allowing more time for a strong patient-provider connection to form.

The Patient Perspective

A major aspect of adopting machines in the clinic, especially when they face patients, is ensuring that patients feel comfortable despite interacting with new technology. It is much more difficult to trust a machine than a human, and patients need to trust that they are receiving the best possible standard of care. This creates a significant niche for a “maintenance collaboration” with machines, in which healthcare providers help phase machines into the clinic (Danaher and Nyholm, 2021). During the transition from all-human medical teams to those that involve machines, counselors could help patients interact with and adapt to receiving care from machines like Grace, the nurse robot. Employing human staff in such roles can help address the uncertainty and loss of control that both physicians and patients would experience if machines were allowed to act independently of human medical professionals. Automation would be quite counterproductive if it appeared to “gain a life of its own” and contributed to the public’s skepticism of physicians and scientists (Vredenburgh, 2022). Human oversight can ensure that a reckless focus on implementing new technology does not come at the expense of proper medical standards of care. It may also ease concerns within the medical field about the risks and liabilities of automating direct patient care.

Conclusion

Fostering humanity within medical care is essential when ushering in the rapid innovations that robotics and AI continuously produce. Rather than being sluggish in adopting technological change, the field of medicine has the opportunity to improve outcomes for both patients and physicians. By striking a balance between technology and human connection, the healthcare industry can embrace automation while maintaining the humanity and empathy that are at the core of caregiving. Ultimately, a patient-centered approach that leverages automation’s potential to improve care will shape the future of healthcare.

When thoughtfully integrated, AI can offload many of the time-consuming and impersonal aspects of clinical and administrative work, creating more space for human presence, deeper patient-provider conversations, and individualized care. Automation also offers the opportunity to rethink and revalue the contributions of every member of a care team—physicians, nurses, and clinical support—by allowing them to focus on uniquely human strengths and the patient-provider relationship. Throughout this technological transition, ensuring patient comfort and protecting patient rights will be crucial.

Works Cited

Alowais, S. A., Alghamdi, S. S., Alsuhebany, N., Alqahtani, T., Alshaya, A. I., Almohareb, S. N., Aldairem, A., Alrashed, M., Bin Saleh, K., Badreldin, H. A., Al Yami, M. S., Al Harbi, S., & Albekairy, A. M. (2023). Revolutionizing healthcare: The role of artificial intelligence in clinical practice. BMC Medical Education, 23(1), 689. https://doi.org/10.1186/s12909-023-04698-z

Danaher, J., & Nyholm, S. (2021). Automation, work and the achievement gap. AI and Ethics, 1(3), 227–237. https://doi.org/10.1007/s43681-020-00028-x

Di Blasi, Z., Harkness, E., Ernst, E., Georgiou, A., & Kleijnen, J. (2001). Influence of context effects on health outcomes: A systematic review. The Lancet, 357(9258), 757–762. https://doi.org/10.1016/S0140-6736(00)04169-6

Ernst, E., Resch, K. L., & Hill, S. (1997). Do complementary practitioners have a better bedside manner than physicians? Journal of the Royal Society of Medicine, 90(2), 118–119. https://doi.org/10.1177/014107689709000226

He, J., Baxter, S. L., Xu, J., Xu, J., Zhou, X., & Zhang, K. (2019). The practical implementation of artificial intelligence technologies in medicine. Nature Medicine, 25(1), 30–36. https://doi.org/10.1038/s41591-018-0307-0

Horowitz, C. R., Suchman, A. L., Branch, W. T., & Frankel, R. M. (2003). What do doctors find meaningful about their work? Annals of Internal Medicine, 138(9), 772–775. https://doi.org/10.7326/0003-4819-138-9-200305060-00028

Johnston, S. C. (2018). Anticipating and training the physician of the future: The importance of caring in an age of artificial intelligence. Academic Medicine, 93(8), 1105–1106. https://doi.org/10.1097/ACM.0000000000002175

Joseph, S. (2021, August 10). The EHR is dead. Long live the EHR platform (1 of 2). Forbes. Retrieved March 21, 2025, from https://www.forbes.com/sites/sethjoseph/2021/08/10/the-ehr-is-dead-long-live-the-ehr-platform-1-of-2/

Kaptchuk, T. J., Kelley, J. M., Conboy, L. A., Davis, R. B., Kerr, C. E., Jacobson, E. E., Kirsch, I., Schyner, R. N., Nam, B. H., Nguyen, L. T., Park, M., Rivers, A. L., McManus, C., Kokkotou, E., Drossman, D. A., Goldman, P., & Lembo, A. J. (2008). Components of placebo effect: Randomised controlled trial in patients with irritable bowel syndrome. BMJ, 336(7651), 999–1003. https://doi.org/10.1136/bmj.39524.439618.25

Olaisen, R. H., Schluchter, M. D., Flocke, S. A., Smyth, K. A., Koroukian, S. M., & Stange, K. C. (2020). Assessing the longitudinal impact of physician-patient relationship on functional health. The Annals of Family Medicine, 18(5), 422–429. https://doi.org/10.1370/afm.2554

Owens, C. (2024, June 7). The health care workforce crisis is already here. Axios. https://www.axios.com/2024/06/07/health-care-worker-shortages-us-crisis

American Medical Association. (2025, March 20). Physicians’ greatest use for AI? Cutting administrative burdens. https://www.ama-assn.org/practice-management/digital/physicians-greatest-use-ai-cutting-administrative-burdens

Schmidt, J., Schutte, N. M., Buttigieg, S., Novillo-Ortiz, D., Sutherland, E., Anderson, M., de Witte, B., Peolsson, M., Unim, B., Pavlova, M., Stern, A. D., Mossialos, E., & van Kessel, R. (2024). Mapping the regulatory landscape for artificial intelligence in health within the European Union. npj Digital Medicine, 7(1), 1–9. https://doi.org/10.1038/s41746-024-01221-6

Tai-Seale, M., Olson, C. W., Li, J., Chan, A. S., Morikawa, C., Durbin, M., Wang, W., & Luft, H. S. (2017). Electronic health record logs indicate that physicians split time evenly between seeing patients and desktop medicine. Health Affairs, 36(4), 655–662. https://doi.org/10.1377/hlthaff.2016.0811

Tham, D. (2021, August 19). Meet Grace, the ultra-lifelike nurse robot. CNN. https://www.cnn.com/2021/08/19/asia/grace-hanson-robotics-android-nurse-hnk-spc-intl/index.html

The future of generative AI in healthcare | McKinsey. (n.d.). Retrieved March 21, 2025, from https://www.mckinsey.com/industries/healthcare/our-insights/generative-ai-in-healthcare-adoption-trends-and-whats-next

Vredenburgh, K. (2022). Freedom at work: Understanding, alienation, and the AI-driven workplace. Canadian Journal of Philosophy, 52(1), 78–92. https://doi.org/10.1017/can.2021.39


Acknowledgements: Thank you to the UGA Writing Center for your excellent revision advice!

Citation Style: APA