Take Two Advil While I Check ChatGPT – Using A.I. in Medical Diagnoses and Treatment 


By Robert Bender, Jr.

Another day, another story about ChatGPT and its potential to reshape a profession. According to a recent study published in JAMA – the Journal of the American Medical Association – ChatGPT outperformed practicing physicians when answering patient questions. That study, coupled with a second JAMA study assessing the chatbot's performance on board certification exam questions, highlights ChatGPT's capabilities and possible benefits. Those benefits, however, extend only as far as the end user's responsible use and compliance with all applicable laws – HIPAA being the foremost concern for healthcare providers.

Given the public's infatuation with ChatGPT since its November 2022 launch, it comes as little surprise that the medical profession joined the bandwagon to test the chatbot's abilities and assess its value in everyday practice. Perhaps even less surprising, the chatbot demonstrated those abilities and exposed areas where it can improve the status quo. In the first JAMA study, ChatGPT provided better-quality answers than practicing physicians when responding to questions sourced from social media and scored significantly higher when the responses were compared for empathy. The second JAMA study tested ChatGPT's ability to answer ophthalmology board certification questions in January 2023 and February 2023, noting a marked improvement in February 2023 (58 percent correct versus 46 percent correct). Based on these studies, ChatGPT can conceivably help treating physicians diagnose a condition and discuss the diagnosis and treatment options with a patient. It comes as no surprise, then, that some physicians are already using the chatbot to assist with diagnoses and with responding to patient questions.

Putting aside the well-documented concerns from the "we don't want the machines to take our jobs" chorus, ChatGPT presents real risks, especially in a highly regulated profession like healthcare. The adage "everything on the internet is forever" resonates with ChatGPT, where information shared with the chatbot may be used to refine its model. Consider the recent instance in which Samsung employees leaked internal code by uploading it to ChatGPT. The fallout was limited to Samsung, which suffered an embarrassing leak of confidential internal information and has since banned the use of generative A.I. for some staff functions. A physician indiscriminately uploading patient information without the necessary safeguards in place risks far more than an embarrassing internal leak – such a disclosure may constitute a data breach under HIPAA. Federal regulators also continue to scrutinize A.I., as evidenced by a recent joint statement from several federal agencies addressing concerns about potential biases in A.I. While the Department of Health and Human Services was not a signatory, it is no doubt watching these trends, especially as physicians and other healthcare providers rely more heavily on generative A.I. like ChatGPT.

While A.I. may be a brave new world, the same premises hold true. HIPAA compliance remains as essential as ever, and the same concerns with sharing sensitive patient information apply whether the recipient is a chatbot assistant or an individual. When handling PHI, physicians should always remind themselves of the nuts-and-bolts questions: what information can be shared, for what purpose, and with whom. Beyond the end users, healthcare providers would be wise to review and update their standard policies and procedures for handling PHI as necessary, and to confirm the proper business associate agreements are in place. A comprehensive response plan for events where information is improperly shared is also critical. In short, healthcare providers looking to leverage ChatGPT or other generative A.I. (or any other third-party service, for that matter) should remember the fundamentals: develop and implement policies governing PHI use, transmission, and storage; ensure the proper documentation is in place; review frequently; and educate, educate, educate.

If JAMA's recent studies are any indication, the role of ChatGPT and A.I. in the medical profession will only grow. Basic HIPAA compliance remains the best way to ensure these tools are used properly. In other words, everything changes but nothing changes.

For more information, please contact Robert Bender, Jr. or your local FMG attorney.