Date: September 10th, 2018

Reference: Gurley et al. Comparison of Emergency Medicine Malpractice Cases Involving Residents to Non-Resident Cases. AEM September 2018

Guest Skeptic: Dr. Justin Morgenstern is an emergency physician and the Director of Simulation Education at Markham Stouffville Hospital in Ontario. He is the creator of the excellent #FOAMed project called First10EM.com

Case: You are giving an introductory lecture on evidence-based medicine to the incoming class of residents, and after you finish you notice some excited chatter at the back of the room. Thinking that you have found some EBM keeners/gunners, you wander over to join in, but find yourself in the middle of a heated discussion. One of the senior residents was recently named in a lawsuit, and the junior residents are worried. How likely are they to be sued? What can they do to prevent such a harrowing event? The residents turn to you, hoping that you can provide some insight on this topic.

Background: Unfortunately, physicians are not perfect. Mistakes are made occasionally, and those mistakes can harm our patients.

Medical care provided by trainees involves some added risks. In an internal medicine setting, problems with handoffs, teamwork, and lack of supervision were identified as issues in trainee malpractice cases.

In Canada, we have a national organization called the Canadian Medical Protective Association (CMPA). The CMPA has approximately 97,000 members representing 95% of Canadian physicians.

There are about 10,000 files opened every year with 38% involving payouts. Only 8% of cases end up in court. There has been a 5% decrease in cases over the last decade.

It is important to note that our medical-legal environment in Canada is much different than in the United States. It is a much more litigious system south of the border. The paper we will be talking about today comes out of the US.

Clinical Question: What are some of the key factors in malpractice claims against trainees, and how do those compare to malpractice cases that don’t involve trainees?

Reference: Gurley et al. Comparison of Emergency Medicine Malpractice Cases Involving Residents to Non-Resident Cases. AEM September 2018

  • Population: The Comparative Benchmarking System (CBS) database: a large database of malpractice claims covering more than 400 hospitals and more than 165,000 physicians.
  • Intervention: Malpractice claims involving trainees (residents) in an emergency department setting over a three-year period from 2009-2012.
  • Comparison: Malpractice claims not involving trainees in the same time period.
  • Outcomes: Coded information covering a number of domains.
    • Average Payment
    • Case Severity (low, medium, high or death only)
    • Allegation Category (Diagnosis Related, Medical Treatment, Surgical Treatment, Medication Related or other)
    • Procedure Involved (yes/no and if yes what procedure)
    • Final Diagnosis (ex: cardiac related, orthopedic related, etc)
    • Contributing Factors (ex: communication, clinical judgement, documentation, etc)

Dr. Kiersten Gurley

This is the first SGEMHOP for Season #7, and of course we have the lead author here ready to give her conclusions to the study and Talk Nerdy to us.

Dr. Kiersten Gurley is a Clinical Instructor at Harvard Medical School. She is also an Attending Emergency Physician and Assistant Quality Improvement Director in the Department of Emergency Medicine at Beth Israel Deaconess Medical Center.

Authors’ Conclusions: “There are higher total incurred losses in non-resident cases. There are higher severity scores in resident cases. The overall case profiles, including allegation categories, final diagnoses and contributing factors between resident and non-resident cases are similar. Cases involving residents are more likely to involve certain technical skills, specifically vascular access and spinal procedures, which may have important implications regarding supervision. Clinical judgment, communication and documentation are the most prevalent contributing factors in all cases and should be targets for risk-reduction strategies.”

Quality Checklist for Observational Study:

  1. Did the study address a clearly focused issue? Yes
  2. Did the authors use an appropriate method to answer their question? Yes
  3. Was the cohort recruited in an acceptable way? Yes
  4. Was the exposure accurately measured to minimize bias? Yes
  5. Was the outcome accurately measured to minimize bias? Unsure
  6. Have the authors identified all important confounding factors? Unsure
  7. Was the follow up of subjects complete enough? Yes
  8. How precise are the results? Unsure
  9. Do you believe the results? Yes
  10. Can the results be applied to the local population? Unsure
  11. Do the results of this study fit with other available evidence? Unsure

Key Results: There were 845 malpractice cases identified, 113 (13%) of which included a resident. In 45 cases (40%) the resident was the only person named.

  • The average incurred losses were $51,163 for resident cases and $156,212 for non-resident cases.
  • The majority of cases were of high injury severity, which included death, permanent grave, permanent major or permanent significant injuries.
  • The majority of cases also involved a failure, delay or error in diagnosis.
  • A procedure was involved in about 1/3 of cases.
  • Residents had more cardiac diagnoses while non-resident cases had more orthopedic diagnoses.
  • Clinical judgement was thought to be involved in about ¾ of the cases.

1) Observational Study: This is a cohort study comparing cases that involve residents and cases that do not involve residents. No causation can be drawn from this type of study design. Residents were not randomly assigned to work or not work a shift, with malpractice claims then compared between the groups.

This is accurate; we cannot draw causation but can only report an association.

2) Other Confounders: Residents tend to work in academic centers and might see a different type of case load than staff physicians. Might the type of patient seen, or hospital environment have acted as confounders to this research?

Yes, this is a potential confounder; residents could certainly see a different case load. This is insurance company data. It is pre-codified data, and as such we cannot look at the details of each specific case. For example, the overall milieu in the department at the time, the physician’s case load and the overall acuity in the department are all unknown. We cannot take these ‘deep dives’; however, I think this limitation is partially offset by our ability to look at such a large HIPAA-compliant data set in a unique way, on a topic that I think most EM residents are not aware of.

3) CRICO: You used the data from the Controlled Risk Insurance Company (CRICO) Strategies’ Comparative Benchmarking System (CBS). While it is the largest database of this nature, it only represents about 1/3 of all malpractice cases in the USA. Has the reliability of the data in this database ever been measured?

In the US, with so many malpractice insurers and hospital/healthcare systems, a study that captures 1/3 of cases is considered quite large, and this one does extend from coast to coast. Aside from the fact that case data adheres to industry standards, I am not aware of any formal reliability studies that have been performed.


4) External Validity: Malpractice settings vary significantly between countries, and even between states. How generalizable do you think these results are, and to what practice environments?

I suspect that about 10-15% of US cases will name a resident no matter where you are regionally, and that plays out in the dataset. Our data is from all over the country. Could there be variation from state to state and between countries based on malpractice law and precedent? Absolutely. The bottom line is that the resident is at risk and that certain procedures need to be carefully supervised and may create a liability. High-risk cases like the cardiac cases may be more common in places where residents are working at the big academic centers. In general, however, the overarching message is that the resident risk profile mirrors that of attendings.

5) NAIC Severity Scale: Severity of outcomes was rated using something called the NAIC severity scale. What is the NAIC Severity Scale, and has it ever been validated?

The NAIC Severity Scale is an industry standard derived from the Severity Scale of the National Association of Insurance Commissioners. It is used across specialties in the malpractice/legal world to categorize injury severity for each case. All cases in the database are assigned a severity score, which assesses the severity of the outcome of the claimant’s injuries allegedly caused by the event on a scale of 0-9. There are very specific definitions for each category; however, I do not know if this was validated prior to its widespread utilization in the insurance world.

6) Multiple Comparisons: There are a large number of comparisons made in this study. I would expect some differences to be found by chance alone. Were any statistical adjustments made for multiple comparisons to investigate this possibility?

There were no statistical adjustments made across the comparisons, as this was an observational study and such analysis was beyond the scope of this project.
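To see why unadjusted multiple comparisons matter, here is a minimal arithmetic sketch. The counts of tests (1, 5, 20) are illustrative only, not taken from the paper; the point is how quickly the family-wise false-positive probability grows, and how a simple Bonferroni correction shrinks the per-test threshold:

```python
# Illustrative only: how family-wise error grows with the number of
# independent comparisons, and the Bonferroni-adjusted threshold.
alpha = 0.05
for m in (1, 5, 20):
    family_wise = 1 - (1 - alpha) ** m   # P(at least one false positive)
    bonferroni = alpha / m               # adjusted per-test threshold
    print(f"m={m:2d}  family-wise error ~{family_wise:.2f}  "
          f"Bonferroni threshold {bonferroni:.4f}")
```

With 20 comparisons, the chance of at least one spurious "significant" finding is roughly 64%, which is why a study making many comparisons should be interpreted cautiously.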

7) Fragility: Some of the statistical differences could be due to the small number of observed events. An example of this would be that vascular procedures were statistically more common in cases with residents (3%) compared to non-resident cases (0.1%). However, these are small numbers, 3 total cases as compared to 1, so it would only take a couple of extra cases to change the result. We should be cautious about over-interpreting these observations.

I agree with this conclusion. As mentioned in the limitations, the total number of malpractice cases related to specific procedures is small, as was the number of cases overall that involved residents. I think the more robust conclusion we can reach is that overall risk profiles are similar between residents and attending physicians, and that both are at risk.
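The fragility concern can be made concrete with a quick recalculation. This is a sketch, not the paper's analysis: it assumes the 845 total cases with 113 involving residents reported in the key results, and the 3-versus-1 vascular-procedure contrast discussed above. A small self-contained two-sided Fisher exact test shows that moving a single event from one group to the other flips the comparison from significant to non-significant:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the probabilities of all tables with the same margins that are
    no more likely than the observed one (hypergeometric distribution).
    """
    n_row1, n_col1, n = a + b, a + c, a + b + c + d
    def p(k):  # P(top-left cell = k) with margins fixed
        return comb(n_col1, k) * comb(n - n_col1, n_row1 - k) / comb(n, n_row1)
    p_obs = p(a)
    lo, hi = max(0, n_row1 - (n - n_col1)), min(n_col1, n_row1)
    # small tolerance so ties with p_obs are counted despite float rounding
    return sum(p(k) for k in range(lo, hi + 1) if p(k) <= p_obs * (1 + 1e-9))

# 3 vascular-procedure events among 113 resident cases
# vs. 1 among 732 non-resident cases (counts as discussed above)
p_reported = fisher_exact_two_sided(3, 110, 1, 731)
# move a single event from the resident to the non-resident group
p_shifted = fisher_exact_two_sided(2, 111, 2, 730)
print(f"reported split: p = {p_reported:.4f}")   # below 0.05
print(f"one case moved: p = {p_shifted:.4f}")    # above 0.05
```

A result that one reclassified case can overturn is, by definition, fragile, which supports the caution urged above.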

8) $100,000: The average payment was about $100,000 lower if a resident was named. That is a very interesting finding. Some might interpret that as residents being protective against larger payments. Others might say that residents are worth $100,000 (I think they are invaluable and prefer to have residents on shift). What hypothesis do you have to explain this observation, and are there any plans to further investigate this finding?

I would like to think that having a resident is protective, as I too find them invaluable; however, we do not know the cause of the lower average incurred losses. We are unable to dive into case specifics and money allotments. However, we do plan on looking across specialties and should be able to see if this holds true in other high-risk specialties.

9) Law vs. Medicine: It is important to make a distinction between being sued, which is what was measured here, and an actual error occurring. It would be interesting to take these cases, have them reviewed by peers who are told only that some of the cases involved lawsuits, and see how many they would identify as having fallen below the standard of care.

I totally agree and think this is an important distinction to make. We are working on several other QA projects looking at error in EM; however, we are unable to perform peer-reviewed case analysis of EM malpractice cases at this time.

10) Anything Else: Is there anything else you would like to tell the SGEMers about your study or about being sued?

I think the most important thing to focus on is that the risk of being sued is real for both resident and attending physicians, and that the overall case profiles are similar amongst both cohorts. Patient safety efforts should encompass the entire care team and focus on clinical judgement, communication and documentation. Increased supervision of residents during procedures has the potential to reduce risk and frankly cannot be a bad idea.

Comment on Authors’ Conclusion Compared to SGEM Conclusion: We agree that there are larger average losses when residents are not involved. While statistical differences were reported for some observations, we are skeptical and caution against over-interpretation. This is due to the small numbers involved, the multiple comparisons and the fragility of some of the results. However, we do agree that clinical judgement, communication and documentation are the most prevalent contributing factors in all cases.

SGEM Bottom Line: You can make no mistakes and still be sued.

Case Resolution: One of the limitations of evidence-based medicine is the lack of high-quality data. You encourage the residents to focus not on being sued but on providing great care in a kind and compassionate way.

Dr. Justin Morgenstern

Clinical Application: It is hard to clinically apply this data due to the limitations of the methodology and also how much of being sued will depend on your local medical-legal practice environment.

What Do I Tell the Resident? I would tell the residents that medical malpractice is very complicated. After reading this study, I am still not aware of any technique that is 100% protective from lawyers. I will continue to do my best to be kind, curious, and understanding with my patients, and use resources like the Skeptics’ Guide to Emergency Medicine to make sure I am constantly learning and improving in an admittedly difficult profession.

Keener Kontest: Last week’s winner was Kristi Cox, an EM nurse from Iowa. She knew Vapotherm was the company that patented the concept of heated humidified high flow therapy via nasal cannula in 1988, after it was developed for use in racehorses.

If you know the answer send an email to TheSGEM@gmail.com with “keener” in the subject line. The first correct answer will receive a cool skeptical prize.

SGEMHOP: Now it is your turn SGEMers. What do you think of this episode on emergency medicine malpractice cases involving residents? Tweet your comments using #SGEMHOP. What questions do you have for Kiersten and her team? Ask them on the SGEM blog. The best social media feedback will be published in AEM.

Also, don’t forget those of you who are subscribers to Academic Emergency Medicine can head over to the AEM home page to get CME credit for this podcast and article. We will put the process on the SGEM blog:

  • Go to the Wiley Health Learning website
  • Register and create a log in
  • Search for Academic Emergency Medicine – “September”
  • Complete the five questions and submit your answers
  • Please email Corey (coreyheitzmd@gmail.com) with any questions or difficulties.

Remember to be skeptical of anything you learn, even if you heard it on the Skeptics’ Guide to Emergency Medicine