August 4, 2025 | Loralai Stevenson

Making a Mistake is Human; Catching a Mistake is Systemic

One of the most important lessons TapRooT® teaches is that good human engineering makes it “easy to do the right thing and difficult to do the wrong thing.” In dangerous workplaces, however, this is often not the case. Tragic mistakes frequently stem from poor human engineering, and then the blame falls on the employee. Blaming the individual leaves the broken system in place, making room for more people to make the same mistake. But how do we create effective solutions for broken systems?

RaDonda Vaught

RaDonda Vaught at the TapRooT® Summit

The RaDonda Vaught case is a fairly recent example of this problem. After the death of a patient, Vanderbilt University Medical Center chose to place blame on a nurse who made an honest mistake — instead of fixing the computer system that facilitated her error.

Vaught, a nurse at VUMC, accidentally administered a paralytic agent (vecuronium) to a patient rather than the prescribed Versed®. Like other nurses in her workplace, she was encouraged to override the computer system, which was riddled with error messages, to get medications to patients faster. The software often lagged, and in this instance it did not show Versed® when she first searched for it.

In a medical environment, as in many other dangerous workplaces, lives are on the line. Speed is a priority as much as safety is, but when the two conflict, employees are forced into a difficult choice. VUMC did not set Vaught up for success, and after the mistake was made, its response made the error even harder to prevent in the future.

VUMC fired Vaught, choosing discipline as a corrective action. Even though Vaught immediately reported her error to management, VUMC filed the cause of death as “natural causes” in an attempt to cover up the incident. This kind of blame-oriented decision-making hides accurate information from investigators, preventing teams from developing stronger corrective actions that would stop recurrences.

In the words of one article on the Vaught case, “The conception of accidents as being easily avoided through greater attention, trying harder, or adherence to rules, is a naïve reductionist concept, serving only immediate purposes, and is still the dominant view of safety. There is not just a legal problem, but a wider systemic failure to understand and embrace what we know about safety within complex systems.”

Preventing Human Mistakes

Root Cause Analysis (RCA) is the method by which human mistakes like RaDonda Vaught’s can be avoided, not by firing employees, but by fixing the system in which they work.

A “root cause,” by our definition, is “The absence of a best practice or the failure to apply knowledge that would have prevented the problem, or significantly reduced its likelihood or consequences.” But how does one identify these root causes and keep them from recurring?

Discover the TapRooT® Difference

The TapRooT® system is a robust RCA method developed with expertise from numerous fields. We teach investigators to ask the right questions about their problems, leading to answers and corrective actions far more effective than blame and punishment.

You can learn the TapRooT® method in our upcoming Calgary, Edmonton, and St. John's courses. To learn more about what TapRooT® RCA is, request a free, one-hour executive briefing.

To understand further why blame is not the answer to workplace incidents, check out our recent video on human error.
