Friday Jokes

Friday Jokes are memes, videos, and anything funny! Tune in every week for another joke that may (or may not) relate to root cause analysis.
Emergency… Stop?
04/02/2026
If your focus on safety falls through the floor… So will your company.
Cases like this are rarely the consequence of individuals’ laziness or maliciousness. Safety failures can manifest among well-intended teams for a number of reasons:
👥 Groupthink
In a fast-paced team, an operator might worry that speaking up will be misinterpreted as conflict. Leadership needs to clearly establish safety as a higher priority than production or revenue.
🤷 Diffusion of Responsibility
Without strong guidance, your team can struggle to address a safety hazard — no matter how pressing. Operators need clear instructions and reminders on how and why to report a safety concern.
❓ Unknown Consequences
Even if team members know how to relay an issue, the mere perception of discipline can obstruct honest reporting. Reward operators for coming forward, and implement safeguards, not punishments, for their teams.
Back to 1984…
03/27/2026

How many times do we have to say this… Human error is NOT a root cause!
Despite common belief, human error (or pilot error) is not the primary root cause of any airline crash. It is a causal factor.
Do the semantics really matter? 🤓
Yes. If human error is treated as the root cause of an incident, the investigation team doesn’t have enough information to create a meaningful corrective action.
When human error is understood as a causal factor, the team is prompted to find the systemic issues that facilitated the mistake.
Perhaps the myth that human error is a root cause is the reason for its outsized presence in the aviation industry. 🤔
Interesting…
03/20/2026


Computer-based training: Is it a decent choice or just déjà vu?
No matter your line of work, you likely have to perform some processes on a computer. If your software necessitates training, where do we draw the line between too little and too much?
⌨️ Consider frequency of use.
Regular users of your software likely don’t need training as extensive or frequent as beginners.
Competency-based training is a good way to satisfy everyone: the length of retraining can vary based on each user’s skill level.
🔔 Update users on changes nonintrusively.
Even highly requested quality-of-life changes can cause frustration if they’re under- or over-communicated.
Every user should be notified about any major changes to the interface, but retraining should perhaps be available upon request rather than required for everyone.
🔧 Understand training isn’t a fix-all for overcomplexity.
The amount of induction will vary based on the complexity of your work. However, training isn’t a magical solution for a bad human-machine interface.
A clunky, hard-to-use program is going to facilitate mistakes, no matter how many resources you burn on training.
At the end of the day, computer-based training is still training: one of the many, many tools that improve human performance.
Giving Only a Shrug
03/13/2026

“How am I supposed to conduct an investigation without any training?” Well… 🤷
We’ve heard this story too many times: a safety assistant is handed some paperwork, told to do an investigation, and receives no further guidance.
Incident investigators need proper induction because:
💡 Event-learning is an opportunity, not a formality.
An investigation shouldn’t be something thrown at you for the sake of “getting it done”, especially when it impacts the safety of your team.
Formal training helps you look past surface-level symptoms (like human error) and make meaningful corrective actions.
🧠 Intuition and intelligence will only get you so far.
While being trusted to correct a large issue at your company must feel flattering, it can also be overwhelming.
No matter how smart you are, you can’t investigate every corner of human performance without some guidance.
❌ Trial and error doesn’t bode well in safety.
Hands-on experience is a great way to learn in low-risk environments. When it comes to safety, though, failure isn’t acceptable.
For the sake of your team, you should know the best practices before stepping into an investigation.
There’s TWO of Them!?
03/06/2026

Safety I and Safety II:
There’s TWO of them!? 😱
You’ve likely heard these terms before. They were coined in the early 2010s by a professor of patient safety, who advocated for Safety II to be used alongside Safety I.
There are a lot of concepts behind Safety II, but these are the forward-thinking ideas that overlap with TapRooT® RCA:
✔️ Learn from what went well.
Safety II advocates a proactive approach to system management, focusing on both successes and failures.
✔️ Humans are the solution, not the problem.
All systems rely on people to some extent, so we should understand our strengths and limitations.
So, is TapRooT® RCA officially Safety II? No. We disagree with some underlying philosophies of Safety II:
❌ No problem is intractable.
Safety II celebrates unofficial workarounds (resilience) to underlying issues, believing some problems to be too complicated to ever be solved systemically. With TapRooT® RCA, we apply our human factors skills to simplify overly complex systems.
❌ Human error is not immeasurable.
Because systems are not “bi-modal”, according to Safety II, you can never have an accurate measure of human variability. We strongly disagree. A strong management system can measure human performance.
🔑 The key takeaways:
There are great ideas in Safety II, but they’ve existed long before the 2010s. There are also some not-so-great ideas that muddy up system management in the real world.
Anyone can slap a “2” onto something. That does not make it official or better.
Breathe In…
02/27/2026

Sharing your investigation report isn’t the final step. You still need to convince the team to implement your corrective actions!
If you’re having trouble getting everyone on board with your findings, here are a few tips:
📋 Keep your receipts.
We strongly recommend maintaining all evidence in a central repository.
This not only keeps your investigation organized, but it allows you to show your work when the legitimacy of any evidence or process is questioned.
✔️ Tailor your reports to your reviewers’ needs.
Disagreement about your proposed corrective actions might stem from confusion about the RCA process.
Include as much (or as little) information in your report as needed to remove any room for confusion. You may need to develop multiple reports to match the individual needs of each reviewer.
🗣️ Keep communication open.
Communication is a two-way street. Just as you expect reviewers to listen to your proposals, you need to take feedback with open arms.
Clarifying your decisions or making compromises can turn heated debates into productive conversations.
We know that gridlock can be frustrating. That’s why we designed the TapRooT® Software to be as intuitive as possible. All evidence, RCAs, and reports can be built and accessed from one place.
It is not!
02/20/2026

Okay, technically, near misses are not serious injuries or fatalities (SIFs). 🚬
You should still treat every potential SIF (PSIF) like a SIF, though!
In fact, in the TapRooT® Books, we don’t refer to PSIFs as “near misses” or “close calls”. We call them “precursor incidents”.
This is because a PSIF reveals the holes in our systems.
If the incident was only prevented by a couple of safeguards (or dumb luck), that means all the other safeguards have failed.
As such, the event is a prelude to what’s to come if we don’t immediately find and fix the root causes: an actual SIF.
Forgetting Something?
02/13/2026

“If you forgot, then it wasn’t important.” 🤷
When it comes to safety, this couldn’t be further from true.
Workers in hazardous fields have to remember a lot of information, from everyday tasks (like wearing PPE) to unfamiliar situations (like emergency procedures).
How can we help workers remember what’s important?
🧹 Clean Up the Clutter
More paperwork to review means more knowledge to cram. Time spent filling out or reading redundant papers — or the unneeded details of important ones — slowly crowds out the memory of more critical tasks.
Review your documentation and consider cutting anything that is not critical.
🤖 Minimize Reliance on Human Memory
Human memory is deceptively unreliable. Checklists can go a long way to take the burden of memorization off the workforce.
Filling in a checkbox is a lot easier than trying to remember individual steps or tasks.
🗣️ Demonstrate the Importance
The Illusion of Safety in Familiarity causes us to naturally grow complacent with hazards we’re regularly exposed to. It’s important to remind workers about the importance of day-to-day measures, like LOTO, by initiating two-way communication and rewarding safe behaviors.
Furthermore, irregular protocols (like emergency procedures) require occasional simulation training to ensure that memory isn’t lost.
Change Your Perspective
02/06/2026

Incident investigations: are you looking on the bright side?
Event-learning takes humility. You have to admit the mistakes of your team and your system, which can happen right under your nose.
This can be frustrating, but it’s important to remember the purpose of investigations: to learn and improve! Every causal factor that you find brings you one step closer to a robust set of corrective actions.
So, during your next incident investigation, remember: you’re not dwelling on mistakes; you’re strengthening your systems!
Fishy Judgement
01/30/2026

“I would NEVER take a shortcut like that!”
That sounds fishy! 🎣
Actor-observer bias is a mental heuristic everyone experiences.
It’s the tendency to attribute others’ mistakes to internal factors, like personality or competence, but to justify our own errors with external factors, like deadlines or workload.
The hard truth is that we’re all tempted by shortcuts. Internal characteristics play a role in our decision-making, but incidents will happen again and again if there are too many external factors enticing unsafe behaviors.
If your workforce is taking shortcuts, don’t be so quick to reel the investigation in. Correct systemic faults before jumping into disciplinary action.