Episodes

  • Perceptions of Paradox
    Illusions of Professional Skepticism

    When I saw Beverly Hall's picture, I was reminded of my favorite elementary school teacher, Mrs. Foreman. Hall was Jamaican-American, and her smile, like Mrs. Foreman's, seemed extraordinarily kind. She was the superintendent of the Atlanta public schools and was named national superintendent of the year in February 2009 by the American Association of School Administrators. The honor had been earned by raising test scores, including those on the standardized tests designed to hold teachers accountable under the No Child Left Behind Act.
    By October the test scores were under scrutiny after being deemed statistically improbable. The truth was, there was cheating going on. A lot of cheating, but this time it wasn't by students. Erasure patterns were discovered on answer sheets that made it appear as though educators had corrected students' answers immediately after the testing. Students with learning disabilities suddenly became proficient in math and reading, and students who were considered gifted went from exceptional to perfect.
    In the fall of 2010, fifty agents with the Georgia Bureau of Investigation started questioning teachers about the conspiracy to falsify test scores. They considered Hall the ringleader, an accusation she vehemently denied. She allegedly offered cash bonuses to teachers who could meet the minimum score targets. The culture in the schools became one of unrelenting pressure to raise scores, and the cheating had reportedly been an open secret for years.
    All in all, investigators identified 178 teachers and administrators who had fraudulently corrected students' answers. Hall was indicted in March 2013, and on April 1, 2015, eleven teachers were convicted on racketeering charges and led from the courtroom in handcuffs in what had become one of the largest standardized-test cheating scandals in U.S. history.
    As I was reading research into the Atlanta cheating case, I felt a little twinge of pushback when I learned that Beverly Hall was a target of the investigation. It was what I imagine I would feel if someone suggested to me that Mrs. Foreman was a cheater. This is a tell-tale sign of the implicit bias that can sabotage your reasoning.
    I speak for experts who make high-stakes decisions as a routine part of their jobs, and when I talk with them about professional skepticism, they generally see it as the opposite of being gullible.
    This view may seem intuitive, but it comes with serious risks of misjudgment when making critical decisions. In the real world, many of our judgments force us to choose between competing alternatives, not simply to evaluate a single item. If you need to make a choice, simply doubting everything isn't helpful. Consider, too, that the way we apply doubt is biased: we apply it unevenly. Evidence that meets our assumptions generally gets a free pass, while evidence that challenges our preconceived beliefs receives more scrutiny.
    If your view of professional skepticism is simply setting a higher bar before you believe something, your judgment process is like a ticking time bomb: you're bound, at some point, to make a critical error, and here's why. When we talk about skepticism, we almost always mean avoiding the mistake of believing something is true when it isn't, a false positive, or Type I error. We forget that a false negative, believing something is false when it's actually true, can be just as harmful. Effective skepticism should be a constant screening process, something like a signal-to-noise filter that gauges the relevance of evidence and governs how we update our beliefs.
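    To make that idea concrete, here is a minimal Python sketch, not from the episode itself, of what evenly applied doubt can look like. The update_belief helper and all of the numbers are illustrative assumptions; the point is simply that confirming and disconfirming evidence pass through the same updating rule, rather than challenging evidence alone facing an extra hurdle.

        # Illustrative sketch: Bayes' rule applied the same way to any evidence,
        # whether it supports or challenges the current belief. Values are hypothetical.

        def update_belief(prior, likelihood_if_true, likelihood_if_false):
            """Posterior probability that a claim is true after one piece of evidence."""
            numerator = likelihood_if_true * prior
            denominator = numerator + likelihood_if_false * (1.0 - prior)
            return numerator / denominator

        # Start fairly convinced the claim is true (e.g., "the scores are legitimate").
        belief = 0.90

        # Evidence that is rare if the claim is true and common if it is false
        # (e.g., statistically improbable erasure patterns) gets no free pass and
        # no extra scrutiny; it runs through the same rule and pulls the belief down.
        belief = update_belief(belief, likelihood_if_true=0.05, likelihood_if_false=0.60)
        print(f"Belief after disconfirming evidence: {belief:.2f}")  # about 0.43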
    Experts have a variety of tools designed to guide their judgment. Physicians use algorithms that indicate which treatment regimen is likely to be most effective for a particular patient. Engineers use root cause analysis to separate symptoms...

  • Social Immune Systems
    The Hidden Psychology of High-Stakes Organizational Decisions
    Some of the very best critical thinking research in the world has been done with flight crews and medical teams. There is an almost endless number of fascinating cases to explore. There's one, though, that I've become more obsessed with than the others. It happened on Thursday, January 25th, 1990. At 9:34 pm, after a long flight and multiple holding patterns, Avianca flight 52 was given clearance to land on runway 22L at JFK. Wind shear was terrible that night and visibility was limited. The crew found themselves descending quickly, only 200 feet off the ground and more than a mile from the airport. If you listen to the cockpit voice recorder, you hear the captain abruptly ask the first officer where the runway is, and the first officer answers, “I don't see it.” The ground proximity warning system gave them voice alerts to pull up, and they executed a missed approach. Their go-around ended in disaster when they crashed 20 miles northeast of the airport in Cove Neck, killing 73 of the 158 people on board. The crumpled fuselage looked a lot like a crushed soda can and came to rest only 20 feet from the back of the house of John and Katy McEnroe, the parents of tennis legend John McEnroe. What's so frustrating about the Avianca accident is that it seems the warning signs should have shone like neon beacons. The fact is, though, they didn't; they only became visible in hindsight. I recognize these symptoms better now, and after you hear the rest of the story, I bet you will too.
    I speak for expert audiences who make high-stakes decisions as a routine part of their jobs. One day I showed an audience photos of flight 52's crash site from three different perspectives and asked them to tell me what clues they saw as to the cause of the accident. The air in the room thickened. There was a fear of sharing ideas that might be less than perfect. But, after a little nudging, some thoughts started to leak out. The first person to speak pointed out that the weather had been bad. Someone else said the plane was close to a house, while another observed that the cockpit wasn't even visible in the pictures. If you take a second to think about these statements, you realize they are safe and superficial. People are only acknowledging what everyone else can plainly see. The hesitancy to share their personal thoughts created a veil over evidence hidden in plain sight. I worked a little more to remove the threat, and finally someone saw it and asked, “Why isn't there a fire?” This response is so different because it's not just an observation of the obvious; it's a personal insight. That doesn't come out until someone feels safe enough to risk being curious instead of wanting to avoid criticism.
    The audience that day was full of forensic experts, though not in aviation; they were financial auditors. People ask why I would have auditors or analysts examine things outside of their specialty. First, I've found the benefits of cross-disciplinary training to be remarkable. The reason someone in aviation might say “get your head out of the cockpit” is the same reason all specialists should step outside the rigid structures of their industry: it helps them see counterintuitive patterns and discover new problem-solving strategies. When I speak for aviation experts, I start with anything except aviation before we look at how the concepts apply to their specialty. The second reason is that understanding why flight 52 ended in disaster has very little to do with the technical aspects of aviation.
    The problem with silence
    The case of Avianca flight 52 deserves a more detailed review. It started out in Bogotá, Colombia and was headed up the east coast of the United States when air traffic control directed them to take a holding pat...
