Daily Tao – Range (David Epstein) – 6

Business school students are widely taught to believe the congruence model, that a good manager can always align every element of work into a culture where all influences are mutually reinforcing—whether toward cohesion or individualism. But cultures can actually be too internally consistent. With incongruence, “you’re building in cross-checks,” Tetlock told me. The experiments showed that an effective problem-solving culture was one that balanced standard practice—whatever it happened to be—with forces that pushed in the opposite direction. If managers were used to process conformity, encouraging individualism helped them to employ “ambidextrous thought” and learn what worked in each situation. If they were used to improvising, encouraging a sense of loyalty and cohesion did the job. The trick was expanding the organization’s range by identifying the dominant culture and then diversifying it by pushing in the opposite direction.

By the time of the Challenger launch, NASA’s “can do” culture manifested as extreme process accountability combined with collectivist social norms. Everything was congruent for conformity to the standard procedures. The process was so rigid it spurned evidence that didn’t conform to the usual rules, and so sacred that Larry Mulloy felt protected by a signed piece of paper testifying that he had followed the usual process. Dissent was valued at flight readiness reviews, but at the most important moment, the most important engineering group asked for an offline caucus, where they found a way, in private, to conform. As one engineer said, without data, “the boss’s opinion is better than mine.”

The more I spoke with Captain Lesmes, the more it seemed to me that he had felt strongly outcome accountable—searching for a solution even if it deviated from standard procedure—within an extraordinarily potent collective culture that ensured he would not make the decision to deviate easily. He had, as Patil, Tetlock, and Mellers wrote, harnessed “the power of cross-pressures in promoting flexible, ambidextrous thought.” The subtitle of that paper: “Balancing the Risks of Mindless Conformity and Reckless Deviation.”

Superforecasting teams harnessed the same cultural cross-pressure. A team was judged purely by the accuracy of its members’ forecasts. But internally, the Good Judgment Project incentivized collective culture. Commenting was an expectation; teammates were encouraged to vote for useful comments and recognized for process milestones, like a certain number of lifetime comments.

Prior to Challenger, there was a long span when NASA culture harnessed incongruence. Gene Kranz, the flight director when Apollo 11 first landed on the moon, lived by the same valorized-process mantra—“In God We Trust, All Others Bring Data”—but he also made a habit of seeking out the opinions of technicians and engineers at every level of the hierarchy. If he heard the same hunch twice, it didn’t take data for him to interrupt the usual process and investigate.

Wernher von Braun, who led the Marshall Space Flight Center’s development of the rocket that propelled the moon mission, balanced NASA’s rigid process with an informal, individualistic culture that encouraged constant dissent and cross-boundary communication. Von Braun started “Monday Notes”: every week engineers submitted a single page of notes on their salient issues. Von Braun handwrote comments in the margins, and then circulated the entire compilation.
Everyone saw what other divisions were up to, and how easily problems could be raised. Monday Notes were rigorous, but informal. On a typed page of notes from two days after the moon landing in 1969, von Braun homed in on a short section in which an engineer guessed why a liquid oxygen tank unexpectedly lost pressure. The issue was already irrelevant for the moon mission, but could come up again in future flights. “Let’s pin this down as precisely as possible,” von Braun wrote. “We must know whether there’s more behind this, that calls for checks or remedies.” Like Kranz, von Braun went looking for problems, hunches, and bad news. He even rewarded those who exposed problems.

After Kranz and von Braun’s time, the “All Others Bring Data” process culture remained, but the informal culture and power of individual hunches shriveled. In 1974, William Lucas took over the Marshall Space Flight Center. A NASA chief historian wrote that Lucas was a brilliant engineer but “often grew angry when he learned of problems.” Allan McDonald described him to me as a “shoot-the-messenger type guy.” Lucas transformed von Braun’s Monday Notes into a system purely for upward communication. He did not write feedback, and the notes did not circulate. At one point they morphed into standardized forms that had to be filled out. Monday Notes became one more rigid formality in a process culture. “Immediately, the quality of the notes fell,” wrote another official NASA historian.

Lucas retired shortly after the Challenger disaster, but the entrenched process culture persisted. NASA’s only other fatal shuttle accident, the space shuttle Columbia disintegration in 2003, was a cultural carbon copy of the Challenger. NASA clung to its usual process tools in an unusual circumstance. The Columbia disaster engendered an even stronger ill-fated congruence between process accountability and group-focused norms. Engineers grew concerned about a technical problem they did not fully understand, but they could not make a quantitative case. When they went to the Department of Defense to request high-resolution photographs of a part of the shuttle they thought was damaged, not only did NASA managers block outside assistance, but they apologized to DoD for contact outside “proper channels.” NASA administrators promised the violation of protocol would not happen again.

The Columbia Accident Investigation Board concluded that NASA’s culture “emphasized chain of command, procedure, following the rules, and going by the book. While rules and procedures were essential for coordination, they had an unintended negative effect.” Once again, “allegiance to hierarchy and procedure” had ended in disaster. Again, lower-ranking engineers had concerns they could not quantify; they stayed silent because “the requirement for data was stringent and inhibiting.” The management and culture aspects of the Challenger and Columbia disasters were so eerily similar that the investigation board decreed that NASA was not functioning as “a learning organization.”

I used to think that having a congruent organization, with everyone pulling in one direction, was about conformity and a lack of conflict. Lately, I’ve realized that conflict and differences in culture can actually be a good thing. You wouldn’t want everyone in your organization to think the same way and agree on everything. That leads to blind spots.

Talented people tend to have their own take on things, which generally leads to diverse opinions and potential conflict. What’s important in handling this diversity is the ability to pull everyone towards the common goal.
