TSB Transportation Safety Summit

Getting information flow

Ron Westrum
Professor of Sociology, Eastern Michigan University
Ottawa, Ontario, 21 April, 2016

Check against delivery.

Slide 2: What is “information flow?”

Information flow is getting information from the people in the organization who have it to the people who need it.

Or

The more effectively the information in the organization is used, the better the information flow.

Slide 3: The marks of good information flow (IF)

IF responds to the receiver's need for information

IF is timely

IF arrives in a form that can be easily digested

Its “bill of lading” is intact

Slide 4: What happens when these features are not present?

For instance, what happens when someone needs to know something, but they don't find out?

This happened with technicians building the Hubble telescope

Slide 5: Problem # 1

People don't speak up

Photo of person raising hand to speak

Slide 6: A new orbital telescope, the Hubble, can't get a clear image…

Photos of satellite in space and galaxies

Slide 7: Because the technicians had made changes they had not cleared with the engineers…

Photo of technicians working

Slide 8: It took about US $3 billion to fix the problem

Photo of astronauts working in space

Slide 9: Sometimes a problem is identified, but information about it is not passed on.

This happened with the American use of the ATR-72 aircraft, one of which fell out of the sky in Indiana in 1994 due to an instability the manufacturer knew about but decided to keep out of an accident report.

Slide 10: The problem had appeared with the crash of an ATR-42 on Mt. Crezzo. But the cause was not made clear.

Photo of a newspaper headline

Slide 11: So when an ATR-72 encountered the same problem, it went down in Roselawn, Indiana, October 1994…

Photo of grieving couple at a cemetery

Slide 12: A helicopter model needs to arrive with a proper “bill of lading”

Old bill of lading with handwriting

Slide 13: But sometimes it doesn't…

Photo of a Chinook helicopter

Slide 14: And problems ensue… A Chinook crashes on the Isle of Mull, 1994

Photo of helicopter crash

Slide 15: So, the pilots were at fault?

Slide 16: Another instance of this problem took place with a “dress rehearsal for an accident”---a near-miss over Iraq, 1992

Photos of helicopters and fighter jets

Slide 17: The problem was revealed one night in a bar shared by the US Army and the US Air Force

Photo of a crowded bar

Slide 18: But no one picked up the telephone to report it---whose job was it, anyway?

Photos of soldiers in a conversation and a telephone

Slide 19: So, the next time it happened, the close encounter was fatal—Northern Iraq, 1994

Photos of helicopters and fighter jets

Slide 20: This time, there was no near-miss. Instead, two Black Hawks were downed and 26 people died

Photos of crash and soldier carrying a coffin.

Slide 21: So what are the structural features that prevent proper flow?

Diagram of information roadblock between sender and receiver

Slide 22: Sometimes it takes courage to report a problem—Col. Jack Broughton

Photo of Colonel and an ejection seat

Slide 23: Jack Broughton and the F-106 ejection seat

Broughton was commander of a squadron when he had to confront a problem with the ejection seat of the Delta Dart fighter aircraft. After the seat had killed 13 pilots, Broughton wanted to ground the plane, but had to convince a three-star general.

Putting his job on the line, Broughton insisted he would not fly the planes unless the general would take personal responsibility for the next pilot death. The general gave in and the seat got changed.

Slide 24: Hierarchy and differences in rank can interfere

Photo of someone communicating in sign language

Slide 25: Bosses don't always help…

Photo of someone looking down

Slide 26: In fact they can be a serious problem…

Photo of a boss yelling at colleagues

Slide 27: And then we have organizational conflict…

Drawing of war scene

Slide 28: The negative factors are shaped by the organization's culture

Slide 29: Pathological cultures focus on the needs of the chief

Photo of a man dressed in a suit.

Slide 30: And a lot of energy is spent on conflict….

Photo of colleagues in a meeting.

Slide 31: Whereas bureaucratic cultures focus on the needs of particular departments

Photo of a business meeting

Slide 32: Yet departmental silos can impede the flow of information

Photos of silos

Slide 33: In a generative culture, information flows because people believe they belong to a common enterprise

People having a discussion

Slide 34: People work together to get the job done

Photo of three men holding a missile.

Slide 35: Problem #2 Unseen Issues

Slide 36: Hidden profiles

At Miami University in Ohio, a social psychologist named Garold Stasser did a number of experiments on what he called “hidden profiles” in group decision-making.

Slide 37: Stasser's “hidden profiles” involved people who had both shared and unshared information

Geometric shapes in various colors scattered

Slide 38: Group discussion, though, focused on the shared information, not the unshared

Geometric shapes in various colors lined up.

Slide 39: So participants explored the common ground…

Rectangle with various shapes attached.

Slide 40: But tended to leave the unshared unspoken…

Rectangle with various shapes attached.

Slide 41: This can happen in organizations as well…

Organizations are more likely to focus on the well-known, the normal, and the good news.

They tend to leave aside the unusual, the abnormal, and the problematic.

Slide 42: So when something out of the ordinary happens…

People hesitate to report

Photo of man with his hand in front of him

Slide 43: The British at the South Pole were first to spot the “ozone hole” over the Antarctic

Infrared picture of the earth

Slide 44: Their instruments told them the ozone was getting thinner.

A Dobson spectrometer

Photo of a Dobson spectrometer

Slide 45: But the Americans' Nimbus 7 satellite didn't report any such effect… they thought.

The Nimbus 7 had a TOMS sensor, specially designed to measure the thickness of the ozone layer. It seemed to show nothing.

Photo of the Nimbus 7 satellite

Slide 46: The British Antarctic team didn't want to look stupid, so they sat on their data for 3 years

Photos of scientists

Slide 47: But it turns out the satellite was OK!

What was wrong was not the satellite; rather, the ground-based computer had edited the data out. There was a built-in blind spot.

Photo of a man covering his eyes

Slide 48: When the British finally spoke up, the Americans rechecked, and discovered what they had missed

Chart showing infrared photos of the earth over a 140 year period

Slide 49: And then we have the Fukushima disaster

Photo of the Fukushima disaster

Slide 50: The Japanese committee system was flawed, and often ignored the exceptional opinion

Photo of Japanese men in a meeting

Slide 51: Such as that of Kunihiko Shimazaki, an expert who warned that the earthquake could trigger a tsunami that might be double the height of the seawall built to protect the coast

Photo of Japanese man speaking at a podium

Slide 52: But Shimazaki's opinions were largely ignored

Slide 53: In the event, the tsunami exceeded expectations…

Photo of a tsunami wave and the debris left behind.

Slide 54: Professor James Reason, years ago, warned of the dangers of “latent pathogens” for potential accidents

Diagram of latent pathogens for potential accidents

Photo of Professor James Reason

Slide 55: But then, who would see the latent pathogen?

Photo of someone peeking from behind blinds

Slide 56: How can we be sure we hear the “faint signals?”

The issue was raised one day by the head of a nuclear power plant, and I had to admit that I couldn't tell him.

But I thought about it.

Slide 57: This is what I think now.

To report faint signals, the labor force must feel three things:

  1. They must feel aligned with the organization's management.
  2. They must feel they have (or can get) the key expert knowledge to determine if something is wrong.
  3. They must feel empowered to speak up.

Slide 58: Enrico Fermi, a nuclear physicist, defines the “will to think” about a problem.

“The will to think is the belief that something will happen because of your thought. I was willing to think about nuclear power because the American government was willing to put forth the effort to try what I envisioned. That gave me the will to think.”

Enrico Fermi was the prime intelligence behind the creation of the world's first nuclear power plant.

So how do we create “the will to think?”

Slide 59: Problem #3  Disjunction between management and operating level

“Once the rockets are up, who cares where they come down? That's not my department, says Wernher von Braun.” —from a song by Tom Lehrer

Slide 60: A study about lying in the U.S. Army

According to a field study carried out by professors at the Army War College, systemic forces in the Army encourage lying. For instance, it happens that training requirements often exceed the number of hours available for such training.

Since commanders cannot do that much training, and since they cannot change the requirements, they are forced to “prioritize”: they cherry-pick which training requirements they will do, and lie about the rest.

Slide 61: The paradox of “smart knowledge”

This is a very scary situation, because:

Slide 62: And worst of all

The unit commanders are in a stupid situation, which they can't change, and which encourages lying, deceit, and dangerous improvisation. It undermines morale and the military ethic.

And about this the top commanders have no clue.

Slide 63: In Britain

When Jimmy Savile, top TV celebrity and close to the royal family, died, a beautiful tombstone was made for him. But unknown to the public, Savile was also a serial rapist and abuser. Within weeks, evidence of his widespread crimes suddenly exploded in the media. His tombstone was demolished.

How had he gotten away with groping, abusing, and raping hundreds of women and children, right under everyone's noses?

Slide 64: They commissioned reports to find out

At the National Health Service, the report stated that often the lower levels in the hospitals knew about Savile's crimes, but this information never reached management.

At the BBC, they found that Savile's outrageous and often criminal behavior was well known at the studio level, but top management continued to operate on false assumptions and stereotypes, as well as on the benefits of Savile's popularity. Savile carried on.

Slide 65: The reality

The reality is that no one at the top wanted to hear the bad news. After Savile had raped an 8-year-old patient in a hospital ward, the nurse told her not to cause trouble by reporting the crime. She didn't.

If you don't want to hear it, you probably won't.

But then you won't know what you are not hearing.

Slide 66: Requisite imagination

Often management is too busy with its immediate concerns to inquire into things at the lower levels.

Management needs a “requisite imagination” to imagine how and why things might go wrong, and then probe deeply along potential fault lines.

But usually they don't ask.

Slide 67: Conclusion: So what can we do about such things?

To get better information flow, we need to do the things that go along with better reporting.

Slide 68: First, a generative culture

Slide 69: Information flows when barriers come down

Slide 70: Second, a workplace focused on cooperation

Slide 71: A common vision and friendly feeling help

Photo of people in a discussion

Slide 72: A different emphasis?

Photos of a castle and a soccer team