The case of Boeing: can incentives reduce risk?
In the wake of continuing safety problems with Boeing aircraft, the company announced changes to how people in its commercial aircraft division are paid. Now, 60% of bonuses will be based on quality and safety metrics. Previously, 75% were based on financial metrics.
For four decades now, I’ve watched how incentives change the way people work. Can incentives make people conscientious, honest, and diligent?
Incentives and culture
Unless incentives and culture are aligned, there is going to be a problem.
For example, at one company I worked for, “Integrity” was explicitly highlighted as a core value. Even so, quarterly bonuses were based on a combination of sales and profit, along with performance at the team level and the individual level. The team and individual metrics varied from quarter to quarter based on what managers were trying to emphasize.
This probably sounds a lot like what is going on at Boeing. Even so, my organization did behave with a lot of integrity. How was that possible?
There were many conversations about quality and integrity happening every day throughout the organization. It was not just a plaque on the wall, it was built into everyone’s daily activities.
We explicitly made choices — at the individual level, the team level, the department level, and the corporate level — based on quality and integrity goals.
People who operated with integrity were celebrated publicly, even when their contributions were not explicitly tied to financial metrics. They were more likely to get promoted and to get significant raises and stock options — resulting in long-term success and loyalty.
People were fired for dishonest actions that showed a notable lack of integrity.
Even with all of that, there was bad behavior. At the end of the quarter, there were plenty of instances of employees lying to customers and producing poor-quality work to meet metrics. This tends to happen when people are overworked, frazzled, behind, and under the financial gun to deliver.
This bad behavior was worse in sales. I don’t believe this is because salespeople are immoral — most of the salespeople were excellent and operated with integrity, prioritizing the long-term relationship over short-term goals. But the design of sales quotas meant that you’d lose a big portion of both your own and your team’s compensation if you failed to hit a number. As a result, a few desperate and inexperienced salespeople resorted to unsavory tactics. By its nature, sales operates on short-term incentives, so it generates more desperation.
To be clear, quality, integrity, and honesty were the rule, not the exception. That’s just how things worked.
My takeaway from this is simple: if you want to have financial metrics as well as quality and integrity, you must integrate quality and integrity into everything you do . . . and you must be constantly vigilant.
Can you incent quality or integrity? Can you incent safety?
Are the incentive changes at Boeing going to resolve its quality issues?
Not by themselves.
I have questions.
Are they engineering products for efficiency or for safety? (I’m sure it’s both, but which is more important?)
Has the organization changed in a way that empowers people with safety concerns to change how work is done?
Do people get fired if they operate in a way that threatens quality or safety? I’m not talking about doing a post-mortem after a plane crashes and finding a scapegoat. I mean, do people get fired for operating in a risky way, even when nothing ends up going wrong?
How often is safety mentioned in conversations between workers and managers, between managers and senior managers, and between senior managers and the C-suite? How often do these managers make changes based on those conversations?
If the new incentives are part of a cultural program like this, they will help.
If they are not, they will be irrelevant.
Moral hazard
Do you ever wonder why there are so many financial scandals, bank failures, product recalls, and other indications of the failure of corporate responsibility?
Each example is complicated. And yet, looking at it globally, it’s simple.
At each moment of decision, each employee asks themselves, “Is this going to be a problem?” They’re getting pressure every day to perform. If they’re in senior management, that performance incentive may be worth tens of millions of dollars.
When they move too fast, skip safety checks, prioritize greed over integrity, or otherwise overlook problems, it is because they think, “This one decision I’m making is unlikely to cause a problem. Someone else will catch it.”
Senior managers who fail to maintain quality and integrity will usually still get rewarded if they deliver on financial metrics. In the rare cases where something does fail, they may have already moved on to the next job.
Moral hazard is the lack of incentive to guard against risk if you are protected from its consequences. And it’s rife when the things that could go wrong are unlikely, but not impossible. “We might as well take the risk,” managers think. “We’ll probably be just fine.”
No matter how you line up the incentives, it’s difficult to guard against this kind of thinking. It’s why things go wrong.
The only solution is culture. Culture is hard. But a culture of integrity will protect you far better than tweaking the mix of incentives.