Management history is easily forgotten. All it takes is a generation or two to go by and some hard economic times. Then, when problems occur, solutions are sought. The belief is that the old ideas no longer apply, and that new ones must be found.
What’s particularly surprising is just how often new ideas turn out to be resurrections or repackagings of something that was thought of long ago. The “one best way” became “best practice.” High-performance indicators (HPIs) became employee engagement. And now we have another one: the idea that by orchestrating the opinions of others, we can make better decisions.
But including others in the decision-making process is not new. Far from it. It has always been encouraged. Ancient proverbs teach us that there is wisdom in a multitude of counselors. Only the foolish think that they can do it all by themselves.
In the 1920s, Alfred Sloan separated policy-making from implementation. Ten years before that, Frederick Taylor admitted that, left to their own devices, nearly all workers could figure out the most effective way to accomplish almost anything. But he felt that, in the interest of time, managers should make decisions for their employees.
Today, managers and scholars seem to miss the fact that Taylor and others were speaking to men who lacked education and whose skills were substantially lower than workers’ skills are now. For much of the 20th century, labor was largely unskilled compared to what it is today, and that’s why there was so much close supervision. A comparatively small number of people understood an entire process well enough to supervise the collective work of others whose jobs were limited to smaller tasks.
All that changed in the last half of the 20th century. The level of education among employees soared, as did their skill level. Simultaneously, global competition forced organizations to shed much of their middle management. Taken together, these two things have meant that not only are fewer managers available to supervise others, there’s also less need for them. Workers today are capable of making many more decisions themselves – something companies have said they’ve wanted for at least the last 20 years.
So it’s a mystery why executives think that the opinions of those who work in their companies should now, in some way, be orchestrated in order to inform decisions. Participative decision-making was formally re-introduced to companies in the 1970s. But we’ve forgotten our history. And now we’re simply being reminded of something we knew 3,000 years ago. Instead of inventing something new, we’ve simply discovered old furniture under the dust covers.
But we’ve forgotten something else: people are more likely to support decisions that they had a hand in making. (Note that this is also a very important change management principle.) And that means that, where possible, companies should place the facts in the hands of those who will be most affected by the decisions, and then let them make them.
We all resist being told what to do without being asked; but ask us what we would like the outcomes to be, and we’ll tell you. If you do that, then you’ll get an added bonus: you’ll be able to count on our support. We’ve considered the alternatives, and they have been found wanting. Why should we then resist what, to us anyway, is the lesser of evils?
No matter what you call it, whether it’s orchestrating the opinions of others or just asking them for their input, inviting others to participate in the decision-making process amounts to the same thing.
Instead of worrying about the jargon, let’s just get on with it. After all, it’s what people expect.