The Associated Press
news release, “Worker productivity shoots up 8.6%…” in The Daily Courier (May
8) celebrated the recent spurt in worker productivity. It claimed that “gains
in productivity allow companies to pay workers more without raising prices,
which would eat up those wage gains.”
That’s the theory, anyway, and it should be true.
Unfortunately, there’s a darker and more recent side to this story. People seem
to forget that in the 1960s and ’70s, corporate executives and consultants
like me – as well as “futurist” economists – were telling employees that they
should cooperate with their companies’ quality and productivity programs,
because improvements would benefit everyone.
When skeptical employees complained, “But if we improve our productivity and
are able to produce more with fewer people, the company may decide to reduce
headcount,” our standard response was, “Of course not. Improved productivity,
technological breakthroughs, and better working procedures will eventually
mean higher wages, even a 30- or 35-hour work week. Two months’ vacation a
year. Better medical coverage. Corporate-sponsored educational
opportunities.” And so on.
In fact, things had been going that way in the ’50s and into the ’60s and
’70s. Some colleges even offered a “recreation” major for those who expected
to work for progressive corporations that had to find things for employees to
do in their off hours.
Management consultants developed “quality of work life” projects to improve
work satisfaction and morale, as well as productivity. Companies were to
prize their employees as solid partners in the success of the enterprise.
Productivity spurted, wages went up significantly, and except for the
inflation that the Vietnam War caused, the economy was in fairly good shape.
That was before the 1980s, when American investors discovered leveraged
buyouts and hostile takeovers, globalization, temporary workers with no benefits,
and a host of other creative ways to reduce headcount and to put a lid on
wage increases. Companies forgot promises to workers. Working-class
employees, who put their creative efforts into improving technology and work
procedures (e.g., quality circles, LIFO programs, productivity improvement
projects), found to their dismay that corporations could apply their ideas
and suggestions in Guatemala and China as well as here, and at a
significantly reduced labor cost.
Not only that, those who lost their jobs, especially those in low-skilled but
well-paid manufacturing jobs, entered the labor market and depressed the
wages of everyone else who still had jobs that the companies couldn’t export
to Third World countries.
In other words, the people who built America’s great corporations
– with their creative as well as physical contributions – were sold out by a
new class of impersonal investors who consider employees as simply an expense
to minimize, nothing more. That’s why wages have stagnated for over 20 years,
relative to inflation.
Please note: if you don’t consider yourself “working-class,” watch out.
Recent news events have demonstrated that investors, through their corporate
executives, have discovered that companies can replace people who hold a
doctorate in physics with foreign workers holding the same degree, and pay
them one-third as much.
The same is true of computer programmers, engineers, and virtually any kind
of professional whose work can be done in other countries, or whom they can
replace with “temporary visitors.” Whereas we used to refer to “management
and workers,” “professionals and workers,” or “exempt and non-exempt,” it may
be more proper today to refer to “investors and everyone else who actually
does the work.”
So, how do companies get this improved productivity from today’s employees?
Technological improvements certainly. People working smarter, of course. But
also, people working much harder, under reduced headcount, and with the
constant threat of losing their jobs in an atmosphere of rising unemployment.
(Charles M. Kelly is an author and a retired management consultant living
in Prescott. Contact him at