I was recently asked to develop an article for the magazine Inside Knowledge, which was turned into a 2-part MasterClass. It is just about to be published, so I thought I would share it here as well, given that the AppGap blog looks at (amongst other things) the design of work in an interconnected era.
It’s a long-ish article, so for the purposes of this blog I have split Part I of the IK October MasterClass into two parts, the second being posted tomorrow. I will post Parts 3 and 4 when IK goes ahead with publishing Part II of the MasterClass in mid-October.
The design and management of knowledge work in perspective – from the Industrial era to the Networked Age
It took 30 years for Taylorism to finally become the standard for business management. How long will it take to replace it in the Knowledge Age?
Much has been written in the past couple of years about the impending demise of knowledge management. The majority of the analyses focus on how unwieldy, cumbersome, rigidifying and overly complex it can be (or usually is) for any given organization to address and come to grips with the core principles of ‘traditional’ knowledge management in an environment increasingly characterized by interlinked networks and continuous flows of information.
I use quotes around the word ‘traditional’ in the first paragraph because it’s useful to remember that knowledge management as a domain with codified principles, approaches and protocols is not very old. It came into being as organizations everywhere began (depending upon what you pick as a starting point, either in the late 70s or the 80s) their respective treks through several waves of informatization (the codification, archiving and ordering of the basic information organizations use to operate, communicate and create).
Indeed, I am old enough (just) and have been working long enough (just) to feel entitled to tell a brief story, in two parts – the first part here, and the second later on in this masterclass.
Banking as a reference point
I began my adult career right out of university, working in one of Canada’s large banks starting mid-way through 1976. I was among the first of a wave of university graduates the banks had begun hiring, presumably because they had come to believe that the North American economies and markets were moving into what was then called the Information Age.
Prior to this shift in personnel strategy (the term human resources wasn’t around then; it appeared in the early-to-mid 80s), the banks hired high school graduates, mainly because banking work was largely unchanging, based on standardized principles and practices. The differentiating factors for an illustrious career were longevity and the steady acquisition of experience, intuition and a thorough steeping in the practices and culture of the bank.
With one exception (the bank I then worked for), the CEOs and presidents of the Canadian banks were men who had started out around the end of World War II working as tellers in small branches, often in the hinterlands of Canada. Going to university was relatively rare then, and commerce curricula and degrees rarer still; I believe (though I may be wrong) that the MBA designation was still uncommon in 1970. (The first Executive Masters in Business Administration (EMBA) program was offered at the Wharton School of Business in 1940.)
Today, of course, banks are among the most important users of computers, databases, algorithm-and-complexity-science-based programs of all sorts, and the Web – for interactive interconnections with clients via automated online banking, vast networks of automatic teller machines and, increasingly, collaborative platforms inside the firewall. But in the early 70s, customers came into the branch with their passbooks, tellers wrote the deposits and withdrawals into those passbooks in ink and initialed beside the entries, and balanced at the end of the day using a ledger card and a big-iron adding machine. I know this is so … I was there, and as a management trainee had to learn the difference between a debit and a credit the (unforgettable) hard way.
I’ll come back to the second part of this story a bit later on.
Resistance but adoption
In my opinion, as organizations have moved along an evolutionary path similar to the one described above, knowledge management as a means of enhancing and making more effective the use of information flows, extant domain, market or industry knowledge and human talent has suffered from the same general obstacles and challenges as the fields of (amongst others) organizational effectiveness, learning and innovation. Regardless of its compelling conceptual value, a disciplined and sustained approach to knowledge management exists, and gets adopted, at the mercy of the particular philosophy of a given set of organizational leaders, organizational politics, the prioritization amongst a range of competing objectives and sustainable access to and availability of resources.
Notwithstanding these obstacles, I submit that creating and managing pertinent and useful knowledge is more important and of greater priority today than ever. For more organizations today (and tomorrow) in an increasingly wider range of industries, countries and markets, useful and pertinent just-in-time information and knowledge are key factors impacting the organization’s ability to operate, develop products and services consumers and clients want and purchase, create economic value and adapt to continuously changing conditions.
Escaping the industrial mindset
However, there’s another very large obstacle in the way, I believe. Organizations of any size and scope in the year 2008 still, by and large, use the assumptions about efficiency, division of labour and accountability that were developed in the first half of the 20th century, when those assumptions began to be codified into management science … standardized methods for organizing and managing work and productivity.
Taylorism and the principles of scientific management changed a lot about the nature of work in North America and western Europe pretty quickly, all things told … but it still took thirty or forty years for its effects to emerge in relatively full-blown form. In its heyday, the manufacturing might and effectiveness that Taylorism helped create enabled the United States (along with important agricultural and resource capabilities and growing financial clout) to become the world’s dominant economic power within a few decades.
In an important sense, it was useful to Taylor’s theories that 1) they helped respond to the massive spread of the Industrial Era’s requirements for growth in the first half of the 20th century, and 2) World Wars I and II came along in the 1910s and the late 1930s to provide a massive need for manufacturing.
Thirty-plus years elapsed from the publication of Principles of Scientific Management in 1911 to the codification of those principles into work design methodologies in the 1940s and early 1950s. Taylor and his theories get a bad rap today, but it is clear that they were highly useful to the process of creating wealth by improving manufacturing processes and capabilities.
It seems banal to say that those theories are less effective today, but I am not sure that’s the case. No comprehensive theories and principles have (yet) come along to replace Taylorism, notwithstanding a plethora of management books published since the mid-1980s promising enhanced organizational effectiveness … more often than not by combining Taylorist principles with developmental workarounds and adaptations.
And yet … the recent emergence of the field called Enterprise 2.0, and the clarion calls for management innovation that have followed (see Gary Hamel, Andrew McAfee, Tom Davenport, Don Tapscott, Dave Snowden and many, many others), promise much potential disruption. They also portend significant struggle as the forces of buttoned-and-battened-down efficiency derived from a manufacturing-focused era vie with the forces arising from networked flows of information, in an era where economic value is derived from the construction and application of knowledge in networks to product and service design and delivery (manufacturing happens in China now).
So, I am thinking about the future of work as it pertains to the principles for designing networked knowledge work.
Legacy of Taylorism
The following is compiled from Wikipedia:
Taylor published his Principles of Scientific Management in 1911, which elucidated four core principles:
- Replace rule-of-thumb work methods with methods based on a scientific study of the tasks.
- Scientifically select, train, and develop each employee rather than passively leaving them to train themselves.
- Provide “Detailed instruction and supervision of each worker in the performance of that worker’s discrete task”.
- Divide work nearly equally between managers and workers, so that the managers apply scientific management principles to planning the work and the workers actually perform the tasks.
Taylor thought that by analysing work, the “One Best Way” to do it would be found. He is most remembered for developing the time and motion study. He would break a job into its component parts and measure each to the hundredth of a minute.
He was generally unsuccessful in getting his concepts applied and was dismissed from Bethlehem Steel. It was largely through the efforts of his disciples (most notably H.L. Gantt) that industry came to implement his ideas.
Managers and workers
Taylor had very precise ideas about how to introduce his system:
“It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured. And the duty of enforcing the adoption of standards and enforcing this cooperation rests with management alone.”
Workers were assumed to be incapable of understanding what they were doing. According to Taylor, this was true even for rather simple tasks.
“I can say, without the slightest hesitation”, Taylor told a congressional committee, “that the science of handling pig-iron is so great that the man who is … physically able to handle pig-iron and is sufficiently phlegmatic and stupid to choose this for his occupation is rarely able to comprehend.”
Scope of Taylor’s Influence in the United States
- Carl Barth helped Taylor to develop speed-and-feed-calculating slide rules to a previously unknown level of usefulness. Similar aids are still used in machine shops today. Barth became an early consultant on scientific management and later taught at Harvard.
- H. L. Gantt developed the Gantt chart, a visual aid for scheduling tasks and displaying the flow of work.
- Harrington Emerson introduced scientific management to the railroad industry, and proposed the dichotomy of staff versus line employees, with the former advising the latter.
- Morris Cooke adapted scientific management to educational and municipal organizations.
- Hugo Münsterberg created industrial psychology.
- Lillian Gilbreth introduced psychology to management studies.
- Frank Gilbreth (husband of Lillian) discovered scientific management while working in the construction industry, eventually developing motion studies independently of Taylor. These logically complemented Taylor’s time studies, as time and motion are two sides of the efficiency improvement coin. The two fields eventually became time and motion study.
- Harvard University, one of the first American universities to offer a graduate degree in business management in 1908, based its first-year curriculum on Taylor’s scientific management.
- Harlow S. Person, as dean of Dartmouth’s Amos Tuck School of Administration and Finance, promoted the teaching of scientific management.
- James O. McKinsey, professor of accounting at the University of Chicago and founder of the consulting firm bearing his name, advocated budgets as a means of assuring accountability and of measuring performance.
Part II – Coming Tomorrow
Social computing drives change …