Tag Archives: etymology

A Third Note on The First CEO

In a comment on one of my posts about the rise of the acronym “CEO,” a reader named Hugo reports some early Australian illustrations. I thought I’d lift Hugo’s notes from the comments and share them here, because the examples he’s found all pre-date the 1970 illustration of the acronym from the Harvard Business Review, which up until now I had taken to be the earliest. One dates back to 1914.

Time, again, to notify the dictionaries.

I found some earlier 1968 and 1950 examples in Australian newspapers, where chief executive officers were found at hospitals. I also found a 1917 [sic, but the source is from 1914] example from a story about a town hall.

The Canberra Times, 27 July 1968, page 22:
[Begin]
Applications are invited for the above positions at the Hillston District Hospital.

Applications and enquiries to the undersigned or Matron Fairchild, Box 1, PO, Hillston, NSW, 2675.
R. I. Cross,
C.E.O.
[End]

The Sydney Morning Herald, 29 March 1950, page 30:
[Begin]
PARRAMATTA DISTRICT HOSPITAL.
Wanted. Experienced Sister to take
charge of the Out Patient Department
at this hospital.

N. B. FILBY,
Secretary and C.E.O.
[End]

Independent, 7 November 1914, page 3:
[Begin]
BEHIND THE SCENES
BY A TOWN HALL FLY

Of course I am the chief executive officer but I only execute by instructions.

“What a pity,” said the M.M., the C.E.O.

“Not at all, my dear young lady.” The C.E.O.’s voice was tear-laden too.
[End]

It also uses G.H.U. a few times, for Great High Understrapper.

I don’t think these earlier Australian instances should invalidate what I’ve said previously about the widespread use of the acronym CEO in the 1970s and 1980s. Those observations concern the use of “CEO” as an important marker of corporate power, social status and cultural celebrity in America, from roughly 1970 to 2010.

Still, it’s interesting to consider these early examples. The first two are abbreviations used in newspaper advertisements (maybe just to save money) for positions at hospitals, where the CEOs are clearly in charge of correspondence if not of hiring. Nothing too glamorous. [Update: And one reader, in a comment on this post, suggests that CEO in this context may mean “Catholic Education Officer,” adding that at this time in Australia, “nurses and religious orders go together.”]

The illustration from 1914 offers a satirical, behind-the-scenes account of a municipal office thrown into bureaucratic confusion by a report of 24 cows eating all the flowers and shrubs in the park. Underlings and citizens address the Chief Executive Officer by such honorifics as “Your Chief Executiveness” and “Most Magnificent” and, then, “CEO.” It is an empty title; he seems unable to execute anything at all: “Of course I am the chief executive officer,” he insists, “but I only execute by instructions.” When he finally understands the gravity of the situation, he acts: “I will tell somebody to tell somebody else to tell the inspector as soon as he comes in the morning at nine. I’m sure 24 cows won’t eat all the shrubs in that time.” He is very much the Chief, very much an Officer, but not much when it comes to Execution.

A Second Note on The First CEO: The CEO as Agent of Historical Change

Susy Jackson, an editor at Harvard Business Review, emailed me last week to tell me that she and her colleagues had discovered an illustration of the acronym “CEO” that predates the early instances discussed in my previous post on this subject. Time to update that post and, while we’re at it, the entry on CEO in the Oxford English Dictionary. (I’ve emailed them to let them know.)

A search through the HBR archives (one of Jackson’s colleagues described it as “not really very scientific, but fun”) turned up an article in the May-June 1970 issue of HBR by Joseph O. Eastlack, Jr. and Phillip R. McDonald entitled “The Role of the CEO in Corporate Growth.” As we might expect, the article takes care to spell out and abbreviate the term in its first use: “chief executive officer (CEO)”; the speculation is that this was “standard treatment for a term that was thought to be known to HBR readers, but not so familiar that they could dispense with spelling it out altogether.” In 1970, after all, the CEO had just arrived on the scene.

A few thoughts about that entrance.

In my previous post I speculated that the term CEO may have come into wider use at HBR under the editorial direction of Ralph Lewis, who was appointed editor in chief in 1971, and oversaw several changes in editorial direction. This 1970 illustration of CEO predates that appointment; Edward Bursk was the editor in chief of HBR in 1970. Still, there’s no doubt HBR under Lewis’ direction helped define and disseminate the term.

Whether this more frequent recourse to the acronym in the pages of HBR was the result of Lewis’ policy or just a sign of the currency the acronym was gaining in management and governance discourse is hard to say. But it’s pretty clear that the wide acceptance of the acronym in the 1970s marks a shift – not just in editorial convention, but also in ideas about governance, leadership and power, within and without the corporation. By the mid to late 1970s, CEO is well on its way to becoming not just a convenient tag but an important construct of corporate power, social status and (by the 1980s) cultural celebrity.

The temptation to start painting on a broader canvas is almost irresistible. After all, big things are happening in the early 1970s, in business, in American society, around the world. When the figure of the CEO emerges in the 1970s, the heyday of the man in the gray flannel suit is over. In America and throughout the industrialized West, the postwar boom – which witnessed the rise of the managerial class – has yielded to a grim post-industrial reality.

Indeed, the CEO will be one of the defining figures of the period that runs from roughly 1970 to 2010, the post-industrial period. In response to falling profit rates in manufacturing, we see during this period “a shift from productive enterprise to financial manipulation” (as Chomsky, summarizing economic historian Robert Brenner, recently put it); I think it’s no coincidence that with the arrival of the CEO on the scene, the “financialization” of the economy has begun. (I understand the word is controversial; but let it stand for now: these are just broad strokes.)

The CEO emerges from this shift. He is its creature and creator – an agent entrusted with its execution – and the period of the CEO’s glory extends from the triumph of neo-liberalism during the Reagan-Thatcher era all the way to the financial crisis of 2008 and the institutional failures and social collapse it precipitates.

The First CEO

For some time now, I have been wondering when and how the acronym “CEO” came into general use. This isn’t just a matter of idle etymological interest. CEO is one of those rare acronyms – like scuba, radar, and snafu – that have become words. And in the course of becoming a word, CEO has redefined our world.

I was intrigued by the entry in Webster’s Dictionary that seemed to pinpoint the date: 1975. Only Webster’s didn’t provide a citation or attestation. So I wrote to the publisher at the beginning of March to ask where this first CEO might be found. A mere two weeks later, a reply came from Joanne M. Despres, Etymology Editor at Merriam-Webster. She informed me that Webster’s researchers had found that first illustration of CEO in a British publication, Neville Osmond’s Handbook for Managers, volume 2 (London, 1975).

But it turns out they had not dug deep enough: “In reviewing the standard sources we use to research dates,” Despres wrote, “I noticed that the Oxford English Dictionary now reports pre-1975 evidence of the word’s existence.” The 2011 online edition of the OED reaches back across the Atlantic, to America, and a little further back in time, a few years earlier, to the March-April 1972 issue of the Harvard Business Review: there we discover “a technician in his early forties who joined the company three years ago as president but not CEO.” (In light of this new evidence, Despres has requested that Webster’s “date for CEO be revised at the first opportunity.”)

I hoped to find an even earlier illustration yesterday, when I went to the New York Public Library to track down Despres’ OED reference and review past editions of the Harvard Business Review on microfilm, but I didn’t find one. I still have a number of leads to follow. But in the course of my reading it became tolerably clear that someone at the Harvard Business Review made an editorial decision in late 1971 or early 1972 to start using – or allowing the use of – the acronym CEO. This was right around the time Ralph F. Lewis was named editor of the Review (in 1971). Lewis instituted a number of important changes at the Review; this fateful concession to shorthand may have been one of the more minor changes he made, but it had immediate consequences.

Once the term is allowed into the Review, it begins to populate the pages of the journal. There is no turning back. Along with the instance cited by the OED editors, there are a number of early illustrations of CEO in the Review of 1972. This one appears in Myles L. Mace’s article on “The President and the Board of Directors”: “I use the title ‘president’ to mean the chief executive officer, recognizing that in some corporations the CEO may have the title ‘chairman of the board.’” (Mace’s earlier articles for the Review, in 1965 and 1966, use “chief operating executive,” “chief executive,” and “president,” but not CEO. His Directors: Myth and Reality, published in 1971, adheres to the same long-form usage.) We find the newfangled acronym, again, in “Conflict at the Summit: A Deadly Game” by Alonzo McDonald. Here, McDonald takes some care in introducing it:

Leaders are still consumed with the problem of how to organize the summit. Inevitably, it is the first topic that a newly appointed chief executive officer (CEO) wants to discuss with his most trusted counselors and confidants.

And then he can use it freely:

Many CEOs who sincerely see themselves in the role of moral leaders are perceived by others as confirmed and passionate addicts of power.

The point is not that the Harvard Business Review foisted the term CEO on us. It had most likely been in use, in the MBA classroom and in the corporate boardroom, for some time. The Review certainly helped disseminate the acronym; and it’s worth remembering that readers, subscribers and contributors were then, as now, influential, powerful and connected to other influential and powerful people. McDonald, for instance, would be named Managing Director at McKinsey in 1973. Lewis came to the Review from accounting firm Arthur Young and was “director of several prominent corporations”; at the time of his death in 1979, he sat on the boards of Houghton Mifflin, Twentieth-Century Film Corporation, and Paine, Webber, among others. Mace was one of the leading lights of Harvard Business School and served, as well, on a number of boards.

Mace’s work on the role of directors (in Myth and Reality) was especially influential and timely. There was then, as now, an urgent need for new bearings – a new orientation; and the sense that it is time to dispense with institutionalized illusions and find new direction goes well beyond issues of corporate governance. New, big, disturbing questions about the role of business in society, the counter-culture and the emerging global economic order are coming to a head. It’s not without significance that it’s at this moment – at the dawn of late twentieth-century neoliberalism – that CEO makes its first appearance.

It is only a matter of a decade or so before the word is regularly in the newspapers, on the TV, and on everyone’s lips, and the CEO has become what he is today: a cultural icon, celebrated and hated, creator and destroyer, a symbol of American success or the villain behind America’s current woes.

UPDATE: For a slightly earlier (1970) illustration of the acronym and some further discussion, see this post.

Rick Santorum, Etymologist

Words are not Rick Santorum’s friends. The Republican presidential candidate has the distinction of having had his own name turned against him. He pleaded – to no avail – with Google to cleanse the Internet of the “filth” associated with santorum.

Now Mr. Santorum has turned to etymology. In the most recent Republican debate, he argued that the real trouble with the economy is the breakdown of the American family. For advancing this view, he was predictably pilloried by the left and praised by the right. Both sides, however, seem to have given him a pass on one argument he advanced.

“The word ‘home’ in Greek is the basis of the word ‘economy.’ It is the foundation of our country,” Santorum said. “You can’t have limited government, you can’t have a limited government, if the family breaks down.”

Technically, Santorum is correct: we derive our English word “economy” from oikos (household) and nomos (law); economy involves the ordering or dispensation of the household.

But the ancient Greek household – with its patriarchal order, its separate and unequal quarters and roles for men and women, and its slaves, who did the household chores and, on larger estates, worked the fields – is not the happy suburban home Santorum would like to associate himself with in his campaign for the presidency.

Who knows? Maybe there is a patriarchal, pro-slavery, plantation-owning constituency out there, waiting for someone to take a stand on its behalf.

So maybe all is not lost. Santorum can probably find fodder for his family-values argument in the observation that Greek lawmakers took an interest in promoting marriage, the main object of which was perpetuation of the oikos through child-bearing and child-rearing. That regressive view of the household and of women’s place in the world might not win him the women’s vote; but he wasn’t going to win that anyway.

Another Postscript on Innovation: Where I’m Going With This “Orientation” Thing

In my last couple of posts I started to make a case for what I admitted might seem like a far-fetched idea: that research into the human condition and the social world could be as deserving of credit and support as scientific and technical research, especially if the goal of supporting “research” with the R & D tax credit is to deliver “public benefits.”

At the very least, non-scientific modes of inquiry – the study of people and society, languages and culture – deserve more credit than they are currently given (which is, when it comes to the definition of “research” in the R & D tax code, none), because, I suggested, they provide critical balance to innovation, the very thing R & D is supposed to spur. They provide orientation.

I want to talk a little more about the work I want that word to do. I used orientation just to rough out an idea at first, but I’ve come to like it, not in spite of but because of its association with geography, maps, directions, coordinates and a sense of place. Orientation, in the sense I’m using it, is like having an internal compass — a deep sense of where you are, where you ought to go, and the best way to get there.

To take this a little further, orientation requires and stems from a profound sense of place, of the here and now, in all its complexity and connectedness to other places and to what has come before and what is likely to come after. Knowing where you really are is not just local knowledge; it’s knowledge of how you are situated, connected and not connected, where there are continuities and where you can expect discontinuities. For decision-makers, that contextual knowledge is critical to planning and strategy as well as business judgment (and therefore good governance).

Why? Because orientation helps you appreciate and respect limits, providing a much-needed sense of human scale, without which you cannot make innovation meaningful or growth sustainable. Innovation is the spur; orientation, the reins. A good rider needs both. The events of the past few years should make that tolerably clear.

Or, to use the shorthand I’ve been using since my last post: innovation produces wares; orientation creates awareness. I’m not entirely sure of this formulation, because the play on words here disguises as much if not more than it reveals. Wares can take the form of software, hardware, housewares, or other goods and services; I heard someone the other day use the barbarism “thoughtware.” Our word ware comes from an Old English word meaning “goods” – waru. Awareness, on the other hand, would seem to have nothing to do with commodity exchange. We think of it almost as a synonym for consciousness. It derives from the same root as our word guard; to be aware is to keep watch.

But, tellingly, both words ultimately derive from the same Indo-European root: wer. This particular “wer cluster”

has to do with watching, seeing, and guarding, but the sense of direction is often there—as in guarding (warding) or looking in a certain direction. From this root we get aware and wary, ward (from weard, keeper) and warden, as well as award and reward and wares (things that are guarded or watched).

It’s a good question whether wares need watching because they are valuable or are made valuable by being watched. Likely both, in some measure. Wares – the products of innovation — are the goods awareness watches and keeps, holds and esteems, prizes and guards, the things entrusted to its direction.

Mr. Efficiency

The business guru Jim Collins has a stopwatch – an impressive, digital stopwatch, judging from the picture of it in the New York Times.

His stopwatch keeps three separate times, a running tally of time spent on pursuits he labels “creative,” “teaching,” and “other.” Collins tabulates the readings of the stopwatch on a spreadsheet; then he posts the results on a whiteboard in his Boulder, Colorado office. His aim, he tells the Times, is to keep the creative pursuits (writing and exploring ideas) at or above fifty percent of his time, and to divide the rest of his time between his teaching duties at the University of Colorado and the managing of his small enterprise – which supports all the things Jim Collins does: writing business books about why companies succeed and fail, giving talks, and consulting.

Bravo, I would like to say. I know the vigilance required to keep other obligations from impinging on one’s creative work, and though I am not teaching right now, I aspire to a balance much like the one Collins has achieved. But then there’s that stopwatch, and the spreadsheets (Collins even logs his hours of sleep: he needs 70 to 75 hours every ten days), and I have to wonder just what sort of guru Jim Collins really is – or what religion he’s out to spread.

Adam Bryant, who wrote the profile for the Times, calls it “doggedness.” Collins takes an “exacting approach to time management and research,” Bryant writes, and lives according to a “method” he “borrows from other hypersuccessful people. He approaches every aspect of his life with purpose and intensity.” That’s certainly one way of putting it. But it misses an important point, and misses why I can’t bring myself to applaud or approve.

Bryant’s portrait of Collins is a study in what I would call ethical Taylorism. I think the coinage is sound and the label applies. Taylor, of course, is F. W. Taylor, the great grandfather of “scientific management” and management consulting. Peter Drucker, the guru’s guru, described Taylor as “the Isaac Newton (or perhaps the Archimedes) of the science of work.” In the time studies for which he’s best known, Taylor analyzed a bit of industrial work and broke down the actions required to perform it into hundredths of a second to look for more efficient ways to perform the action. He thought there could be a “science of handling pig-iron” and a science of shoveling (and, incidentally, that the pig-iron worker or the day laborer was too stupid to figure out the “one best way” to perform his appointed task).

Collins has turned his whole life into a Time Study. He has made a habit of efficiency – habit here in the Aristotelian sense of an ethical habit, a disposition or hexis. It is only fitting, I suppose, that this creature of scientific management should devote the “creative” work he so jealously guards from other obligations to questions of management theory — those are less likely than others to lead him to other obligations — but the real point here is a simple one: ethical Taylorism makes a virtue of efficiency. Or, to put it another way, it mistakes efficiency for virtue. (In this light, I have to wonder how ethical Taylorism might have played into the financial crisis, or how it might play into the impending business failure of the New York Times.)

The most popular expression of ethical Taylorism is probably Covey’s Seven Habits of Highly Effective People, a book nearly everyone professes to have read but hasn’t, because books like this are ultimately unreadable — and not ever meant to be read (because that would be a waste of time). It, too, is a celebration of the life efficiently lived, of effectiveness; the two Taylorist terms share an etymological root in the Latin efficere, and a philosophical confusion of human being with an efficient cause.

Or they both rely on an unphilosophical reduction: in this conception, human work is merely a means to an end, so it should be made as time-efficient as possible, and human beings are agents — no, merely agents who bring about an end, rather than ends in and of themselves. There is not much room here for human dignity, or true vocation, or even a sense of creative work as discovery and self-discovery. (The creative is harnessed to a regime of production.) I might go so far as to say that there is not much room here for the human aspect of human being; ethical Taylorism reduces the human being to an economic or industrial agent. Think of the business organizations that embody this ethos; think, too, about the politics that follow from this reduction.

Forget wonder. Focus, instead, on success, only on what works, on being highly effective, even “hypersuccessful.” With doggedness and luck (Collins attributes much of his success to luck), things might work out for you. But — win or lose — the real trouble with ethical Taylorism is that it offers (at best) an impoverished idea of virtue or human excellence. Eventually, you’d think, the human will rebel, or wander from the plan of “creative” work into unfruitful and unscientific speculation on his Creator, or nap. I certainly hope so.

A Note on Thug Life

Thug enters the English language in the early 19th century as an import from colonial India. The original thugs, or thagi, were itinerant bands of thieves and marauders who strangled their victims to death. The practice was known as P’hansigár, from the Hindustani word P’hánsí, noose.

The methods of the thug are described by James Arthur Stevenson in an 1834 paper that he read before the Royal Asiatic Society:

The chief object in view is to lull their victim into a sense of security before they proceed to deprive him of life, which is…always effected by strangulation. When a favorable opportunity presents itself, one of the party throws a noose, which is made with a tightly twisted handkerchief, round the destined sufferer’s neck; an accomplice immediately strikes the person on the inside of his knees, so as to knock him off his legs, and thus throw the whole weight of his body on the noose; and a very few seconds puts an end to the unfortunate man’s struggles.

The victim would be buried and the booty sent back home.

Other writers at the time remark on how expert the thugs are in the art of deception, false friendship and smooth talk. Lawrence James, who describes the work of these “inveiglers” in Raj, says that “deception” (and only incidentally robbery and murder, I suppose) was “their trade.” Stevenson called them “the most decided villains that stain the face of the earth”; and by 1855, a British civil servant urged that the thugs ought to be considered “an infernal machine beneath the keel of the good ship government,” subversive to civilizing measures of the British colonial project.

And indeed they were, not just because they created mayhem and committed murder, but because the practice of thugee was, to use a fancy word for it, incommensurable with the moral outlook of the colonizers.

Thug life was governed by ritual and devotion to the goddess Kali. The corpses of victims were stabbed; the stabbing may have once involved more elaborate rituals of sacrifice. It seems, from British accounts, that the thugs regarded their murders not as crimes but almost as a perfectly legitimate trade into which they were born, as the son of a blacksmith might regard work at the forge. They talked about themselves almost as members of a social caste.

When they were apprehended,

the thugs were unmoved by their fate; in one instance several under sentence of death sung cheerily on their way to the scaffold and hung themselves rather than die at the polluted hands of an executioner who was a leather dresser.

A correspondent named James Paton interviewed a band of captured thugs in 1836 and was shocked by the “relish and pleasure” with which they confessed to horrible crimes. They took pride in good kills, and did not venture out without performing oblations and ablutions and reading omens. They were hunters of men.

European accounts of thugs, beginning with Jean de Thévenot’s late seventeenth-century Voyages, would make a fascinating study in the “documentary” aspect of early anthropology. In the 1850s, the Italian photographer Felice Beato photographed a group of four thugs demonstrating the use of the P’hánsí. I found what I believe is a copy on Mike Dash’s site.