50 Biggest Changes in the Last 50 Years - American Heritage
American Heritage 2004
Feb-Mar 2004
The 50 biggest changes in the last 50 years
Politics
by Terry Golway
With American Heritage approaching its fiftieth birthday in December 2004, we've asked five prominent historians and cultural commentators to each pick 10 leading developments in American life in the last half-century. We begin in this issue with Terry Golway—the political columnist for the New York Observer, whose books include Irish Rebel: John Devoy and America's Fight for Ireland's Freedom and So Others Might Live: A History of New York's Bravest—selecting the ten biggest changes in politics. In the next four issues we'll follow with our other authorities' choices of the half-century's biggest transformations in innovation and technology; business; home and the family; and entertainment and culture.
Unlike T. rex, communism, and your beloved local hardware store, clever politicians have little problem adapting to change, even the sort of precedent-shattering, go-where-no-human-has-gone-before change that might terrify most mortals. In fact, the craftiest politicians—the strongest, if you will—find ways to make evolution work for them. Franklin Roosevelt understood and harnessed the power of radio. The old urban machines reached out to immigrants in the late nineteenth and early twentieth centuries, and reaped the benefits. Andrew Jackson showed that in a raucous democracy, it helped to be a little raucous yourself.
Similarly, despite great changes in politics since 1954, politicians have adapted, and by any measure they appear to be thriving. They raise money through the Internet (thank you, Howard Dean). They embrace technology that allows them to track their popularity and perhaps—believe it or not—shape their beliefs on a daily basis. They understand the importance of including the formerly excluded. A half-century ago, who could have foreseen that a Republican President would one day appoint an African-American from the Bronx as Secretary of State, a job held in 1954 by John Foster Dulles? And they have shed their formality to better suit an informal age. We don't think twice when we see the President of the United States dressed in jeans, but just try to picture Harry Truman in a pair.
How many of these changes represent something new, and how many are simply variations on a theme? Ah, that is the question!
Nowadays it is common to read that the nation's political dialogue has become crude, vulgar, and even hateful. The bestseller lists are crowded with titles that accuse the President of being a liar and his critics of being traitors. This level of discussion, several commentators have suggested, is a dramatic change from the halcyon days, when debates were polite and Democrats and Republicans happily shared cocktails after a long day of lawmaking.
Hmm. What would Abraham Lincoln make of this nostalgia for a kinder, gentler political debate, as he gazed at commentaries likening him to a monkey? Supporters of John Adams and Thomas Jefferson had some pretty strong words for one another during the campaign of 1800. And let us not forget that in 1954 the most dominant figure on Capitol Hill was a senator from Wisconsin named Joseph McCarthy.
While the tone of today's political debates certainly has an unfortunate edge, the coarsened discourse does not represent a revolution in American politics. This kind of change is not a tidal wave but merely ebb and flow.
With those caveats, here is one person's view of the ten most dramatic changes in American politics since 1954. If you disagree, call me any name you wish. We've heard it all before.
1. The Expansion of Voting Rights
The Voting Rights Act of 1965 surely fits the definition of revolutionary, once-in-a-lifetime change. In 1954, African-Americans in the South were utterly disenfranchised, sometimes through such devices as poll taxes and literacy tests, often through outright intimidation. Jim Crow was at its zenith, and Southern politicians were determined to keep it there. According to Justice Department figures, as recently as 1965 only 19.3 percent of eligible blacks in Alabama were registered to vote (the white figure was 69.2 percent). In Georgia, 27.4 percent of blacks were registered, as opposed to 62.6 percent of whites. And in Mississippi, an appalling 6.7 percent of blacks were registered, compared with nearly 70 percent of whites. For a black man or woman in the South in 1954, the glory of the ballot box was a cruel mirage.
The suppression of voting rights in the South was hardly a secret. It was the sort of injustice that mainstream politicians sometimes ignore, or, worse, indulge, for their own political reasons. But in 1965, a Democratic President from Texas, Lyndon Johnson, decided to put an end to the government's complicity in this outrage. He demanded, and in due course received from Congress, a voting rights bill that would demolish obstacles placed before black voters. The Voting Rights Act of 1965, enforced by the full weight of federal power, brought an end to the days of whites-only voting in the South. Within 25 years, black registration in seven Southern states (Alabama, Georgia, Louisiana, Mississippi, Virginia, and the Carolinas) was roughly the same as it was for whites. The number of black elected officials went from zero in 1960 to nearly 300 in 1992. And, by the 1990s, holdovers from the Jim Crow era of Southern politics found themselves in the unlikely position of courting black voters.
And there's more. This revolutionary piece of legislation continues to influence American politics, long after poll taxes and literacy tests were tossed into history's dustbin. The Justice Department aggressively monitors congressional reapportionment throughout the country, not just in the South, to make sure that gerrymandering does not dilute the voting power of minorities. That mandate flows from the Voting Rights Act, the single most important change in American politics since 1954.
2. Television
In 1954, it was still what the humorist Fred Allen called a piece of talking furniture. Politicians didn't know what to make of it, if they ever thought about it at all. President Eisenhower said he couldn't imagine anything more boring than watching himself on television. He wasn't kidding. Ike's TV appearances were made for radio.
Then, of course, came John F. Kennedy, tan, young, and handsome, and neither television nor politics has ever been the same. The familiar story of JFK's first debate with Richard Nixon in 1960 sums up the power of this new medium and the way it changed politics. Those who listened on radio thought Nixon was the winner; those who let their eyes do the thinking backed Kennedy. And we've been feasting our eyes ever since.
With the profusion of local cable channels and public-access programming, candidates for even the lowliest local offices must consider the power of TV. Presidential candidates began to adapt to the medium's demands in the 1960s; today, even candidates for state legislature or city council are coached to speak in sound bites and maybe drop a few pounds to look better for the cameras.
It is easy to bemoan television's influence for all the obvious reasons (will we ever elect another bald President, even if he happens to be a five-star general?). But those harsh studio lights also allow us to see our leaders up close and sometimes unscripted. Fifty years ago, politicians communicated with their constituents via letters and newsletters that were written by their staffs. Now when mayors, aldermen, and dogcatchers answer questions on live television, there is nothing between the viewer and the official's thought process. It is sometimes a scary prospect. But it is also illuminating.
3. The Success of the Conservative Movement
With the inauguration of Dwight Eisenhower in 1953, moderate Republicanism seemed triumphant. Robert A. Taft, the isolationist conservative from Ohio, had been defeated at the 1952 Republican convention by the party's moderates and liberals. The New Deal would not be repealed; the era of consensus politics had begun. The postwar era would belong to internationalist, big-government Republicans like Nelson Rockefeller.
But then a dissenter from Arizona, Barry Goldwater, won the party's presidential nomination in 1964. He was defeated in a landslide, which was interpreted at the time as another repudiation of the Republican Party's right wing. Not exactly. In 1980, another politician from the Sunbelt, Ronald Reagan, defied expectations, upset the party's old guard (which supported the moderate, internationalist George H. W. Bush) and captured the Presidency. Conservatives were no longer mere political curiosities who read National Review. They were, in fact, mainstream politicians who clearly had a message millions longed to hear.
Reagan's election and the movement that supported him reordered the nation's political demographics. They created a new voting bloc known as Reagan Democrats. In the Northeast and the Midwest, these voters were, generally speaking, white, Catholic suburban homeowners, solidly middle-class and often members of labor unions. In the South, they were blue-collar white Protestants. Their parents and grandparents had been stalwart New Dealers from the old industrial cities, but by 1980, they were alienated from the party of their forebears. Ronald Reagan spoke to them in a way Democrats hadn't since Harry Truman. By 2000 they were no longer Reagan Democrats. They were simply Republicans.
4. The Decline and Fall of New York
Who would have predicted it in 1954? Just ten years before, in 1944, the Empire State had had a monopoly on presidential candidates: Both Franklin D. Roosevelt and Thomas E. Dewey were New York governors who rose to the top in part because of their state's extraordinary political power. It had the nation's largest congressional delegation (and thus the most electoral votes), and a New York governor ran for President in every election from 1928 to 1948.
The nexus of national politics has moved from New York to the South and West
But no New Yorker has won a major-party presidential nomination since Dewey in 1948. The Empire State is now the third most populous state, and its delegation in the House has shrunk from 43 to 29. New York now has fewer electoral votes than it had in 1884 (when its 36 electoral votes were decisive in electing Grover Cleveland, another New York governor who made good). While New York remains a place candidates visit to collect campaign contributions, it is no longer the state parties look to for national leaders. The state's junior senator may yet reverse this trend, but then again, Hillary Clinton is something of a newcomer to New York.
5. The Rising Sunbelt
This change obviously is not unrelated to the two preceding ones. Reagan's election in 1980, the Republican takeover of Congress in 1994, and the nation's changing demographics have moved the nexus of national politics south and west. California, Texas, and Florida are the new electoral powerhouses, at the expense of New York and the industrial Midwest. Except for Michigan's Gerald Ford, who was never elected in his own right, every occupant of the White House since Lyndon Johnson has come from the South or the West—even that Connecticut Yankee from Texas, George H. W. Bush.
6. Women in Office
The Washington that Harry Truman left in 1953 was a fraternity. The Washington presided over by George W. Bush includes a woman as National Security Advisor; women Supreme Court justices, cabinet members, and members of Congress; a female Minority Leader in the House; and innumerable women lobbyists, staff members, commentators, and reporters. And out in the provinces, women serve in unprecedented numbers as governors, mayors, state legislators, and local officials, positions that were, by and large, male-only in 1954. While many feminists would argue that real power remains in male hands—no woman has yet won national office or been appointed Chief Justice of the United States—there is no denying that women today have far more power and influence in politics than they did 50 years ago.
7. The Almighty Dollar
Yes, money has always had an important place in American politics. Yes, political candidates have always been dependent on the generosity of, er, public-spirited citizens with expendable incomes. But has money ever been more decisive than it is today, at all levels of politics? Probably not. Forget the extraordinary sums raised and spent on national campaigns, and consider the sums involved in local races. In New Jersey, for example, both parties raised and spent about $48 million in the state's off-year legislative elections in 2003; 20 years ago, they spent about $8 million on state legislative elections. In one state senate race, the winning candidate spent $212 per vote, according to the Star-Ledger. The importance of money manifests itself not only in election results but in the political culture. Officeholders and candidates, including the President, now spend far more time soliciting contributions than they did 50 or even 10 years ago. Between shaking contributors' hands and wolfing down rubber chicken, does anybody have time to think any more?
8. The End of National Conventions as We Knew Them
As this magazine noted nearly four years ago, national political conventions still serve a useful purpose. They are where delegates meet one another, they are where ambitious local candidates make their presence known to the national press, they are where a speech can make or break a career. An improperly managed convention can still lead to disaster. And, let's remember, the convention is where a party's vice-presidential nominee is introduced to the public.
That said, the convention just isn't the same and hasn't been since the 1950s. Nominees are selected not in back rooms, not on the convention floor, but in the presidential primaries. And even that is not entirely true. The nominee generally is chosen by late March, in a process that makes later primaries increasingly irrelevant. Gone are the days when Dwight Eisenhower could announce his candidacy in the very year he would stand for election, 1952. When Wesley Clark announced his presidential candidacy in the fall of 2003, most observers believed he was joining the fray far too late. Candidates need time to build organizations to contest the fateful early primaries.
Many political journalists still yearn for the days of dramatic conventions, and every four years somebody will write a speculative piece about a brokered convention. (Have you read about the scenario by which Hillary Clinton becomes this year's Democratic nominee without having entered the primaries?) It never happens. And it never will again.
9. The Demise of the Urban Bosses
Franklin Roosevelt had Ed Flynn of the Bronx by his side. Harry Truman dealt with the Pendergasts of Missouri before he was in the White House. Those were prototypical political bosses, men who ruled over political machines that knew how to turn out the vote. But the bosses, who demanded nothing if not loyalty, have been replaced by consultants for hire, who have applied modern marketing methods to political campaigns. So in place of Ed Flynn, Bill Clinton had Dick Morris, who worked for Republicans as well as Democrats. In place of the Pendergasts, George W. Bush has Karl Rove. The bosses had instincts; the consultants have data. The bosses delivered votes; the consultants deliver polls, focus groups, and pre-tested messages. The bosses lived for politics; the consultants could be selling anything.
If this sounds like a lament for the bosses, let it be noted that politics is a good deal more unpredictable, and more democratic, without them. The old bosses simply would not have allowed a one-term governor from Georgia to run for President in 1976.
Then again, no political boss would have conducted polls to help a President decide where to spend his vacation, as Morris did for Clinton.
10. The Baby Boomers
They changed everything else (or at least they think they did), so why not politics? The next time you hear a presidential candidate discussing his or her choice of underwear, you know who to blame.
[End of the Solid South; Decline of the Cities; Revolution in Voting Behavior; Interstate Highway System]
[1900-1954: End of the Political Machines as television takes over; America as a World Power; War and Cold War]
Apr-May 2004
The 50 biggest changes in the last 50 years
Popular Culture
by Allen Barra
With American Heritage approaching its fiftieth birthday in December 2004, we’ve asked five prominent historians and cultural commentators to each pick 10 leading developments in American life during the last half-century. In this issue Allen Barra, American Heritage’s film reviewer and a wide-ranging historian and cultural critic, whose most recent books include Inventing Wyatt Earp: His Life and Many Legends and Clearing the Bases: The Greatest Baseball Debates of the Last Century, selects the 10 biggest changes in popular culture. In other issues this year our authorities offer their choices of the half-century’s biggest transformations in politics; innovation and technology; business; and the home and the family.
This essay began as a listing of the 10 greatest changes in popular culture in the past 50 years, but the more I mulled it over, the more I grew convinced that the discussion could have meaning only if it focused on people—artists and writers who either were at the forefront of change or best symbolized it. No one would deny that art and culture are the products of complex socioeconomic forces, but if they aren’t also shaped and formed by the personalities and talents of human beings, what is? Andy Warhol didn’t create pop art (well, he did, sort of, though somebody would surely have done it if he hadn’t), but he certainly created the look of pop art as we know it.
How does one measure the 10 greatest changes in popular culture over the past half-century? Well, how do you define “popular culture”? I ask only two things of popular culture: first that it be popular and second that it have something legitimate to do with culture. (Stephen King, for instance, has probably been the most popular novelist over this period, but it would be hard to make a case for his changing our culture.)
Culture takes in many different art forms, so I have tried to include people whose intelligence, creativity, and dynamism have effected change in jazz, rock, film, television, and pop art as well as literature and journalism. To say that I could have legitimately extended this list to 20 or even 30 names in no way lessens the impact of the 10 (or 11) offered here.
[Playboy (Sex), Drugs, Rock 'n' Roll, Middle Class prosperity, Freeway system and Travel, Suburbs]
1 James Dean
Yes, Brando, of course. But James Dean popularized the same acting style and made it a focus for teenage rebellion. It could be argued, in fact, that as a beacon for teenage angst, his image predates rock ’n’ roll.
It could also be argued that by dying dramatically in 1955, he did as much for the next generation of actors as Brando did by living. Nearly every moody, sexy young actor to follow in his wake, from Paul Newman (who was cast in roles that would have gone to Dean) to Benicio Del Toro (who gave Dean’s style an ethnic flavor), owes him much.
2 Miles Davis
Let us count the ways in which he influenced popular culture. No jazz musician since Louis Armstrong has been so widely known or popular. For better or worse—you make the call—he pioneered jazz-rock fusion. And he became the first and only jazz artist to create an image of rock-star proportions.
Have we left anything out? Oh, yes, as the story goes, he told a middle-aged white woman at a party when she asked what he had done to be invited, “I changed music about five or six times. What have you done?”
3 Raymond Chandler
Most of Chandler’s short stories, books, and film scripts were written in the thirties and forties, though the script that was arguably his best, done for Hitchcock’s Strangers on a Train, wasn’t filmed until 1951, and his best novel, The Long Goodbye, was published in 1953. Still, although Chandler evokes the forties as does no other American writer, his real influence was to be on later decades.
Chandler perfected the American private-eye genre begun by Dashiell Hammett in the late twenties, but more important, his oeuvre helped create the look and feel of film noir that haunts Hollywood to this day. (See Memento and the film version of L.A. Confidential.) His influence can be perceived in such varied works as Antonioni’s Blowup (which hinges on a murder recorded on film, a play on Chandler’s Playback) and Blade Runner (which projects Chandler’s Philip Marlowe into a futuristic Los Angeles). Martin Scorsese got the title for Mean Streets from Chandler.
And, for what you might think it’s worth, Chandler has been cited as the grandfather of the graphic novel.
4 Pauline Kael
The greatest age of American film began in the late sixties and ran till the end of the seventies, and no one did more to create the intellectual climate that helped usher it in than Pauline Kael, who was the most visible and eloquent champion of Bonnie and Clyde, The Wild Bunch, M*A*S*H, Last Tango in Paris, The Godfather, The Godfather Part II, Carrie, Mean Streets, Nashville, and virtually every other American and foreign film with deep cultural impact in that period.
Kael was the first American writer to champion exhilarating domestic popular films over highbrow European cinema. “There is more energy,” she wrote in 1964, “more originality, more excitement, more art in American kitsch in Top Hat, Strangers on a Train, His Girl Friday, The Crimson Pirate, Citizen Kane, The Lady Eve, To Have and Have Not, The African Queen, Singin’ in the Rain, Sweet Smell of Success, The Manchurian Candidate, The Hustler and Hud than in the presumed ‘High Culture’ of Hiroshima Mon Amour, Last Year at Marienbad, La Notte, and The Eclipse.”
No other American critic of any art form, before or since, has had anything like her impact.
5 Buddy Holly
Well, yes, Elvis, of course, but American boys couldn't realistically hope to grow up to be Elvis, whereas any horn-rimmed-glasses-wearing geek with a couple of friends and a garage to practice in might, in theory, become the next Buddy Holly. Holly and the Crickets were the first three- and four-piece rock band to go into a studio and record as such, and with the possible exception of Chuck Berry, who approached rock 'n' roll from the other end of the color spectrum, no one did as much to fuse black blues and raucous white country music into the monster that would become rock 'n' roll. Holly's death in a plane crash in 1959 set the standard for rock-star denouements and inspired Don McLean's "American Pie." About a year after Holly's death, some English kids in Liverpool got together and began looking for a name for their band that would remind people of the Crickets.
6 Andy Warhol
If we leave aside all debates based on purely aesthetic matters—and who knows whether the basis for such debates even exists today—Warhol’s influence swamps that of any 10 other artists you could name combined. As the art critic Dan Bischoff writes, “Andy Warhol was the most influential artist of the past fifty years … because with Warhol came the retirement of a whole raft of skills that artists had needed to have before in order to create art—the talent for painting, the talent for putting together complex images, arranging a sort of depth in two dimensions.” Warhol was the first artist for whom the subject became culture itself—that is, the popular culture of modern times, and not the way it was represented or rendered.
7 Frank Sinatra
So much has been written about Sinatra’s significance as a cultural icon that it’s easy to forget he was first a great popular artist. Sinatra was the first recording artist—and, arguably, the only one before the Beatles—to grasp the import of the album as the primary artistic and economic unit of pop music. He is the only pop singer of the fifties whose records have remained continuously in print, and no one, not even the Rolling Stones or Bob Dylan, has yet matched his longevity as an album artist.
8 Ernie Kovacs
Television is both a medium and an art form, and no one used the former to influence the latter like Kovacs, whose mix of video tricks, quick cuts, voice-overs, blackouts, and plain dark humor had a profound effect on everything from Laugh-In back in the sixties to today’s late-night talk shows and cutting-edge cable comedy. Kovacs wrote the grammar for modern television.
9 Norman Mailer
American culture became too fragmented in the second half of the twentieth century for any fiction writer to dominate his age the way Hemingway and Fitzgerald had done theirs, but even though he never wrote the great American novel, Norman Mailer may have had a greater impact on his time than Hemingway and Fitzgerald did on theirs. Who else produced so much vital work over so long a period or kept his finger on the pulse of the American heartbeat for so long? And if Mailer didn’t invent the new journalism, he was its most creative and prolific exponent.
“Literature,” Octavio Paz once said, “is journalism that stays journalism.” Mailer’s journalism has stayed journalism.
10 Francis Ford Coppola
Certainly there are directors with more box-office successes, but no one else in the history of American film has combined commercial success with critical acclaim as Coppola did in The Godfather and The Godfather Part II. No other American films added so many phrases to the lexicon; who among us has not, at one time or another, heard or used “I’m gonna make him an offer he can’t refuse,” “Luca Brasi sleeps with the fishes,” “Leave the gun, take the cannoli,” “It’s not personal, it’s strictly business,” “We’re bigger than U.S. Steel,” “Badda-bing!,” even “Forgetaboutit” (from a scene deleted from the original and restored by Coppola for the TV version).
The Godfather films made enduring stars of Al Pacino, Robert De Niro, Robert Duvall, Diane Keaton, and James Caan. Coppola's great set pieces, the opening wedding scene and closing baptism scene in the first film and the Havana sequence from the second, have been imitated endlessly. Perhaps more than any other American movies, Coppola's masterpieces have justified the concept of the extended-version DVD.
And without Francis Ford Coppola, there would be no Sofia Coppola, one of the most interesting new directors of the twenty-first century.
And, just for good measure …
11 Malcolm X
Forget, for a moment, the political and religious aspects of Malcolm's life, and consider just his cultural impact. Without Malcolm, Cassius Clay would never have become Muhammad Ali. Richard Pryor would probably not have become the greatest standup comic who ever lived. Alex Haley would not have cowritten Malcolm's autobiography and been inspired to ponder his own roots. Spike Lee, perhaps, would never have become a filmmaker. In the pungent phrasing of the jazz critic Gene Seymour, "For better and for worse, the whole culture of grievance with attitude, of shock-and-awe rhetoric in public discourse does not exist without Malcolm X, whether in hip-hop nation or even in talk radio."
Jun-Jul 2004
The 50 biggest changes in the last 50 years
Business
by John Steele Gordon
With American Heritage approaching its fiftieth birthday in December 2004, we’ve asked five prominent historians and cultural commentators to each pick 10 leading developments in American life during the last half-century. In this issue John Steele Gordon, American Heritage’s “The Business of America” columnist and the author of An Empire of Wealth: The Epic History of American Economic Power, 1607–2001, which will be published in October by HarperCollins, selects the 10 biggest changes in business. In other issues this year our authorities offer their choices of the half-century’s biggest transformations in politics; popular culture; innovation and technology; and the home and the family.
In 1954 the gross domestic product of the United States—the sum of all the goods and services produced in the country that year—was about $380 billion. In 2003 it was $10.9 trillion, more than 28 times as great in nominal terms. Even allowing for the very considerable inflation in the last 50 years, the economy is roughly 6 times as large as it was when American Heritage made its first appearance. So the biggest change in American business in the last 50 years has been, simply, the growth of the American economy as a whole.
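As a quick sanity check on those figures, here is a minimal arithmetic sketch (in Python, purely illustrative); the roughly fivefold price-level rise it derives is implied by the article's numbers rather than stated in it:

```python
# Illustrative check of the growth figures quoted above.
gdp_1954 = 380e9        # nominal GDP, 1954 (dollars)
gdp_2003 = 10.9e12      # nominal GDP, 2003 (dollars)

nominal_growth = gdp_2003 / gdp_1954               # ~28.7x ("more than 28 times")
real_growth = 6.0                                  # the article's real-terms estimate
implied_price_rise = nominal_growth / real_growth  # ~4.8x cumulative inflation
print(f"nominal: {nominal_growth:.1f}x, implied price-level rise: {implied_price_rise:.1f}x")
```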
But how all that wealth is created—who creates it, and by what means—has changed almost beyond imagining. The reason is plain enough: the computer. It is the most profound technological development since the steam engine ignited the Industrial Revolution two centuries ago, perhaps since the agricultural revolution ignited civilization itself 10 millennia ago. None of the biggest changes in business in the last 50 years would have been possible—or would have evolved as they did—had it not been for the computer. So while it easily ranks as the most important change, the computer, in truth, is behind nearly all the changes.
1 The Computer
Look at a photograph of a typical office of the mid-fifties and one of 2004, and the difference is instantly obvious: Every desk in the office now has a computer on it. Today half of American workers use computers on a daily basis in their jobs; in 1954 perhaps one-tenth of one percent did. Moreover, it is not just office workers who use computers. Farmers, garage mechanics, dentists, lumberjacks, and people in a thousand other job categories now use computers in the daily course of business, for purposes unique to each occupation.
To be sure, in 1954 computers were already making inroads into American business, especially in areas where data processing was very intense, such as banking and insurance. But they were huge and hugely expensive, kept in special air-conditioned rooms and tended by men in white coats. Very few Americans had ever actually seen one. Today about the only way for an American not to see one every day would be to stay in bed with the lights off.
The difference is the development, beginning in 1969, of the microprocessor, essentially a dirt-cheap computer on a silicon chip. A little more than a decade later, the calculator had sent the slide rule into oblivion, word processing had made the typewriter a relic, and Apple Computer had introduced the personal computer.
2 Globalization
In 1954 American exports totaled less than $14 billion, or 3.7 percent of GDP. In 2001 exports amounted to $729 billion, or 7.2 percent of GDP. Fifty years ago the American economy was effectively an island. The only great power whose industrial base had been strengthened, not diminished, by World War II, the United States was still self-sufficient in all but a few commodities (such as tin). Almost all the cars on the road in 1954 were, in their entirety, American-made by the Big Three auto companies, plus American Motors. What foreign automobiles there were, were mostly in niche markets, such as sports cars.
Today that is but a distant memory. American Motors is long gone, and the Big Three have only a little over half the American automobile market, about what GM had all by itself 50 years ago. The number one best-selling car in this country is not a Ford or Chevrolet; it’s the Japanese Toyota Camry. But many of the “foreign” cars on the roads today are in fact manufactured in the United States, and many of them are subsequently exported to other countries.
Automobile companies no longer have nationalities, except perhaps in terms of the locations of their corporate headquarters and greatest concentration of stockholders. Ford now owns Sweden’s Volvo and Britain’s Land Rover; GM owns Saab; Chrysler is part of the German Daimler. Every major automobile company manufactures parts or assembles vehicles in many countries. General Motors, with 15 percent of the global vehicle market, manufactures in 32 countries and sells in 192. The other great auto companies are equally dispersed.
That is increasingly true of companies in other lines of business, such as electronics and computers. It is also becoming true of retail companies. Wal-Mart, the world’s largest company in terms of gross revenues (if it were a sovereign nation, it would have the world’s thirtieth-largest economy), sells in 10 countries and buys products in many more. McDonald’s and KFC peddle their wares from Bangor to Bangalore, from Peoria to Paraguay, while American shopping malls are full of foreign products and foreign companies selling them.
3 Communications
In 1950 about a million overseas phone calls originated in the United States. In 2001 the number was a staggering 6.27 billion. In 1954 only radiotelephony, with a very limited capacity, was available. Today a cat’s cradle of undersea cables together with communications satellites provides nearly limitless capacity. In 1954 you needed a reservation to make an overseas phone call, and it would likely have cost a significant percentage of your weekly wage. Now it often costs less to call London than it did to make a local call 30 years ago. When you dial an 800 number, you may well find yourself talking—absolutely free—to someone in India.
Even domestically the increase in communications has been profound. You cannot walk down a city street, or even a supermarket aisle, without seeing people talking away on cell phones. Airports are full of folks working on laptop computers, connected wirelessly to their companies’ computer networks, while they wait for their flights. The increase in executive efficiency as a result has been enormous (and the ability to “get away from the office” much diminished).
4 The Financial Revolution
Many Americans in 1954 still handled their financial affairs largely in cash. They received their pay in cash, and they paid their bills in cash. The reforms of the New Deal had ended the fear of banks’ collapsing, so many families maintained savings accounts to safeguard their rainy-day funds, but far fewer had checking accounts or ready access to bank credit.
In 1951 a Long Island banker named William Boyle invented the credit card. It was a classic capitalist win-win-win situation: Credit cards allowed merchants to avoid the expense and risk of maintaining charge accounts; they gave banks handsome profits on unpaid balances; and they spread credit, formerly reserved largely to the affluent, to a whole new class of consumers. By the 1960s credit cards were common. Today they are ubiquitous, with 1.2 billion in use in the United States in 2002 by 190 million cardholders. Thanks to credit cards—and their latter-day descendants, debit cards—cash is rapidly disappearing from American retailing.
Other parts of our economy's financial sector have also grown beyond all expectations in the last 50 years. In 1954 the Dow Jones Industrial Average finally topped its 1929 high of 381.17. Today the Dow Jones stands more than 26 times higher than it did then. In 1954 there were 115 mutual funds in operation in this country, with investments worth $6.1 billion. In 2002 there were more than 10,000 mutual funds controlling $7 trillion in capital. The percentage of people owning stocks and mutual funds has grown explosively as well, with over half the population directly holding financial securities. Many more have interests in pension funds.
Capitalism has become truly democratic in the last 50 years, and it is getting more so every day.
5 Management and Labor
In 1954 more than a third of all American workers belonged to unions, mostly of the old-fashioned blue-collar variety. In 2002 only about 14 percent did. But that doesn’t tell the whole story, for nearly half of today’s union members are government employees, such as teachers and hospital workers, virtually none of whom were unionized in 1954.
Meanwhile, the number of strikes has greatly diminished. In 1960 there were 222 work stoppages involving more than 1,000 workers, with 13,260,000 workdays lost. In 2002 there were only 9 such strikes, with 660,000 lost workdays, although the size of the American work force has doubled in the last 50 years.
6 Productivity
Part of the reason for the decline of the labor movement is the shift from manufacturing to services as the major source of jobs in the American economy. The United States has not stopped making things (total manufacturing output grew by more than a third between 1990 and 2001) but is becoming ever more efficient at it, thanks to the rapidly increasing use of computers in the process.
The recent history of manufacturing in this country is very similar to the longer history of agriculture. Farm production has steadily increased, while the percentage of the population living on farms has steadily declined, as has the percentage of GDP that is derived from agriculture. That trend did not stop in the last 50 years; it accelerated.
In 1954, 11.6 percent of the population lived on farms. Today less than 2 percent does. The amount of land devoted to agriculture has also declined, from 1.16 billion acres in 1954 to 941 million in 2001. In 1954, 82 million acres were planted in corn; in 2002 the number was down to 79 million. Yet corn production went from close to 3 billion bushels to 9 billion. Wheat acreage has held steady at about 60 million, but production has gone from 984 million bushels in 1954 to 1.6 billion in 2002.
7 Women
In 1954 the typical American woman was a housewife. That is certainly no longer the case, with more than 60 percent of American women in the work force. Moreover, women in business are no longer confined to the steno pool by any means. (The steno pool, of course, disappeared years ago.) In 1967 Muriel Siebert became the first woman to own a seat on that ultimate male bastion, the New York Stock Exchange. All major corporations now have female executives, half the Forbes 500 companies have female corporate officers, and eight have female CEOs. There is no question that these numbers will rise as talented women who started working in the last few years reach their career peaks.
While the feminist movement has had a powerful influence on the attitude toward women in business in the last 50 years, there has been another major factor in increasing the number of women in business rather than the home: domestic productivity. Greatly improved and much more widely distributed household appliances—washing machines, dryers, stoves, dishwashers, microwave ovens, icemakers, vacuum cleaners—make housework far less time-consuming than it was 50 years ago, while the proliferation of prepared and semiprepared foods has made it much easier to get a meal on the table. In 1954 only the new TV dinner was available as an alternative to a home-cooked meal. Today there are hundreds of options, and some of them actually taste good.
8 The Imperial, and Imperially Compensated, CEO
A few months before American Heritage first appeared, the last comedy by George S. Kaufman (in collaboration with Howard Teichman) opened on Broadway, and two years later it was made into a highly successful film. It was titled The Solid Gold Cadillac, and it told of chicanery in high corporate places. But while recent scandals have shown that management wrongdoing is still alive and well in American business, if no worse than it was in the past, management compensation has gone through the roof.
Charles E. Wilson, the chief executive of General Motors before becoming Secretary of Defense in 1953, was paid $652,000 a year, plus some stock options (he took a $580,000 pay cut when he left GM to head the Pentagon). That was a tidy sum in the economic universe of the early 1950s, even though 91 percent of it was taxed away by the federal government. In 2002 the CEO of General Motors was paid more than $12 million in total compensation. He owed a maximum of 35 percent in taxes, but because much of that compensation came in the form of stock options, he actually paid far less than that. Many CEOs did a lot better.
9 Antitrust
Antitrust was one of the big political issues in the 50 years before American Heritage was born, but it has nearly disappeared in the half-century since. One reason, to be sure, is that mere bigness is no longer perceived as inherently bad, especially as more and more Americans have become stockholders and thus more inclined to see things from the capitalist point of view.
More important, however, has been the accelerating change in the economy caused by the microprocessor, and the glacial pace at which antitrust suits necessarily run. When the outgoing Johnson administration sued IBM under the antitrust statutes in 1969, the company’s dominance over the American computer industry resembled Standard Oil’s over the petroleum industry 70 years earlier.
But by the time the government abandoned the suit, in 1982, IBM’s hegemony was a distant memory, and it was facing the most difficult decade of its corporate existence. Rapid technological change has proved a far more efficient policeman of the marketplace than any army of antitrust lawyers.
10 The Internet
The Internet is a communications medium and very much part of the communications revolution. But it is so new—barely a decade old as a popular medium—and so fundamentally important that it deserves an entry all its own. As the railroad was to the steam engine, so the Internet is to the microprocessor, the most important spinoff of the basic technology. What is perhaps most impressive about it is that it erupted into existence almost spontaneously. Railroads had to be built with iron and wood and sweat. They were very expensive. The Internet costs so little to operate that almost anyone can have a Web site. That is why there are now about four billion Web pages in existence, and tens of thousands more are added every day.
The Internet allows people with common interests to find one another easily, including buyers and sellers. Thus it performs much the same function as a broker. That, in turn, means that all traditional brokerage businesses—real estate agencies, stockbrokerages, auction houses, travel agencies—must change fundamentally or go out of business.
The news business as well is changing rapidly because of the Internet. Bloggers and Internet journalists like Matt Drudge (who uncorked the Monica Lewinsky scandal) can respond to breaking news much faster than can newspapers and TV-news organizations (although now all major news organizations are on the Web as well). And because a Web site is so cheap to set up and operate, every news organization now finds its mistakes and biases mercilessly revealed by what the New York Times columnist William Safire has dubbed the “gotcha! gang.” Retailing as well is moving to the Web, growing at about 30 percent a year. This is very bad news for the printers who produce catalogues and the post office that delivers them.
As we look back on the past half-century of business in America, we see not only change—our restless country has always offered that—but something truly singular, change on a vaster scale than has happened during any 50-year period since the lookout on the Santa María first sighted land.
[Jet Plane, Satellite Technology]
[1900-1954: Corporation; Corporate Mergers & Acquisitions; Automobile; Electricity; Telephone; Nuclear; Aerospace]
Aug-Sep 2004
The 50 biggest changes in the last 50 years
The Home and Family
by Paul Berman
With American Heritage approaching its fiftieth birthday in December 2004, we’ve asked five prominent historians and cultural commentators to each pick 10 leading developments in American life during the last half-century. In this issue Paul Berman, a contributing editor to The New Republic and the author of Terror and Liberalism, published by W. W. Norton & Company, selects the 10 biggest changes in the American home and family life. In other issues this year our authorities offer their choices of the half-century’s biggest transformations in politics, popular culture, business, and innovation and technology.
What have been the 10 greatest changes in American home and family life during the last half-century? I think the first of these changes has turned out to be the deepest of all—the change that set into motion all the other changes, the prime mover. This was, oddly enough, the change mandated by the Supreme Court in its 1954 ruling on …
1 Brown v. Board of Education.
The Brown decision ordered the end of racial segregation in the public schools, on the ground that racial segregation means racial hierarchy, and government-sanctioned racial hierarchy runs counter to the democratic spirit of the Constitution.
You may ask, What has this got to do with families and the home? Everything, oh, everything, in my view. But in order to explain why I think so, I must defer to one of the greatest authorities on family life who ever lived—Honoré de Balzac. In the series of novels and novellas he called The Human Comedy, Balzac catalogued the changes that had overtaken French family life during his own time, the early nineteenth century. These changes were vast. And in Balzac’s judgment, they were horrendous. Daughters became contemptuous of their fathers (Le Père Goriot). Sons were careless of their family’s hard-earned wealth (ibid.). Homosexuals inflicted crime on the rest of society (ibid.). Cousins were monstrous (Cousin Bette). Husbands were indifferent to the material wealth of their own families (ibid.). Wives were unfaithful (practically the entire Human Comedy). And so on. And what was the ultimate source of these many dismaying changes, the moral catastrophe of French family life?
Balzac thought he knew. The ultimate source of the many disasters was the beheading of King Louis XVI in 1793. Until that moment family life in France, as Balzac imagined it, had floated serenely through the waters of a well-ordered society. Fathers and husbands ruled with a firm, just, and loving hand. Wives were obedient, pious, helpful, and ardent. Children loved and obeyed their parents. Cousins were un-monstrous. All society followed the pleasing customs of fidelity and morality, and these excellent customs were aromatized by a delicious feeling of passionate love in correct and Church-sanctioned ways. The social classes upheld the principles of mutual responsibility and honor. And all this, the splendid orderliness of a well-organized society, rested on a single foundation, which was the principle of duly-constituted, legitimate authority. This was the principle of social rank and hierarchy. It was the principle of nobility and of upper nobility—the principle, finally, of monarchy.
Alas! In 1793 the great diabolical crime was committed. The guillotine blade descended, the king’s head was severed from his body, and society was likewise severed from its legitimate governing principle. All hell thereupon broke loose, in Balzac’s view. The sacred bonds of family life disintegrated. Crime triumphed over duty. And Balzac, wide-eyed in astonishment, his curly hair standing on end at the mere thought of how dreadful were the scenes around him, dipped his pen into the inkwell and set out to record the scandalous consequences.
Balzac’s estimation of the French Revolution and its results is not universally shared. Some people have pointed out that monarchy had its shortcomings, feudalism was not everything it was cracked up to be, the Rights of Man was good, and the French Revolution was, all in all, a worthy project. This has always been my own judgment on French history. I take a sans-culotte-ish view of these things. It was a pity about the king and his head. But the ancien régime had to go. Still, I grant that Balzac did notice something important. He correctly understood that the most intimate details of family life rest in mysterious ways on the largest and most public of political principles. He noticed that a change in the foundation of political principles may well wreak considerable changes in the intimate regions of family life.
But enough about the France of long ago. What about America? In our country we never did have much of a feudal past, except here and there, ages ago. Nor was our Revolution anything like the one in France. Nor have we ever had a king of our own. We do have Presidents. But we have never had to behead any of them, though the temptation to do so has sometimes been great. Yet we did in the past have a firmly mandated and legally binding principle of social authority, which somehow or another dominated every phase of social life. This was the principle of racial hierarchy, a principle that descended into modern American life from the slavery of yore, the principle that put white people at the top and black people at the bottom.
In 1954 the Supreme Court decreed an end to that principle. Everyone knows that Brown v. Board of Education did not exactly bring about a revolution in boards of education all over America. Schools remained segregated just as before, and white schools tended to be better, and black schools worse, and the realities of racial hierarchy never did come to an end. Then again, in France beheading the king did not exactly put an end to the realities of social hierarchy either. Even today a surprising number of top figures in French life remain people with a de in their names, signifying aristocracy, as in Dominique de Villepin, the former foreign minister. Still, beheading the king back in 1793 did bring to an end an important principle—namely, the principle of monarchy, therefore of authority as a whole, taken in its ancient feudal version. Brown v. Board of Education did the same, in an American version. The decision brought an end to the principle of racial hierarchy and therefore to the many other kinds of authority that were somehow linked to the principle of racial hierarchy. And what was the effect on American life?
Balzac figured that all hell broke loose in French families after 1793, and many a commentator has concluded that all hell likewise broke loose in American families after 1954. Divorces increased. Promiscuity blossomed. Single motherhood flourished. People took drugs. Children became disrespectful. Homosexuals became much more visible. Homosexuals got married, which they had always done, but now they began to get married to one another. I could go on with the list of horrors, as seen by those who think the list is horrific. But I should like to argue, instead, that from a sans-culotte point of view, the changes that swept across American life in the wake of Brown v. Board of Education have been by and large salutary. The ancien régime in America may have been the good old days for some people, but not for most. The end of the principle of hierarchy in racial relations brought about a thousand changes in American life, among them five hundred alterations in the American family. And these alterations had their positive aspect.
I will list the best, the most admirable, of the changes, beginning with No. 2 because I have already cited Brown v. Board of Education as change number one. The others, in my view, have been:
2 Women became freer
to pursue careers outside the home and therefore realize their own talents, and thus advance the whole of society.
3 Men became freer
to appreciate the full talents of women. (One of the lamest reasons ever put in print).
4 Men and Women became freer (Better lovers, or more frequent lovers, multiple partners, multiple spouses, no marriage)
to become better lovers, I am convinced, because of their greater freedom to be themselves.
5 Parents became more sensitive
to the peculiarities and needs of their children, instead of merely demanding blind obedience. (His parents, maybe)
6 A new sense of honesty arose
that has permitted modern society to take a firmer line against certain kinds of crime—against child molestation, for instance, and against rape. (Incidence is going up. What kind of firmer line was taken?)
7 Marriage:
Young people were no longer pushed into too-early unions. (Pushed into any unions).
8 Homosexuality
came to be looked on by a great many people as an ordinary sexual orientation, instead of as something shameful, sinful, et cetera.
9 Gay marriage:
Homosexuals began to be accepted, in a trend that has lately led, through a process that began with Brown v. Board of Education, to the dawn of legal recognition of gay marriage here and there around the country. And, finally …
10 Tolerance:
The country became a little more tolerant and a little more protective of the right to privacy, as shown by Bill Clinton’s political victory over the many censorious busybodies who tried to have him removed from office in the aftermath of his White House affair.
Freedom, personal growth, sensitivity, amorousness, honesty, tolerance, privacy—these are the salutary changes that have overtaken the family and the home in the years after Brown v. Board of Education. My own guess is that French family life took a turn for the better after the French Revolution, in spite of Balzac. American family life is better, I say, after the civil rights revolution. Let Balzac and the reactionaries beat their nostalgic drums in outraged dismay. Let them count up the numerous downsides. I shall study their books. Some of those books will make a terrific read, I’m sure. Balzac himself is one of the greatest writers who ever lived. Even so, progress is a good thing.
Boy, that’s a pretty lame contribution compared to the other three. People who didn’t live in urban areas or the South wouldn’t even have known about Brown if it weren’t for the riots, North and South. And they came much later. And the same goes for the gay issues. They have fled the small towns for life in the big cities.
I would add:
Birth control pill
Abortion on demand
Reduction in the size of the family from 3-5+ children to two or less (preferably, one boy and one girl).
The automobile gave children freedom from their parents.
Fast food restaurants and the microwave removed the need for a kitchen in the home.
Extra-curricular activities in the schools leading the way towards the loss of the supper hour, and mom and dad learning to be chauffeurs.
Professional sports and gambling
Casino gambling
Drug culture
Oct-Nov 2004
The Fifty Biggest Changes in the Last Fifty Years
Innovation and Technology
by Phil Patton
With American Heritage approaching its fiftieth birthday in December 2004, we asked five leading historians and cultural commentators to each pick 10 leading developments in American life in the last half-century. In this fifth installment, Phil Patton—whose books include Made in USA: The Secret History of the Things That Made America and Bug: The Strange Mutations of the World’s Most Famous Automobile—selects the 10 biggest changes in the realm of innovation and technology. In previous issues we presented our other authorities’ choices of the half-century’s biggest transformations in politics, business, home and the family, and entertainment and culture.
“I can’t imagine how we lived without it.” So we often say about an innovation that has changed our lives. But about the changes that have been most deeply absorbed into the pores of daily routine, we could also often say, “I can’t remember how we lived without it.”
My finger no longer retains the muscle memory of a rotary dial phone. I can no longer remember walking over to a television set to change the channel. When I think of slipping into the back seat of my father’s Oldsmobile, I falsely remember fastening a seat belt. Old television shows are magically remembered in color, and when I recall typing college term papers in the early 1970s, I do so on a click-clacking plastic computer keyboard rather than a massive metal Royal.
Such distortions may be the very definition of what has changed the world most. The year 1954 saw the arrival of the first solar cells, developed at Bell Labs. Boeing was testing a prototype of the 707, the intercontinental jet airliner that would so change patterns of travel and consumption. Elvis was cutting his first records. And computers were just starting to be connected by telephone lines in the creation of the Cold War SAGE air defense system. The broader implications of that development were hardly imagined.
[There was a SAGE operation at the Duluth Air Base. Semi-Automatic Ground Environment, or something like that. Big, tall concrete building, still there, very close to Hwy 53; don’t remember the cross road. It was part of the North American Air Defense Command (NORAD) and the DEW (Distant Early Warning) Line to detect Soviet bombers coming over the North Pole to take out Duluth (and other places).]
The impact of some innovations, such as jet planes, has been striking in its predictability. But small innovations have wrought surprisingly large and unexpected changes in daily life too. Here are enough innovations, large and small, to count on all 10 of what used to be called digits—your fingers.
1 Getting the Dish: The Power of the Satellite
It was all there in Arthur C. Clarke’s famous article “Extra-Terrestrial Relays” in Wireless World magazine in October 1945. Inspired by the discovery of German V2 rockets, which he believed could serve as boosters, Clarke proposed launching earth satellites into geosynchronous orbit to handle radio, telephone, and television communications. By 1962 Telstar was beaming TV images between Europe and the United States.
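Clarke’s arithmetic is simple enough to check. As a rough illustration (the constants below are standard textbook values, not figures from the article), Kepler’s third law fixes the single altitude at which a satellite circles the earth exactly once per rotation and so appears to hang motionless in the sky:

```python
import math

# Kepler's third law: T = 2*pi*sqrt(r^3 / mu). Solving for r gives the one
# orbital radius whose period matches the Earth's rotation.

MU = 3.986004418e14              # Earth's gravitational parameter GM, m^3/s^2
SIDEREAL_DAY = 86_164.1          # one rotation of the Earth, in seconds
EQUATORIAL_RADIUS = 6_378_137.0  # meters

r = (MU * SIDEREAL_DAY**2 / (4 * math.pi**2)) ** (1.0 / 3.0)
altitude_km = (r - EQUATORIAL_RADIUS) / 1000.0

print(f"orbital radius: {r / 1000:,.0f} km")  # about 42,164 km
print(f"altitude:       {altitude_km:,.0f} km")  # about 35,786 km
```

That 22,000-mile-high perch is exactly where the communications satellites Clarke imagined, and the TV dishes that point at them, still do their work.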
Clarke understood that building ground networks no longer made economic sense, a truth realized as countries all over the Third World leapfrogged straight to wireless phones and satellite TV. The echoes of that article are still resonating in such events as Rupert Murdoch’s installation as the TV baron of China. Satellite phones remain challenged by cost and power demands, but their potential impact was illustrated a few years ago by the poignant final moments of a trapped Mount Everest climber phoning his wife with his last words and more recently by the pixelated pictures from the Iraqi war front generated by satellite phones.
In the western North Carolina valley where my ancestors lived for a century and a half, television reception was long limited by the mountains, and the population was too poor and too sparse to justify investment by cable companies. My cousins and neighbors could see only two fuzzy channels before the arrival of the TV satellite dish. But then this area of Appalachia quickly came to have a remarkably high number of the dishes. Now the mountaineers can keep up with gossip about Hollywood stars as easily as with that about their cousins in the valley.
2 The Silicon Frontier: Technology as Manifest Destiny
We’ve all heard by now of Moore’s Law, the dictum laid down by the Intel cofounder Gordon Moore in 1965 that holds that the number of transistors and therefore the capacity of a silicon chip must rise exponentially. The Intel 8088 processor in the first IBM PC had 29,000 transistors. Today’s Pentium 4 has up to 178 million.
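Those two chips, in fact, let you check Moore’s pace on the back of an envelope. Here is a minimal sketch, assuming the 8088-powered IBM PC dates to 1981 and the 178-million-transistor Pentium 4 to 2004:

```python
import math

# A back-of-the-envelope check (my own arithmetic and dates, not the
# article's) of the two transistor counts quoted above.

t0, n0 = 1981, 29_000        # Intel 8088 in the first IBM PC
t1, n1 = 2004, 178_000_000   # top-end Pentium 4

doublings = math.log2(n1 / n0)               # about 12.6 doublings
years_per_doubling = (t1 - t0) / doublings   # about 1.8 years

print(f"{doublings:.1f} doublings in {t1 - t0} years,")
print(f"or one roughly every {years_per_doubling:.1f} years")
# Close to the doubling every 18 to 24 months usually quoted for Moore's Law.
```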
The importance of Moore’s Law, however, lies not just in what chips have done better and better—like running automobile engines more efficiently, regulating the browning of toast, and printing professional-looking flyers for the high school dance—but also in the pace at which their power has advanced, as relentlessly as did the frontier in the nineteenth century. Because of this, marketing and sales staffs have been able to set up a steady pattern of declining prices and new fashions in technology. “Adoption curves” have shot upward on the chart of time. Today’s cutting-edge device for the “early adopter” is tomorrow’s, or even today’s, strip-mall commodity.
Technical advances just over the horizon are like the empty lands of the nineteenth century. Exploitation of the manifest destiny of silicon has reinforced all the patterns of the Old West: speculation, competition, shootouts, and boomtowns and ghost towns.
3 Laser “Lite”
For those of us who grew up on the promise of the laser as a powerful ray gun, slicing up steel plate and boring holes through stone, the unexpected turn has been instead the spread of the low-power, low-cost laser.
It comes as no surprise that Boeing wants to mount antimissile lasers on jets, but it’s astonishing that the soldier in the field can pick out targets with his red laser pointer—and the regional sales manager can target data on his PowerPoint presentation with a pocket-size version of the same thing. We might have guessed that lasers would reshape the corneas of the myopic, but who would have anticipated the laser in a $30 device at the local Wal-Mart playing music or movies from discs?
4 Pumping Calories: The Heat Pump
At Seaside, the planned town in the Florida Panhandle built in the 1980s to elaborate the ideas of the New Urbanism, the architecture melds old Charleston galleries with bungalows and farmhouses in an American village so archetypical it was used as the backdrop for the film The Truman Show. Picket fences are required by town ordinance. But look behind the fence of the majority of houses in Seaside, and you’ll encounter the jarring sight of a mechanical minitower—a heat pump.
The heat pump changed Everytown, U.S.A., and helped create what we began in the early 1970s to call the Sunbelt. The device was developed just after World War II by Professor Carl Nielsen of Ohio State University and an engineer named J. Donald Kroeker, whose engineering firm installed the first commercial unit in the Equitable Building in Portland, Oregon, in 1948. Heat pumps were soon to be found in motels across America.
Basically air conditioners that can be reversed to provide low-demand heating systems, they made life tolerable in the Sunbelt, and at low cost. The heat pump removed the need for radiators or vented-air heat in much of the southern half of the country while supplanting the window-installed air-conditioning unit. It has flourished everywhere cooling is more important than heating and has supported our national dependence on low energy prices to make life sustainable in our fastest-growing areas.
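Why is moving heat so much cheaper than making it? A minimal sketch, using the textbook Carnot limit rather than anything from the article, shows the theoretical payoff:

```python
# The ideal efficiency of a heat pump is the Carnot coefficient of
# performance for heating:
#     COP = T_hot / (T_hot - T_cold), with temperatures in kelvins.

def carnot_cop_heating(indoor_c: float, outdoor_c: float) -> float:
    t_hot = indoor_c + 273.15
    t_cold = outdoor_c + 273.15
    return t_hot / (t_hot - t_cold)

# A mild Sunbelt winter day: 20 C indoors, 5 C outdoors.
print(f"{carnot_cop_heating(20.0, 5.0):.1f}")  # ~19.5, the theoretical ceiling
# Real machines manage a COP of roughly 2 to 4, but even that means two to
# four units of heat delivered per unit of electricity, against exactly one
# for a plain electric resistance heater.
```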
5 The End of the Blues? The Mechanical Cotton Picker
The mechanical cotton picker killed Broadway, believes Jimmy Breslin. By driving poor blacks off the fields of the South to “Trailways and Greyhound bus depots for the long ride to New York City,” he argues, it sent blacks moving “into the tenements that were vacated by whites,” who themselves moved to the suburbs and abandoned Times Square. “Broadway would no longer be the place of guys and dolls.”
The migration of African-Americans north and west out of the South is the greatest in American history, larger than that from the Dust Bowl to California. Cotton-picking machinery, pioneered in the 1930s by the brothers John and Mack Rust, was mature by the late 1940s, but not until 1960 was a majority of the cotton crop harvested by machine.
The cotton picker soon became a key focus for historians studying the interaction of social and technological forces. The debate is charted in The Second Great Emancipation: The Mechanical Cotton Picker, Black Migration, and How They Shaped the Modern South, by Donald Holley. Did the migration of workers out of the South trigger the adoption of the picker and push the maturation of its technology? Or did the machine displace the workers? Did the appeal of greater freedom and prosperity in the rest of the country pull people off the land and into cities? Or did the disappearance of an agricultural society create a classic displaced proletariat?
What is not in doubt are the consequences: the growth of frequently depressed inner-city neighborhoods and expanding suburban ones, and the transformation of the blues, in its new homes in Chicago and elsewhere, into rock ’n’ roll and hip-hop.
6 Bar Codes and the Universal Product Code
Scanning your own groceries and avoiding the gum-chewing, gossiping checkout girl may be worth it for you, but it’s even more worth it for the supermarket, with its just-in-time inventory. Much of America’s recent productivity growth has been built on new sets of standards and means of marking products. The bar code is the most visible example of this.
The Universal Product Code was the first bar-code symbology widely adopted, endorsed by the grocery industry in 1973. Product coding allows for quick price changes and has abetted the growth of the big-box discount store. Items can be tracked from port to rail to loading dock to shelf, thanks to containerized shipping that uses the codes. The consequence is lowered living costs.
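The machinery inside those stripes is modest. Here is a minimal sketch of the standard UPC-A check digit, the single extra digit that lets a scanner catch its own misreads (the sample number is a common textbook example, not one taken from the article):

```python
# UPC-A codes are 12 digits; the last is a check digit computed from the
# first 11 by the standard weighted-sum scheme.

def upca_check_digit(first_11: str) -> int:
    """Compute the 12th (check) digit from the first 11 digits of a UPC-A."""
    digits = [int(c) for c in first_11]
    # Digits in odd positions (1st, 3rd, ...) are weighted 3, the rest 1.
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return (10 - total % 10) % 10

prefix = "03600029145"
print(prefix + str(upca_check_digit(prefix)))  # prints 036000291452
# A scanner recomputes this digit on every read; a mismatch means a misread,
# which is what lets the checkout lane trust a half-second swipe.
```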
Bar codes are just one of many industry standardizations that have lowered costs and changed life. The American home has doubled in average square footage thanks in large part to standardized building materials (4-by-8-foot gypsum board and plywood, 2-by-4 studs 16 inches apart). Electronics is built on standards such as Windows compatibility, VHS, DVD, and so on. Coded product standards even rule the food in our kitchens. A banana that was once just a Chiquita is now a #4011.
7 Buckle Up: The Automobile Seat Belt
Can you recall a car without a seat belt? The movement to put seat belts in the car began in 1954, when the American Medical Association first recommended them. Ford and Chrysler began to offer them as options a year later. By 1965 they were standard.
The push by safety advocates to require seat belts helped establish the adversarial relationship between government and the automobile industry, which was accelerated by the Clean Air Act of 1970. Detroit grumbled, but the engineering achievement involved in developing the catalytic converter and the air bag, both of which Detroit argued were impractical, suggested that under pressure industry could do far more than it thought. For historians, the story indicated how effective “force fed” technology, demanded by government, could be. For philosophers, it challenged John Stuart Mill’s classic liberal precept that government should not protect the individual from himself. Harley-riding libertarians, agreeing with Mill, have forced a rollback of mandatory helmet laws in some states. Will belt laws be unbuckled next?
8 Seeking Heat: The TV Remote Control
Today’s children watch television in a wholly different way from those of the 1950s. The remote control makes television an environment to be moved through, not a schedule of successive programs. The result is grab-’em-quick programming and short attention spans. Once families clustered together to watch Ed Sullivan. Now a program waited for and seen straight through is the exception rather than the rule.
While scientists at the remote Naval Ordnance Test Center at China Lake were developing infrared heat-seeking guidance for the Sidewinder air-to-air missile in the early 1950s, TV designers were struggling to find a way to change channels from a distance. The first remote control, still wired to the set, bore the apt name Lazy Bone. In 1955 a Zenith engineer named Eugene Polley did away with the wire; his Flash-matic used light, but it didn’t work very well, so it was replaced by the Space Command, which relied on ultrasound—frequencies beyond the range of the human ear. The sounds were generated mechanically in a system that was part chime, part tuning fork, because batteries were inadequate to power a wireless electric ultrasound system.
Not until the 1980s did cheap and dependable infrared technology take over. Today 99 percent of all TV sets come with remote controls, and restless fingers seek hot news and hot new stars unceasingly.
9 …And Going and Going: Better Batteries
We forget how much bigger and slower our portable devices used to be. Remote controls and mobile phones and Game Boys have become possible only with improvements in batteries. Hefty boom boxes are loaded with companies of chunky C cells, but hearing aids, watches, and automobile key fobs contain tiny button batteries that often outlast the devices they power. The change began with the introduction of alkaline and nickel-cadmium cells in the 1960s. Later decades saw nickel-metal hydrides and then lithium produce order-of-magnitude extensions in battery life. But there have been tradeoffs. Most of the substances that make the best batteries are environmental hazards. Nickel, mercury, cadmium, and other heavy metals tossed into landfills and incinerators are among the most dangerous sources of pollutants. And while cell phones can remain on standby for weeks, running a laptop for a whole airline flight across the United States remains a challenge. The hope? That in the future miniature fuel cells will replace batteries altogether.
10 Scoop! The French Fry Is King
In 1954 the first TV dinner arrived. It was a turkey-and-dressing meal packaged in a segmented foil tray in a box printed up to look like a television screen. Frozen industrialized dinners heated in the home kitchen looked like the culinary future. But in 1955 Ray Kroc began the national franchising of McDonald’s and signaled a different pattern, the industrialization of the restaurant kitchen, with machinery and methods allowing the use of untrained labor. More and more meals would be eaten outside the home as standardized chains spread.
Kroc’s kitchen engineer, James Schindler, first broke down the burger production system, the way Henry Ford had broken down auto manufacturing. Then he refined it, the way Toyota had with its just-in-time automaking. Nothing better exemplified the system than the engineer Ralph Weimer’s fry scoop, a metal device that, when slipped onto a waxed bag, measured out an order of fries with a single unskilled swipe.
McDonald’s success has turned less on burgers than on fries, and the fries in turn have depended on a whole supporting infrastructure. As critical to McDonald’s as Ray Kroc himself was the spud king J. R. Simplot, who produced Idaho russets with just the right water and sugar content for proper caramelizing in cooking fat with just a touch of beef lard added. And the potatoes created by a vast growing, freezing, and transportation network end up in the hands of the worker wielding the scoop.
The scoop is an apt symbol of the power of the franchise itself, the business-in-a-box approach that has sprinkled monad-like restaurants and clothing stores across America and the world in the last half-century. What McDonald’s pioneered has been carried out by Starbucks and the Gap and other chains. The colored signal lights that regulate restaurant machinery, the step-by-step photos on training charts in fast-food kitchens, and the just-in-time shelf arrangements at Gap stores—all are exact counterparts of elements in modern automobile factories.
In the franchise nothing is left to chance—or to sheer stupidity. Not long ago, after happily munching our Roy Rogers burgers, we smoothed out the wrapper to discover a small circle printed on its interior. Inside the circle were printed the words PLACE SANDWICH HERE.
Advances in Medical Technology
The Pharmaceutical Industry
Food technology, number of products in the supermarket
Designer babies: in vitro fertilization. The possibility of cloning.