I am turning 60 this month, and I have to say that 60 has been occupying my thoughts quite a bit more than 50 did, or 40, or 30, or even 20 (which was the only birthday I remember being mildly traumatized by, because leaving my teen years meant that I was crossing over into a stretch of my life—a long stretch—during which much, much more would be expected of me).
So I’ve asked myself, What’s so special about 60? It seems to be something more than just being 10 years closer than I was at 50 to the last inning. Nor is it the fact that at 60, many of us feel compelled—I know I do—to think long and hard about the things that we haven’t done, or accomplished, or yet gotten around to, and about how to deal with the resultant emotions and sense of loss.
The whole ugh-I’m-really-getting-older package is, for me, only part of the story of hitting 60. As someone who enjoys reading, thinking, and occasionally writing about history, and as someone who also likes numbers, I find that I’m getting a big kick out of contemplating how much history I’ve lived through in 60 years. But I’m not talking about the usual touchstones that come up in connection with my born-between-1950-and-1955 segment of the baby-boom generation: civil defense drills (also known as “duck-and-cover exercises” or, simply, “bomb drills”) in school, Sputnik, hula hoops, the Pill, JFK, the Beatles, Vietnam, drugs, men on the Moon, Woodstock, Watergate, and so on.
I’m talking about, for example, the fact that at 60, I’ve lived through more than one quarter of our country’s history. That may be more a reflection of how young the United States is as a nation than a reflection of how old I am (as a typically self-obsessed boomer?), but I still find it striking. Striking, too, is the fact that 12 out of America’s 44 presidents have served in my lifetime, born as I was during Harry Truman’s second term.
And who was vice president during Harry Truman’s second term? I’ve asked this question of many people who were born when Truman was president, and only a couple have been able to come up with the name Alben W. Barkley. The names of all 12 vice presidents who followed him will be familiar to most native-born Americans my age, but Alben W. Barkley seems to belong to another historical era altogether. Americans over the age of 70 may not feel this way, but I think most of my chronological peers do.
I was born on May 22, 1952. There were 48 states in the Union. The end of the Korean War was still 14 months away. Senator Joseph McCarthy was nearing the zenith of his malignant influence; not until the Army–McCarthy hearings in the spring of 1954 did he cement his image as a bully, a liar, and a fomenter of hysteria. Elizabeth II was Queen: some things never change.
The Queen notwithstanding, the world in 1952 was in many respects a stunningly different place than it is in 2012. The population of the world is estimated to have been between 2.6 and 2.7 billion; today it is believed to be right around 7 billion. For the U. S., the figures are about 157 million in 1952 and just about double that—approximately 313 million—today.
The political map of the world has changed drastically. The disintegration of the Soviet Union, the reunification of Germany, and the breakup of Yugoslavia and Czechoslovakia occurred within recent memory. Less fresh in our minds are the political changes in Africa, which largely took place in the 1950s and ’60s. In 1952, there were only five sovereign states in Africa (Ethiopia, Liberia, Egypt, South Africa, and Libya); today there are 54. In Asia, the People’s Republic of China was not yet three years old, and India, Pakistan, Indonesia, the Philippines, Israel, and Jordan were all newly independent states.
Joseph Stalin was Premier of the Soviet Union; he would die in March 1953. As of my birthdate, the Soviet Union was the only nation besides the United States that had detonated a nuclear weapon. Britain joined them in October of that year. A few weeks later, on November 1, 1952, the United States successfully tested the first hydrogen bomb.
When I began to research this essay, I knew that a handful of confirmed Civil War veterans were still alive in 1952; the last survivor was Union veteran Albert H. Woolson, who died on August 2, 1956, at what was then believed to be the age of 109 (it now appears more likely that he was a mere 106). (Readers may recall that the last surviving veteran of World War I, Florence Green of England, died on February 4 of this year at the age of 110.) I was surprised to learn that when I was born, John Dewey, George Santayana, Pop Warner, and Henri Matisse were alive. Names from another geological era! Perhaps I’m betraying my ignorance here, but I was genuinely surprised. (Dewey, it is true, died when I was 10 days old. He was 92.) And who knew that Laura Ingalls Wilder, Jean Sibelius, and W. C. Handy were still around in 1952?
As a baseball fan, I derive pleasure from the fact that my stint on the planet’s roster overlapped with that of Cy Young, Honus Wagner, Connie Mack, and Napoleon Lajoie. When I was born, the 16 major league teams included the Boston Braves, St. Louis Browns, Philadelphia Athletics, Brooklyn Dodgers, New York Giants, and Washington Senators. There were no major league teams on the west coast—in fact, none west of St. Louis, which makes it sound as if we’re talking about the era of Lewis and Clark, except that there were no professional baseball leagues in 1804. Even though Jackie Robinson had been playing for the Dodgers since 1947, only 5 other teams—the Indians, Browns, Giants, Braves, and White Sox—had employed an African-American player by the end of the 1952 season. That means 10 teams—the Athletics, Cubs, Pirates, Cardinals, Reds, Senators, Yankees, Phillies, Tigers, and Red Sox—had not. The last three did not integrate until the late ’50s, a full decade or more after the Dodgers.
In the market for a car in 1952? You could buy a DeSoto, Studebaker, Packard, Nash, Hudson, Kaiser, or Sunbeam, in addition to more familiar makes. In the market for a computer? The Pentagon got its first UNIVAC in June 1952, and in November of that year CBS used the fifth UNIVAC ever built (it was actually owned by the Atomic Energy Commission) to correctly predict the result of the 1952 presidential election. Given that UNIVACs weighed 29,000 pounds, they weren’t showing up on people’s desktops.
If few American families were seriously contemplating buying a computer in 1952—the cost was as prohibitive as the space requirements—most either owned, or dearly wanted to own, a television set. According to one estimate, 34% of American homes had a TV in 1952. Within five years, the figure was 79%, and by 1962 it was 90%. A small number of color televisions had been sold to the public in 1951, but they were quickly taken off the market, and for all practical purposes color TVs were not made available until 1954. Not until the early ’70s, however, did the number of color televisions sold annually in the U. S. move ahead of sales of black-and-white TVs.
In 1952, there were four television networks in the United States: CBS, NBC, ABC, and the now barely remembered Du Mont network. Du Mont shows included Captain Video and His Video Rangers, Bishop Fulton J. Sheen’s Life Is Worth Living, Cavalcade of Stars (whose host, Jackie Gleason, debuted “The Honeymooners” as a brief skit on the October 5, 1951 broadcast), The Morey Amsterdam Show, The Arthur Murray Party, and The Ernie Kovacs Show. Inventor Allen B. Du Mont had founded the network in 1946 as a way to sell television sets manufactured by his company, Du Mont Laboratories; the Du Mont network folded in 1956. I can’t say for certain that I recall watching any Du Mont shows—reruns of The Morey Amsterdam Show, maybe—but I’m pretty sure we owned a Du Mont set at one time.
Network television was less than a decade old in the U. S. when I was born, but it was already a well-entrenched phenomenon, a familiar part of American life, even if only a third of American families owned a TV. Rock ’n’ roll, on the other hand, was not a familiar part of American life in 1952; in fact, the question of which is older, rock ’n’ roll or me, does not have a simple answer.
If you ask people to name records from the early days of rock ’n’ roll, the ones most commonly mentioned are songs that were recorded and released after May 1952: “(We’re Gonna) Rock Around the Clock” by Bill Haley and His Comets (1954); Elvis Presley’s “That’s All Right” (1954), “Heartbreak Hotel” (1956), and “Hound Dog” (1956); Bo Diddley’s “Bo Diddley” (1955); Fats Domino’s “Ain’t That a Shame” (1955); Chuck Berry’s “Maybellene” (1955); Little Richard’s “Tutti Frutti” (1955); and Carl Perkins’s “Blue Suede Shoes” (recorded in 1955 and released in 1956; Elvis’s cover version was released a few months later). Respondents with a deeper knowledge of music history are likely to cite, among others, Big Mama Thornton’s original version of “Hound Dog” (recorded in August 1952 and released in 1953); Big Joe Turner’s “Shake, Rattle, and Roll” (1954); LaVern Baker’s “Tweedle Dee” (1954); and Ray Charles’s “I’ve Got a Woman” (1954).
But there’s more to the story. My invaluable guide to this murky territory has been a 1992 book, What Was the First Rock ’n’ Roll Record?, by Jim Dawson and Steve Propes. It’s out of print, which is a real shame. Dawson and Propes argue convincingly that, as musician/archivist Billy Vera says in his foreword to the book, “rock ’n’ roll was an evolutionary process—we just looked around and it was here, to paraphrase Dion. To name any one record as the first would make any of us look like a fool.” So the authors are not really trying to answer the unanswerable question posed by their title. What they do instead is give us 50 great little essays on 50 milestone records that made important contributions to the whole evolutionary process. All the records are from the years 1944 to 1956. Obviously, one could go back to the Edisonian beginnings of record-making—and before—in an unending search for influences, but the authors make a good case for starting in 1944.
The bottom line, at least for my purposes, is that 30 of the 50 songs chosen by Dawson and Propes were recorded and released prior to my May 22, 1952 birthdate. Does that automatically make me younger than rock ’n’ roll? Not in and of itself. But if you go by two firmly established (in my mind) philosophical criteria—to wit: 1) I may not be able to define it, but I know it when I see/hear/taste/smell/feel it; and 2) if it looks like a duck, and it quacks like a duck, it’s probably a duck—well, then I think it’s incontestable that rock ’n’ roll is older than I am.
The evidence? Go to YouTube and check out the following records: “Good Rockin’ Tonight” by Wynonie Harris and His All Stars (recorded in 1947 and released in 1948); “Saturday Night Fish Fry” by Louis Jordan and His Tympany Five (1949); “The Fat Man” by Fats Domino (recorded in 1949 and released in 1950); “Lawdy Miss Clawdy” by Lloyd Price (recorded in March 1952 and released the following month); and, especially, these two: “Rocket 88” (written Rocket “88” on the label) by Jackie Brenston and his Delta Cats, released by Chess Records of Chicago in April 1951, and “Rock the Joint” by Bill Haley and the Saddlemen (released in March 1952), who would later change their name to Bill Haley and His Comets and have a much bigger hit with a similar song. (A note on “Rocket 88”: sax player Brenston happened to do the lead vocals and was consequently credited as the ostensible bandleader, but the band was in fact Ike Turner’s Kings of Rhythm; years before he met Tina, Ike was already a pivotal figure in what came to be called rock ’n’ roll.) On the basis of the aforementioned waterfowl-inspired criterion #2, I’d say that “Good Rockin’ Tonight,” “Saturday Night Fish Fry,” “The Fat Man,” and “Lawdy Miss Clawdy” can stake a reasonable but admittedly arguable claim to being ducks; as for “Rocket 88” and “Rock the Joint”—well, if they ain’t ducks, the whole notion of duckhood needs to be redefined.
With that question settled—and I know you’re relieved that we finally did settle it—we can move on to three areas of even greater importance than rock ’n’ roll: health and medicine; race and civil rights; and the role of women in society. For each of these subjects, there’s a basic narrative that educated Americans are familiar with, and I’m not going to rehash those narratives in any detail here. I simply want to note some of the things that I am most amazed by as I look back and think about how the world has changed over these last 60 years.
In the U. S., 1952 was the peak year of the polio epidemic of the 1940s and ’50s—almost 58,000 cases were reported that year. That may not seem like a lot of cases in a nation of 157 million people, but anyone who is right around my age and who was born in the U. S. has heard stories about the hysterical—and probably disproportionate—fear that washed over the country from roughly 1946 to 1955. Then, on April 12, 1955, the results of the successful field trials of the Salk vaccine were made public. Within a few years, polio was a rarity in the U. S.
All of that is well known, but too easily forgotten. Today, looking back at that period, I’m struck by the fact that when I and all my childhood and college friends were born, and when my sister Donna was born (on February 27, 1955), our parents were understandably terrified that their kids might contract polio. Donna and I didn’t, and none of our close friends did either. But by the time my brother Rich was born on March 4, 1959, polio was practically a non-issue in this country. Some of those who did contract polio in the ’40s and ’50s—in the U. S. and elsewhere—include Alan Alda, Francis Ford Coppola, Judy Collins, Jack Nicklaus, Wilma Rudolph, Mitch McConnell, Joni Mitchell, Mia Farrow, Itzhak Perlman, Neil Young, and Donovan.
At the time of my birth, there had never been a successful transplant of an internal organ from one human being to another. A doctor in what is now the Czech Republic had performed the first successful human corneal transplant in 1905, but nothing comparable had been accomplished with kidneys, livers, hearts, lungs, intestines, the pancreas, or the thymus. On December 23, 1954, however, Joseph Murray, J. Hartwell Harrison, and their team at Boston’s Peter Bent Brigham Hospital (now Brigham and Women’s Hospital) successfully transplanted a kidney from one identical twin to another, and a new era in surgery was inaugurated. Doctor Murray shared the Nobel Prize in Physiology or Medicine in 1990 for this achievement; I am happy to report that he lives quite close to where I work and is doing well at 93.
I think one of the hardest things for Americans under the age of 40 to understand about the America of 60 years ago is the extent to which racial segregation and discrimination prevailed. On July 26, 1948—more than 15 months after Jackie Robinson’s first big league game—President Harry Truman signed Executive Order 9981, which was designed to end racial segregation in the armed forces. (The reality, though, was that all-black Army units were still in existence as late as 1954.) So the first two landmark events of the mid-twentieth-century civil rights era in America—Robinson’s debut and Executive Order 9981—happened before I was born.
Yet as of May 1952, across a huge geographical region of the U. S.—stretching all the way from the Delaware beaches to west Texas—public schools and public transportation; hotels, motels, restaurants and bars; movie theaters, sporting events, and concerts; and public swimming pools, bathrooms, and water fountains were segregated by race. This was true not only in the 11 states that had once formed the Confederacy; segregation was also a common practice in border states like Delaware, Maryland, West Virginia, Kentucky, Missouri, Oklahoma, and Kansas, and in the District of Columbia. Moreover, in the 1940s and into the early ’50s, there were segregated school districts scattered across the southern counties of 5 northern states—Illinois, Indiana, Ohio, Pennsylvania, and New Jersey—that bordered on the so-called border states. That’s a total of 23 states. Certainly, forms of de facto segregation obtained to varying degrees in all of the other 25. And then as now, racial discrimination—whether written into law or not—knew no geographic boundaries. But I’d say that of all the mind-blowingly (boomer adverb) strange things that we see when we look back at the America of 1952, the legally mandated physical separation of the races in the southern and border states might just be the strangest.
Nearly as strange-seeming is the fact that when I was a kid, men everywhere took it for granted that because women were the “weaker” sex physically, they obviously shouldn’t have the same rights as men and should, in most regards, be treated as second-class citizens. It was equally obvious to these men, and to the boys who quite naturally were influenced by them, that those early “women’s libbers” of the mid-to-late ’60s were a bunch of bra-burning, man-hating, deeply deluded and deeply frustrated harpies.
When I was, say, 15, this point of view struck me as completely legitimate. Within about 10 or 12 years, though, I had become convinced that feminists had a darn good point, and that the sooner the world changed so that the full equality of women with men was universally recognized—and people tried to live their lives in conformity with that recognition—the better. For me, and for tens of millions of American men, arguments that had once seemed preposterously radical now made perfect sense.
We all know—at least we should know—that a vast amount remains to be done in the areas of racial equality and women’s equality here in the U. S., but a lot of movement in the right direction has occurred in my lifetime. With some other issues—income inequality; the rights of children, of gay and transgender people, and of the poor, the disabled, the elderly, and the incarcerated; hate crimes and bullying; global climate change; the depletion of the world’s fisheries; our excessive dependence on fossil fuels—it feels like we’re just getting started. But as Rosa Parks taught us, you have to start somewhere.
When I think about ways in which religious life in America has changed over the last 60 years, I’m fascinated by what a mixed bag the changes have been. On the one hand, the United States has shared in the worldwide upsurge in very conservative, even fundamentalist, religious belief and practice. Evangelical Protestantism is spectacularly vibrant; a “charismatic” renewal—incorporating elements of evangelical and Pentecostal faith and practice—is underway in Roman Catholicism; Orthodox Judaism is thriving (especially in its Hasidic varieties); and the term “American Islamists” is by no means an oxymoron.
Accompanying this swing toward conservative religion is the trend whereby many American politicians and other public figures seem to feel compelled to make Perry/Santorum/Tebow/Lin-like declarations regarding their faith. In a more disturbing development, conservative believers from a variety of religions and denominations have spearheaded an assault on Darwinian evolution that, here in 2012—87 years after the Scopes trial, 59 years after Watson and Crick discovered the structure of the DNA molecule, and 43 years after Armstrong and Aldrin first set foot on the Moon—frankly seems bizarre.
Yet we also live in a country in which certain kinds of public displays of Christianity are far less ubiquitous than they were in the 1950s. Nativity scenes on public land—in front of town halls, courthouses, post offices, public schools, etc.—while still permissible under certain circumstances, are nowhere near as common as they once were. We no longer have Bishop Sheen and Billy Graham crusades and Amahl and the Night Visitors on prime-time network television. As American Catholics are no longer expected to abstain from meat on Fridays (except during Lent and on Good Friday), public school cafeterias in heavily Catholic areas are now less likely to be serving fish for lunch on that day, and there are fewer Friday fish specials at restaurants (except in Wisconsin, where the tradition is apparently alive and well) and Friday evening parish-sponsored fish fries.
Speaking as a (nonobservant) Jew, I have to say that Christianity is, on the whole, less obtrusive in public life nowadays than when I was a child—except when certain high-profile politicians, athletes, and other public figures open their mouths. I would add that as someone who majored in religion at a Methodist university, got a master’s in Old Testament/Hebrew Bible at a divinity school, and married an Irish Catholic, I probably have a higher tolerance for public displays of the Christian religion than many American Jews do. What I do object to are the kinds of displays that strike me as excessive, unnecessary, over-the-top—and no, I don’t have a problem with Friday night fish fries, or Saturday night fish fries for that matter.
Some additional comments about religion: When I was born, only a small number of Protestant denominations were ordaining women as ministers. Just one woman had been ordained as an Anglican priest (in 1944, in China), and there were no women rabbis anywhere in the world. Today, among the so-called mainline American Protestant denominations (I’m including the Episcopal Church here, though it has dropped the adjective “Protestant” in certain contexts while retaining it in others), and among all wings of Judaism except Orthodoxy, the ordination of women is a thoroughly noncontroversial issue. I don’t expect that the Roman Catholic Church or the Eastern Orthodox churches will be ordaining women to the priesthood in my lifetime, though. Nor do I expect to see any of the major branches of Orthodox Judaism ordain women as rabbis.
Mormonism presents an interesting case. The Church of Jesus Christ of Latter-day Saints does not have a professional clergy that is even remotely equivalent to the Roman Catholic, Anglican, and Eastern Orthodox priesthoods, the Protestant ministry, and the Jewish rabbinate. But Mormons do have a multi-tiered priesthood (defined by the church as “the authority and power that God gives to man to act in all things for the salvation of man”) that is open to “every faithful, worthy” male church member, age 12 and up. This was not always so: until 1978, the priesthood was not open to males of black African descent. In June of that year, however, the assembled leaders of the church received a revelation that the time had come to admit black males to the priesthood. Thirty-four years later, LDS women are still excluded from the priesthood, but the way in which the mechanism of revelation functions in the Mormon church leaves wide open the possibility that women will someday join their male brethren in being eligible for ordination. My guess is that this may well happen in my lifetime.
The Mormon church was founded in 1830, which means that I’ve lived through almost one third of Mormonism’s history. In 1952, there were about 1.2 million Mormons in the world; most of them lived in the western U. S., predominantly in a north–south corridor that ran from southern Idaho, down through the entire state of Utah, and on into northern Arizona. Today, the Utah-based Church of Jesus Christ of Latter-day Saints claims over 14 million members worldwide, and scholars of religion commonly (and justifiably) refer to Mormonism as a world religion. According to the U. S. Religious Landscape Survey of the Pew Forum on Religion & Public Life (based on data from 2007), the percentage of Mormons in America (1.7% of the total population) is now roughly equivalent to the percentage of Jews (also 1.7%), and far exceeds the percentage of either Eastern Orthodox Christians, Buddhists, Muslims, Hindus, or Jehovah’s Witnesses.
I’d like to close this essay with a grab bag of facts and observations that I was unable to shoehorn into previous paragraphs. I am mindful that I have occasionally juxtaposed the serious with the frivolous, but I’m simply trying to make an honest accounting of the kinds of things, both big and trivial, that jump out at me as I reflect on 60 years of history. Here goes:
In 1952, the total number of teams in the four major North American professional sports leagues (major league baseball, the NFL, NBA, and NHL) was 44. By 1960, the first year that I began paying close attention to pro sports, the number was 51. The addition of the new 8-team American Football League and the expansion Dallas Cowboys of the NFL made for a total of 21 pro football teams, up from 12 in 1952; in the same period, the NBA shrank from 10 teams to 8. Today, there are 122 teams, and that’s not counting Major League Soccer, the Canadian Football League, or any of the women’s pro leagues.
In 1952, very few football players at any level had facemasks on their helmets, and not one NHL goaltender wore a goalie mask. Today, the powers-that-be at every level of amateur and professional football and ice hockey—and not just those two sports—are grappling in an unprecedented way with the issue of traumatic head injuries. Both football and hockey will see dramatic rule and equipment changes in the coming years, and the culture of both sports will have to be radically reformed if they are to survive into the 22nd century. I’m reasonably optimistic that this will happen. Boxing, on the other hand, should have been consigned decades ago to the scrap heap of permanently discontinued human endeavors.
When I was a kid in Brooklyn, fireflies—we called them lightning bugs—were a predictable and endearing feature of summer evenings. During the days, you’d see an occasional praying mantis. Now, I live just outside of Boston, and about the only time I see fireflies is when I go to Martha’s Vineyard in June. I don’t remember the last time I saw a praying mantis.
Today, with single-newspaper American cities increasingly becoming the norm, New York City, which has The New York Times, The Wall Street Journal, the Daily News, the New York Post, Newsday, the Staten Island Advance, and the weekly Village Voice, exists in a kind of time warp. But when I was growing up we also had the New York Herald Tribune, the New York Journal-American, the New York World-Telegram and Sun, the New York Daily Mirror, the Long Island Daily Press, and, briefly, the resurrected Brooklyn Eagle. Perhaps we didn’t need so many newspapers, but it was wonderful to have such a range of choices.
Centenarians were exceedingly rare 60 years ago. Nowadays, most of us can name someone who lived to be 100: if not a relative or friend, then someone famous, like George Burns, Bob Hope (Dolores Hope too), Elizabeth the Queen Mother, Rose Kennedy, Irving Berlin, Strom Thurmond, Alf Landon, Madame Chiang Kai-shek, or Grandma Moses. Population statistics involving centenarians are notoriously unreliable for a variety of reasons, but let me cite some statistics anyway. One source has estimated that there were some 2,300 centenarians in the United States in 1950. The U. S. Census Bureau now estimates that there are over 70,000 centenarians living in this country. So over a period of time in which the nation’s population has doubled, the number of centenarians has increased thirtyfold, more or less. (I personally am not counting on having four more birthdays ending in zero.)
It has been estimated that roughly half, perhaps more than half, of all American adults were smokers in 1950. The most recent estimate by the Centers for Disease Control and Prevention is that 19.3% of all adults (age 18 or older) in the U. S. smoke cigarettes. The drastic reduction in the number of smokers comes as no surprise. But I’m amazed at how very few pipe smokers—smokers of tobacco, that is—one sees today. The glories of the Internet notwithstanding, I was unable to find any statistics on pipe smoking in the United States. But when was the last time you saw someone smoking a pipe? In the 1950s, pipe smokers were not exactly clogging the sidewalks of New York, but one could hardly have called them a rarity. My Uncle Max smoked a pipe. Why, 96% of all male college professors . . . OK, I made that up. But you get my point.
Much has been written about the oppressive blandness of the American diet in the 1950s. The generalization is surely a valid one—it’s with good reason that foodies revere Julia Child and Alice Waters—but we sometimes forget about crucial exceptions. In ethnic communities, plenty of home cooks and restaurant chefs scrupulously upheld old-country traditions of flavorful food. Cookbook author and pioneer TV chef James Beard had an enormous following. Gourmet magazine had been around since 1941 and, beginning in 1957, Craig Claiborne transformed the food pages of The New York Times into an influential culinary and cultural resource. Yes, Swanson’s TV Dinners may have temporarily had the upper hand, but the August 1961 publication of Julia Child, Simone Beck, and Louisette Bertholle’s Mastering the Art of French Cooking would begin to change that.
I miss the expertly-made chocolate malteds and chocolate egg creams of my Brooklyn childhood. I miss watching baseball with my grandfather, and hearing him talk about the by-then-departed Brooklyn Dodgers. I miss Shari Lewis and Andy Devine and Paul Winchell. I miss Top 40 radio as it was in its glorious, hormone-soaked heyday. I miss the 9-planet solar system, but I understand why it had to go. Goodnight Pluto.
I do not miss the taste of Tang, Pez, or Lik-M-Aid. I do not miss silent prayer in school (led by our teacher, Miss McHugh, who unquestionably intended for us to pray to a different God than the God I was hearing about from Miss Stern at the local synagogue’s Sunday school). I do not miss Romper Room or Howdy Doody or Bozo. Especially Bozo.
Happy birthday to me, and to everyone else who has turned 60 in 2012, or will do so before year’s end. I hope some of my fellow nouveau sexagenarians will, sooner or later, get to read this essay. Remember, however old and decrepit we get, we’ll always be younger than rock ’n’ roll. To put it another way: before there was Sputnik, there was “Rocket 88.”