Today is my close friend Randy’s birthday. I don’t think I even have the words to describe how cool he is. He’s like Manhattan cool, and not just because he worked for years as a chef at a glitzy eatery in the Big Apple. He’s just always been cool.
We met on the first day of classes at the now defunct Victory Christian Academy in 1997. September 4th, 1997, to be exact. It was also the day I met GiGi and Candace. I giggle aloud at the fact that I am still blessed to have them in my life today.
Throw no shade at Randy’s soul patch or my dangerously thin eyebrows. It was 2001. Leave us be.
So in the spirit of all things Randy, here are a few links to some highbrow articles.
It’s true that people are less and less obsessed with the past, maybe especially (but not only) in America. We have no historical sense, a sense which could give us a sense of place, purpose, and limits, as well as the chastening that comes with reflection on experience. That’s both good and bad. But surely it’s good not to be saddled with too much traditional baggage and ancient grudges.
We’re obsessed with the future for a variety of reasons. For one, we think the future is actually in our own techno-hands. We used to think “climate change” was up to nature, and now we think it’s up to us. That means we might have trashed the planet beyond repair. It might also mean we can save ourselves from our own mess if we act aggressively now. It might also mean that we’re developing the techno-means to bring the climate change—which used to be caused by chance and necessity—under our personal control. That would mean no more natural climate change—such as an inconvenient ice age—that we can’t believe in.
We’re also obsessed with our personal futures. That’s partly because we don’t often believe that our personal beings are sustained by the grace of a personal God. We believe, with the Epicureans, that biological death is the end of ME. But we’re less accepting of that fact. We believe that death is more in our control, because we know so much about risk factors we can avoid. And there is, of course, no one who obsesses more about his personal future than he who has the transhumanist hope for the Singularity and all that.
The trouble with being obsessed with the future of those not yet born or our species or all life on the planet is that we sacrifice the interests and enjoyments of people today to a rather speculative cause. Communism, everyone agrees, was evil because it justified terrorizing and slaughtering individuals we can actually see in the service of an imagined future paradise. Our obsessive futurology, arguably, is evidence that we haven’t yet learned enough from the monstrous failure of the ideologies of the 20th century.
There is, of course, a difference between obsessing about oneself and subordinating oneself to the interests of people or ecosystems or whatever not yet born. But, in both cases, we’re diverting ourselves from what we really know about ourselves and our limited future. Surely the authors are right that we shouldn’t let the future divert us from what is.
But there is something wrong, after all, about being Epicurean. Thomas Jefferson, for example, wrote in his most private letters that he was an Epicurean. That means, in a way, his teaching about natural rights was public and provisional, as was the depth of his concern about protecting the rights of others. Jefferson wrote eloquently about the violence slavery did to particular persons, but he wasn’t as obsessive as we might have hoped about keeping slavery from having much of a future in his country. According to Jefferson himself, what’s wrong with Epicurean thinking is that it produces a selfish negation of the natural instinct we all have as beneficent social and relational beings.
So the least we can say is there’s a middle point between the indifference of Epicurean serenity and ideological obsession when it comes to controlling the future. Our concern for past and future is properly relational. It begins with our parents and our children and extends, if more weakly, to our ancestors and to the future of the children of our children. It also begins with everyone we know and love in our particular place. It might have begun with Jefferson’s concern for his own slaves, whom he knew well and even loved. It did begin (even if it didn’t go far enough), Jefferson wrote, with the effects of tyranny on the souls of his own children.
Thinking of the person in isolation and thinking of the abstraction humanity (or the abstraction species) are both pathologies of our very personal and excessively unrelational time.
What makes you who you are genetically? The easy answer is your family. The longer answer begins with the fact that all humans have two parents (at least for now), and usually four distinct grandparents (there are unfortunate exceptions). Genetically you are a recombination of four separate individuals. But that does not mean you have an equal contribution from four separate individuals. Humans normally carry 23 pairs of chromosomes: 22 autosomal pairs and one pair of sex chromosomes, either two copies of the X for a female or an X and a Y in the case of males. By Mendel’s law of segregation you receive one copy of each pair from your mother (via the egg), and one copy from your father (via the sperm). This means exactly half of your genome derives from each parent.
Things begin to get more complicated going back two generations. One might think that of the 44 autosomal chromosomes you would receive 11 from each of the four grandparents. (For simplicity we’ll leave the sex chromosomes out for now. If you are a female, you receive one X from each parent, while if you are a male you receive an X from your mother and a Y from your father, who got it from his father.) But while the proportion of one’s inheritance from parents is fixed by exact necessity, the fraction from grandparents is governed by chance. For each of the 22 chromosomes you inherit from a given parent, you have a 50 percent chance of gaining a copy from your grandfather and a 50 percent chance of gaining a copy from your grandmother. Because those 22 draws are independent, there is only a (1/2)^22 chance (about 1 in 4.2 million) that all of your maternal or paternal chromosomes came from just one particular grandparent! What’s more, genetic recombination means that chromosomes aren’t purely from one grandparent or the other; during the cell divisions that produce sperm and eggs, chromosomes exchange segments and become hybrids. You almost certainly have different genetic contributions from your four grandparents.
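The coin-flip arithmetic here is easy to check with a short simulation. This is a hypothetical sketch, not from the linked article, and it ignores recombination: each of the 22 chromosomes inherited from one parent is modeled as a fair coin flip between that parent’s mother and father. For a single specified grandparent the probability that every flip lands the same way is (1/2)^22, about 1 in 4.2 million; allowing either grandparent doubles it to about 1 in 2.1 million.

```python
# Monte Carlo sketch (illustrative only, no recombination): how often do all 22
# chromosomes inherited from one parent trace back to a single grandparent?
import random

N_CHROM = 22        # autosomes inherited from one parent
TRIALS = 1_000_000

all_one_grandparent = 0
for _ in range(TRIALS):
    # Bit i records which grandparent contributed chromosome i (0 or 1).
    draws = random.getrandbits(N_CHROM)
    # All zeros or all ones means every chromosome came from one grandparent.
    if draws == 0 or draws == (1 << N_CHROM) - 1:
        all_one_grandparent += 1

exact = 2 * 0.5 ** N_CHROM   # either grandparent: 1 in 2,097,152
print(f"simulated frequency: {all_one_grandparent / TRIALS:.8f}")
print(f"exact probability:   {exact:.8f}  (about 1 in {round(1 / exact):,})")
```

The event is so rare that a million trials will usually record zero hits, which is itself the point: in practice everyone carries some mix from all four grandparents, just not an equal one.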
But this is not just abstract theorizing. Imagine that you could know that 22 percent of the genome of your child derives from your mother, and 28 percent from your father. Also imagine that you know that 23 percent of the genome of your child derives from your partner’s mother, and 27 percent derives from your partner’s father. And you could know exactly how closely your child is related to each of its uncles and aunts. This isn’t imaginary science fiction; it is science fact.
One might think these sorts of facts are useful only for the sake of satisfying curiosity, but sometimes theoretical knowledge can be put to practical use. Last spring my wife asked our pediatrician about testing my daughter for a treatable autosomal dominant condition which I happen to exhibit. The physician’s reaction was straightforwardly paternalistic. She would not authorize the test because she believed our daughter was too young. This did not sit well with my wife. The mutated gene which causes my condition has been well characterized. My wife went home and quickly used one of 23andMe’s features to find out if my daughter inherited a copy of the gene through me from my mother or my father. My mother is affected by the same ailment as I am, while my father is not. The happy ending is that my daughter almost certainly does not have the condition, because she inherited that genetic region from my father. The bigger moral of the story is that decentralized genetic information can allow persistence to pay off.
The writer had a problem. Books he read and people he knew had been warning him that the nation and maybe mankind itself had wandered into a sort of creativity doldrums. Economic growth was slackening. The Internet revolution was less awesome than we had anticipated, and the forward march of innovation, once a cultural constant, had slowed to a crawl. One of the few fields in which we generated lots of novelties — financial engineering — had come back to bite us. And in other departments, we actually seemed to be going backward. You could no longer take a supersonic airliner across the Atlantic, for example, and sending astronauts to the moon had become either fiscally insupportable or just passé.
And yet the troubled writer also knew that there had been, over these same years, fantastic growth in our creativity promoting sector. There were TED talks on how to be a creative person. There were “Innovation Jams” at which IBM employees brainstormed collectively over a global hookup, and “Thinking Out of the Box” desktop sculptures for sale at Sam’s Club. There were creativity consultants you could hire, and cities that had spent billions reworking neighborhoods into arts-friendly districts where rule-bending whimsicality was a thing to be celebrated. If you listened to certain people, creativity was the story of our time, from the halls of MIT to the incubators of Silicon Valley.
The literature on the subject was vast. Its authors included management gurus, forever exhorting us to slay the conventional; urban theorists, with their celebrations of zesty togetherness; pop psychologists, giving the world step-by-step instructions on how to unleash the inner Miles Davis. Most prominent, perhaps, were the science writers, with their endless tales of creative success and their dissection of the brains that made it all possible.
It was to one of these last that our puzzled correspondent now decided to turn. He procured a copy of “Imagine: How Creativity Works,” the 2012 bestseller by the ex-wunderkind Jonah Lehrer, whose résumé includes a Rhodes scholarship, a tour of duty at The New Yorker and two previous books about neuroscience and decision-making. (There was also a scandal concerning some made-up quotes in “Imagine,” but our correspondent was determined to tiptoe around that.) Settling into a hot bath — well known for its power to trigger outside-the-box thoughts — he opened his mind to the young master.
Anecdote after heroic anecdote unfolded, many of them beginning with some variation on Lehrer’s very first phrase: “Procter and Gamble had a problem.” What followed, as creative minds did their nonlinear thing, were epiphanies and solutions. Our correspondent read about the invention of the Swiffer. He learned how Bob Dylan achieved his great breakthrough and wrote that one song of his that they still play on the radio from time to time. He found out that there was a company called 3M that invented masking tape, the Post-it note and other useful items. He read about the cellist Yo-Yo Ma, and about the glories of Pixar.
And that’s when it hit him: He had heard these things before. Each story seemed to develop in an entirely predictable fashion. He suspected that in the Dylan section, Lehrer would talk about “Like a Rolling Stone,” and that’s exactly what happened. When it came to the 3M section, he waited for Lehrer to dwell on the invention of the Post-it note — and there it was.
Had our correspondent developed the gift of foresight? No. He really had heard these stories before. Spend a few moments on Google and you will find that the tale of how Procter & Gamble developed the Swiffer is a staple of marketing literature. Bob Dylan is endlessly cited in discussions of innovation, and you can read about the struggles surrounding the release of “Like a Rolling Stone” in textbooks like “The Fundamentals of Marketing” (2007). As for 3M, the decades-long standing ovation for the company’s creativity can be traced all the way back to “In Search of Excellence” (1982), one of the most influential business books of all time. In fact, 3M’s accidental invention of the Post-it note is such a business-school chestnut that the ignorance of those who don’t know the tale is a joke in the 1997 movie “Romy and Michele’s High School Reunion.”
These realizations took only a millisecond. What our correspondent also understood, sitting there in his basement bathtub, was that the literature of creativity was a genre of surpassing banality. Every book he read seemed to boast the same shopworn anecdotes and the same canonical heroes. If the authors are presenting themselves as experts on innovation, they will tell us about Einstein, Gandhi, Picasso, Dylan, Warhol, the Beatles. If they are celebrating their own innovations, they will compare them to the oft-rejected masterpieces of Impressionism — that ultimate combination of rebellion and placid pastel bullshit that decorates the walls of hotel lobbies from Pittsburgh to Pyongyang.
Those who urge us to “think different,” in other words, almost never do so themselves. Year after year, new installments in this unchanging genre are produced and consumed. Creativity, they all tell us, is too important to be left to the creative. Our prosperity depends on it. And by dint of careful study and the hardest science — by, say, sliding a jazz pianist’s head into an MRI machine — we can crack the code of creativity and unleash its moneymaking power.
Ha ha ha, the irony. Anyway, to my dearest Randy, the best Hungarian I know, happy 31st birthday. Love you lots.