Some Sunday Stuff: July 23rd.
Zoe is 6!
Happy Sunday! It was quite the week in news: Kermit got fired, R. Kelly may be the head of a creepy cult, the Juice is (about to be) loose, and Sean “Spicey” Spicer is out at the White House, disappointing legions of Melissa McCarthy fans (like me).
But in the De Freitas household, the biggest news of the week was Z’s sixth birthday. We’re thankful, amazed at how fast time has flown, and proud of our not-so-little girl. She’s been, and continues to be, a blessing from God.
Photograph by David Burnett / Contact Press Images
Now, time for links. First, this story on the life, afterlife and legacy of Bob Marley by Hua Hsu at The New Yorker, sent to me just this morning by my friend Kiki:
When Bob Marley died, on May 11, 1981, at the age of thirty-six, he did not leave behind a will. He had known that the end was near. Seven months earlier, he had collapsed while jogging in Central Park. Melanoma, which was first diagnosed in 1977 but left largely untreated, had spread throughout his body. According to Danny Sims, Marley’s manager at the time, a doctor at Sloan Kettering said that the singer had “more cancer in him than I’ve seen with a live human being.” As Sims recalled, the doctor estimated that Marley had just a few months to live, and that “he might as well go back out on the road and die there.”
Marley played his final show on September 23, 1980, in Pittsburgh. During the sound check, he sang Queen’s “Another One Bites the Dust” over and over. He asked a close friend to stay near the stage and watch him, in case anything happened. The remaining months of his life were an extended farewell, as he sought treatment, first in Miami and then in New York. Cindy Breakspeare, Marley’s main companion in the mid-seventies, remembered his famed dreadlocks becoming too heavy for his weakened frame. One night, she and a group of women in Marley’s orbit, including his wife, Rita (to whom he had remained married, despite it being years since they were faithful to one another), gathered to light candles, read passages from the Bible, and cut his dreadlocks off.
Drafting a will was probably the last thing on Marley’s mind as his body, which he had carefully maintained with long afternoons of soccer, rapidly broke down. Marley was a Rastafarian, subscribing to a millenarian, Afrocentric interpretation of Scripture that took hold in Jamaica in the nineteen-thirties. By conventional Western standards, the Rastafarian movement can seem both uncompromising (it espouses fairly conservative views on gender and requires a strict, all-natural diet) and appealingly lax (it has a communal ethos, which often involves liberal ritual use of marijuana). For Marley, dealing with his estate probably signified a surrender to the forces of Babylon, the metaphorical site of oppression and Western materialism that Rastas hope to escape. When he died, in Miami, his final words to his son Stephen were “Money can’t buy life.”
“This will business is a big insult,” Marley’s mother, Cedella Booker, told a Washington Post reporter in 1991, as his estate navigated its latest set of legal challenges. “God never limit nobody! Jah never make no will!” Neville Garrick, a close friend who designed many of Marley’s album covers, mused in the 2012 documentary “Marley” that it may have been the singer’s final test, one in which “everybody reveal who they really were, you get me? Who really did love him, who fighting over the money.” It would have been out of character for Marley to neatly divvy up his property. “Bob left it open.”
No one metric captures the scale of Bob Marley’s legend except, perhaps, the impressive range of items adorned with his likeness. There are T-shirts, hats, posters, tapestries, skateboard decks, headphones, speakers, turntables, bags, watches, pipes, lighters, ashtrays, key chains, backpacks, scented candles, room mist, soap, hand cream, lip balm, body wash, coffee, dietary-supplement drinks, and cannabis (whole flower, as well as oil) that bear some official relationship with the Marley estate. There are also lava lamps, iPhone cases, mouse pads, and fragrances that do not. In 2016, Forbes calculated that Marley’s estate brought in twenty-one million dollars, making him the year’s sixth-highest-earning “dead celebrity,” and unauthorized sales of Marley music and merchandise have been estimated to generate more than half a billion dollars a year, though the estate disputes this.
Inevitably, the contention over the estate mirrors the larger struggle over the legacy—over the meanings of Marley. The accounting of merchandise and money might feel like a distortion of Marley’s legacy, of his capacity to take the lives of those who suffered and struggled and turn them into poetry. But the range of Marley paraphernalia also illustrates the nature of his appeal. He became a way of seeing the world. Although he adhered to an ordered, religious belief system for most of his life, praising Jah, the Rastafarian name for God, whenever he could, he came to embody an alternative to orthodoxy. His lyrics lent themselves to a kind of universalist reading of exodus and liberation. He was one of the first pop stars who could be converted into a life style. Bob left that open, too.
Broad Street, Newark, 1940s (Image Source)
Read the rest here. On Friday, I wrote about the fiftieth anniversary of the Newark Riots. Now I’d like to turn to an earlier period in the city’s history, the ’30s and ’40s of writer Philip Roth’s Newark. From City Journal, Steven Malanga writes:
In March 1969, 19 months after riots had devastated Newark, Philip Roth took to the New York Times with a plea to save the city’s library. In a budget-cutting move, the Newark City Council had proposed slashing funding for the library’s main branch and the adjoining Newark Museum, two notable landmarks constructed during more prosperous times in the early twentieth century. “When I was growing up in Newark,” wrote Roth, “we assumed that books in the public library belonged to the public. Since my family did not own many books, or have the money for a child to buy them, it was good to know that solely by virtue of my municipal citizenship I had access to any book I wanted from that grandly austere building downtown on Washington Street.” Soon after Roth’s appeal, city council members found the money to keep the library and museum operating, most likely because they never really intended to shutter two of the city’s signature institutions. The threat was likely a way to dramatize Newark’s accelerating decline.
Nearly five decades later, Newark is described as a failed city, “a classic example of urban disaster,” “the worst American city,” and “America’s most violent city.” One of Roth’s own characters, Swede Levov, from his 1997 novel American Pastoral, calls Newark a place that once “manufactured everything” but turned into “the car theft capital of the world.” Still, Roth, who left Newark in 1951 for college and a writing career that includes a novel titled Letting Go, has been unable to let go of the city. Last October, he announced that he was bequeathing his collection of 4,000 books to the Newark Library—recognition of the role that Newark played not only in his life but also in his work.
The city, as well as characters from it, has figured in more than half of Roth’s 27 published novels, in his two autobiographies, and in numerous essays and interviews. Roth’s Newark stretches from his first book-length work, 1959’s Goodbye Columbus, with its protagonist working at the Newark Library, to his most recent (and final, he says) book, Nemesis, a striking 2010 novel about a polio epidemic’s impact on his Weequahic neighborhood during the 1940s. There may be no more expansive portrait of an American city in literature by a single author. And because Roth grew up during the mid-twentieth century, his Newark tales also offer a chronicle of something that has largely vanished from America: a flourishing blue-collar urban environment, where immigrants pursue the American dream for themselves and, especially, for their children. The literary critic Leslie Fiedler, who also grew up in Newark, described the city as an ordinary place even in its best years, not magical like Paris or New York and “not even a joke like Peoria.” Yet in Roth’s hands, the city’s working-class neighborhoods become unforgettable landmarks. “Sitting there in the park, I felt a deep knowledge of Newark,” librarian Neil Klugman observes in Goodbye Columbus, “an attachment so rooted that it could not but branch out into affection.” Unless you grew up in Newark, as I did, it might be impossible to imagine that city. It still exists, however, in Roth’s words.
Downtown Newark, 1939 (Image Source)
Newark’s first established Jewish community comprised mid-nineteenth-century German immigrants. The Jewish neighborhoods that Roth knew in the 1930s and 1940s, though, were peopled by the great wave of Eastern European immigrants who arrived from the early 1880s through the mid-1920s—some 2 million Jews, largely from Russia and Galicia. Perhaps as many as 50,000 of those sojourners made their way to Newark. They lived initially in tenements in and around Prince Street, in the city’s old Third Ward. Among the residents: Roth’s paternal grandfather, who worked in the tanneries to put food on the table for seven kids, including Roth’s father, Herman. Then, as the immigrants’ children assimilated, they gradually moved west, forming a lower-middle-class ethnic stronghold at Newark’s edge—some 58 streets of apartment buildings, small houses, and bustling shopping thoroughfares known as Weequahic, an Indian phrase meaning “head of the cove.”
To the contemporary social scientist, indoctrinated in the virtues of diversity, a city of thick, somewhat insular, ethnic neighborhoods like Weequahic must seem like the antithesis of true community. But to the immigrants of Newark and their descendants, those neighborhoods offered a safe harbor, a place where they could grow up surrounded by familiar faces and with shared mores, even as they began to assimilate. Roth commonly describes Weequahic and Newark as a haven or an enclave. In his 1988 nonfiction book, The Facts: A Novelist’s Autobiography, he writes: “Our lower-middle-class neighborhood of houses and shops—a few square miles of tree-lined streets at the corner of the city bordering residential Hillside and semi-industrial Irvington—was as safe and peaceful a haven for me as his rural community would have been for an Indiana farm boy.”
Roth later recalls summer nights in Newark as he and his friends wandered his neighborhood: “When the weather was good we’d sometimes wind up back of Chancellor Avenue School, on the wooden bleachers along the sidelines of the asphalt playground adjacent to the big dirt playing field. Stretched out on our backs in the open air, we were as carefree as any kids anywhere in postwar America.” In The Plot Against America, a 2004 novel that imagines how Weequahic’s Jews might have fared if Nazi sympathizer Charles Lindbergh had been elected president in 1940, the young Philip, the book’s narrator, says of his street: “Tinged with the bright after-storm light, Summit Avenue was as agleam with life as a pet, my own silky, pulsating pet. . . . Nothing would ever get me to leave here.”
Some of the major players of “Mad Men” (AMC)
Read the entire essay here. Next, Sonia Saraiya argues at Variety that Mad Men is the last great show of TV’s Golden Age (H/T: Brit):
A decade after its premiere, “Mad Men” is still one of the leading lights of prestige television, a hall-of-fame show that put together an array of once-in-a-lifetime performances. On a more personal note, it’s also one of my favorite dramas; possibly, probably, my favorite drama ever. There are a lot of reasons why: Its historicity, its emphasis on New York City, its exploration of gender, its reckoning with the underpinnings of advertising. I’ve written about the show in appreciation, and so have many others.
Looking back on it 10 years later, what’s surprising is not its quality or the widespread nature of its phenomenon, but how the firmament of television around it has changed so much. The series was an inflection point, of sorts, in the contemporary history of the medium. Everything before it was one thing; everything after, another. When “Mad Men” aired its series finale in 2015, it ended not just its window into the tumultuous cultural history of America of the ‘60s but also a specific moment in television history.
You might call that moment the Golden Age of Television (though there already was a Golden Age, in the late ‘40s and ‘50s, so append “second” or “so-called” as you see fit; I like Alan Sepinwall’s term, a “big bang” of TV, where innovation and engagement expanded in seemingly every direction). Regardless of term, what I’m referring to is that much-discussed period when a combination of factors, many of them simply technological, led to several brilliant and well-made shows that altered the way audiences watched and thought about television. These were complicated and challenging dramas, often with material that had never before been depicted on television, which both moved away from the idea of TV as appealing to everyone and found success by pushing the envelope.
I’m not going to do a better job of recounting the history of television’s phase shift than Sepinwall, who has written a whole book on the topic. But let’s establish a few basic points. Even if “Oz” (1997) paved the way for David Chase’s show, “The Sopranos” (1999), it is generally agreed, is the show that kicked off what we know as the Golden Age. “Breaking Bad” — which started six months after “Mad Men” and ended two years earlier than “Person to Person” — marks the last show in the medium’s sudden transformation. Because “Mad Men” ended later — and because “Mad Men” ended, it felt, multiple times, with a protracted two-part goodbye where every scene felt like closure — it has the privilege and the curse of being the one to turn off the lights. At the point where “Mad Men’s” finale was slowly unfolding, it seemed bizarrely out of pace with the TV boom it had helped to spawn. While we were feverishly livetweeting it, the show seemed to move even more slowly, with a pooling energy that, looking back, is a close analogue to the bravura nothingness David Lynch displays in “Twin Peaks: The Return.” (Though the shows otherwise don’t have a lot in common, showrunner Matthew Weiner certainly has Lynch’s auteur aversion to spoilers, too.)
“Mad Men” has had a long and lingering aftertaste. Several of its stars have done little else since the show ended, basking in post-finale glow, and yet the drama is already celebrating its 10th anniversary. But in its quiet aftermath — for while some shows end with a bang, “Mad Men” ended with cosmic infinity — it took something with it, and that’s what prompts me to call it the last great drama of that era. When “Mad Men” debuted it was still astonishing that dramas would make bad men their lead characters. By the time it ended, critics and audiences were lamenting the glut of antihero stories. “Breaking Bad” was a brilliant show, of course. “Mad Men,” with its finale, made shows like “Breaking Bad” seem obsolete — made nearly any show about an antihero seem obsolete. Don Draper was the last antihero, and unlike several of his predecessors who either died brutally or left the careers that rewarded their twisted souls, he somehow found his way to inner peace in time to (probably) go back to McCann Erickson and write what creator Matthew Weiner calls “the best ad ever made.” “Mad Men” was a remarkable show in so many ways — a deeply stirring show, in so many ways — but perhaps its highest achievement is that because it so thoroughly interrogated its characters and its premise, it functionally (and politely) made itself obsolete.
Read the entire post here. And finally, this hilarious (to me) story of how Heinz trolled Chicago by Adam Moussa at Eater:
Chicagoans are famous for their love of hot dogs. Equally famous is their disdain for ketchup on said dogs. That doesn’t sit well with Big Condiment, however, so this week Heinz dropped a new product on the market: Chicago Dog Sauce. The twist? The new offering is literally just relabeled ketchup.
It might have been a simple PR stunt, but Eater Chicago’s own senior editor pushed back hard. In his words, Chicago would never deign to “ruin the Mona Lisa with cheap red lipstick.”
Chester Bennington of Linkin Park (Image Source)
Today’s music pick is from Linkin Park, who lost lead singer Chester Bennington a few days ago. Linkin Park helped make up the soundtrack to my college years in the early aughts, and reading the news of Bennington’s suicide at age 41 threw me for a loop. Prayers up for his wife, kids, bandmates, friends and family who are in mourning, and may Light perpetual shine upon his soul.