• It is far, far easier for me to catalog the various things I’ve been wrong about:
  • But what about the things we’re all wrong about? What about ideas that are so accepted and internalized that we’re not even in a position to question their fallibility? These are ideas so ingrained in the collective consciousness that it seems foolhardy to even wonder if they’re potentially untrue. Sometimes these seem like questions only a child would ask, since children aren’t paralyzed by the pressures of consensus and common sense. It’s a dissonance that creates the most unavoidable of intellectual paradoxes: When you ask smart people if they believe there are major ideas currently accepted by the culture at large that will eventually be proven false, they will say, “Well, of course. There must be. That phenomenon has been experienced by every generation who’s ever lived, since the dawn of human history.” Yet offer those same people a laundry list of contemporary ideas that might fit that description, and they’ll be tempted to reject them all.
  • I’m more fixated on how life was another four hundred years before that. Here was a period when the best understanding of why objects did not spontaneously float was some version of what Aristotle had argued more than a thousand years prior: He believed all objects craved their “natural place,” and that this place was the geocentric center of the universe, and that the geocentric center of the universe was Earth. In other words, Aristotle believed that a dropped rock fell to the earth because rocks belonged on earth and wanted to be there.
  • Had Newton’s explanation been offered to people in the fourteenth century with no understanding of science—in other words, pretty much everyone alive in the fourteenth century—it would have seemed way, way crazier than what they already believed: Instead of claiming that Earth’s existence defined reality and that there was something essentialist about why rocks acted like rocks, Newton was advocating an invisible, imperceptible force field that somehow anchored the moon in place.
  • Now, there’s certainly a difference between collective, objective wrongness (e.g., misunderstanding gravity for twenty centuries) and collective, subjective wrongness (e.g., not caring about Moby-Dick for seventy-five years). The mechanics of the transitions are completely different.
  • We live in an age where virtually no content is lost and virtually all content is shared. The sheer amount of information about every current idea makes those concepts difficult to contradict, particularly in a framework where public consensus has become the ultimate arbiter of validity. In other words, we’re starting to behave as if we’ve reached the end of human knowledge. And while that notion is undoubtedly false, the sensation of certitude it generates is paralyzing.
  • I would go a step further than Schulz; I suspect most conventionally intelligent people are naïve realists, and I think it might be the defining intellectual quality of this era.
  • But my personal characterization of naïve realism is wider and more insidious. I think it operates as the manifestation of two ingrained beliefs: “When considering any question, I must be rational and logical, to the point of dismissing any unverifiable data as preposterous,” and “When considering any question, I’m going to assume that the information we currently have is all the information that will ever be available.”
  • We have no idea what we don’t know, or what we’ll eventually learn, or what might be true despite our perpetual inability to comprehend what that truth is.
  • Charlie Gillett, a British musicologist best known for writing the first comprehensive history of rock music (1970’s The Sound of the City), somehow managed to outline the fall of the music industry in detail without any possible knowledge of MP3s or file sharing.
  • As far as I can tell, no one in the entire Book of Predictions assumed the friction between the US and Russia could be resolved without the detonation of nuclear weapons.
  • Yet as recently as twenty years ago, this question still mattered; as a college student in the early nineties, I knew of several long-term romantic relationships that were severed simply because the involved parties attended different schools and could not afford to make long-distance calls, even once a week.
  • This brand of retrospective insight presents a rather obvious problem: My argument requires a “successful” futurist to anticipate whatever it is that can’t possibly be anticipated.
  • Klosterman’s Razor: the philosophical belief that the best hypothesis is the one that reflexively accepts its potential wrongness to begin with.
  • people continue to say they listen to “records” and “albums” and (on rare occasions) “LPs” whenever they’re describing any collection of music.
  • it seems impossible that we’ll ever stop using that term, even if the future equivalent of a “book” becomes a packet of granulated data that is mechanically injected directly into the cerebral cortex.
  • What any singular person thought about Moby-Dick in 1851 is as irrelevant as what any singular person thinks about Moby-Dick today. What critics in the nineteenth century were profoundly wrong about was not the experience of reading this novel; what they were wrong about was how that experience would be valued by other people.
  • But it does reflect something telling about the modern criteria for quantifying art: Symmetrical representation sits at the center of the process. It’s an aesthetic priority.
  • The reason something becomes retrospectively significant in a far-flung future is detached from the reason it was significant at the time of its creation—and that’s almost always due to a recalibration of social ideologies that future generations will accept as normative.
  • (it served as a trippy entry point for the notion that we already live in a simulated world, directly quoting philosopher Jean Baudrillard’s 1981 reality-rejecting book Simulacra and Simulation).
  • So think how this might alter the memory of The Matrix: In some protracted reality, film historians will reinvestigate an extremely commercial action movie made by people who (unbeknownst to the audience) would eventually transition from male to female. Suddenly, the symbolism of a universe with two worlds—one false and constructed, the other genuine and hidden—takes on an entirely new meaning. The idea of a character choosing between swallowing a blue pill that allows him to remain a false placeholder and a red pill that forces him to confront who he truly is becomes a much different metaphor. Considered from this speculative vantage point, The Matrix may seem like a breakthrough of a far different kind. It would feel more reflective than entertaining, which is precisely why certain things get remembered while certain others get lost.
  • we’re building something with parts that don’t
  • But there are different possibilities that are harder to parse. There are stranger—yet still plausible—outcomes that require an ability to reject the deceptively sensible. What if the greatest writer of this generation is someone who will die totally unknown? Or—stranger still—what if the greatest writer of this generation is a known figure, but a figure taken seriously by no one alive (including, perhaps, the writer in question)?
  • (“Time is a motherfucker and it’s coming for all of us,” Lethem notes).
  • He represents the Platonic ideal of the tortured genius who dies virtually unknown: He was paralyzed by both a hatred of his own writing and a buried arrogance over his intellectual superiority.
  • Some estimates suggest he burned 90 percent of what he wrote. Yet the 10 percent that survived is the apotheosis of dreamlike fiction, to the point where his surname has become the adjective describing that quality.
  • Kafka is the easiest example of a canonical writer whose life ended in anonymity, and (as Lethem notes) the uniqueness of his trajectory might be too sublime to happen again.
  • The fact that we know that Kafka’s brilliance was not recognized during his time on earth magnifies his existential despair in a way that words alone never could. And we believe his voice can be trusted, because he (seemingly) had no ulterior motive. He was just typing into the abyss.
  • Howard Zinn’s 1980 depiction of how America was built in A People’s History of the United States is no longer a counterbalance to a conventional high school history text; in many cases, it is the text.
  • Competing modes of discourse no longer “compete.” They coexist.
  • So what does that tell us about our Contemporary Kafka? It tells us that Contemporary Kafka will need to be a person so profoundly marginalized that almost no one currently views his or her marginalization as a viable talking point.
  • But here’s where we taste the insecure blood from Klosterman’s Razor: The mere fact that I can imagine this scenario forces me to assume that it won’t happen. It’s a reasonable conclusion to draw from the facts that presently exist, but the future is a teenage crackhead who makes shit up as he goes along. The uncomfortable, omnipresent reality within any conversation about representation is that the most underrepresented subcultures are the ones that don’t even enter into the conversation. They are, by definition, impossible to quantify. They are groups of people whom—right now, in the present tense—it is still acceptable to dislike or discount or ignore. They are groups who are not seen as needing protection or support, which makes them vulnerable to ridicule and attack. Who are they? As already stated in this paragraph, I am in no position to say. If I try, I can only be wrong. Any argument in their favor is an argument against my premise.
  • it’s impossible to generate deep verisimilitude without specificity.
  • A book becomes popular because of its text, but it’s the subtext that makes it live forever.
  • I’m not saying an important book must include one of these ideas, or even an idea that would comfortably fit on this list. But it needs to include something that taps into what matters about the world now. There has to be something at stake that involves modernity.
  • When any novel is rediscovered and culturally elevated, part of the process is creative: The adoptive generation needs to be able to decide for themselves what the deeper theme is, and it needs to be something that wasn’t widely recognized by the preceding generation.
  • The defining 9/11 novel may end up being Infinite Jest, even though it was written five years before the actual event and has very little to do with New York or terrorism or global politics.
  • The reason so many well-considered ideas appear laughable in retrospect is that people involuntarily assume that whatever we believe and prioritize now will continue to be believed and prioritized later, even though that almost never happens.
  • But I’ve been a paid critic for enough years to know my profession regularly overrates many, many things by automatically classifying them as potentially underrated. The two terms have become nonsensically interchangeable.
  • I’m more concerned with the unrated, and particularly things that are unrated on purpose.
  • someone like William T. Vollmann straddles both lines, fortified by his sublime recklessness.
  • The upside to this experience is that the writers become rich enough to write forever, in whatever way they choose. The downside of this experience is that the rest of those writers’ careers are viewed through the prism of their singular super-success.
  • Which brings us to the final tier: the “quietly unrated.” This is the level encompassing the vast majority of American writers. The reality of publishing is that most books just come out. They are written, edited, marketed, and publicized—but nothing else happens.
  • Ask anyone reading Anna Karenina in the present day what they think of the story, and they will often mention how surprisingly contemporary it seems. That would suggest Tolstoy’s world of 1877 is essentially similar to the world of today, and that the only antiquated details are the details that don’t matter.
  • I think the social difference between 2016 and 2155 will be significantly more profound than the social difference between 1877 and 2016,
  • This acceleration is real, and it will be harder and harder for future generations to relate to “old” books in the way they were originally intended.
  • Instead of fitting the present (past) into the future, we will jam the present (future) into the present (past).
  • The metaphysical conception of “rock” cuts such a wide swath that it even includes subgenres that can be applied with equal ubiquity, like punk and metal and (until the mid-nineties) hip-hop. The defining music of the first half of the twentieth century was jazz; the defining music of the second half of the twentieth century was rock, but with an ideology and saturation far more pervasive. Only television surpasses its influence.
  • There was virtually no way a man born in 1920 would (or could) share the same musical taste as his son born in 1955, even if they had identical personalities.
  • “Rock” can now signify anything, so it really signifies nothing; it’s more present, but less essential.
  • By now, it’s almost impossible to create a new rock song that doesn’t vaguely resemble an old rock song. So what we have is a youth-oriented musical genre that (a) isn’t symbolically important, (b) lacks creative potentiality, and (c) has no specific tie to young people. It has completed its historical trajectory.
  • As the timeline moves forward, tangential artists in any genre fade from the collective radar, until only one person remains; the significance of that individual is then exaggerated, until the genre and the person become interchangeable.
  • The Beatles were the first major band to write their own songs, thus making songwriting a prerequisite for credibility;
  • It’s the only major art form where the opinion of a random fourteen-year-old is considered more relevant than the analysis of a sixty-four-year-old scholar.
  • It’s hard to explain how Nirvana’s “Smells Like Teen Spirit” was unable to climb higher than number six on the Billboard Hot 100 chart, despite being viewed (almost from its media inception) as the defining song of its era.
  • Three or four generations from now, the present-day entertainment medium most likely to be “studied” by cultural historians will be television, based on the belief that TV finally became a serious, meaningful art form around the turn of the twenty-first century.
  • As I write this sentence, the social stature of Elvis and Dylan feels similar—perhaps even identical. But it’s entirely possible that one of these people will get dropped as time plods forward. And if that happens, the consequence will be huge. If we concede that the “hero’s journey” is the de facto story through which we understand history, the differences between these two heroes would profoundly alter the description of what rock music supposedly was.
  • Louis Armstrong didn’t sell as many records as Ben Selvin in the 1920s, but he has retained his fame because he’s been championed by critics, historians, and later musicians.
  • Music critics have almost no impact on what music is popular at any given time, but they’re extraordinarily well positioned to dictate what music is reintroduced after its popularity has waned.
  • The likelihood that anyone in the universe will play this record is only slightly greater than the likelihood that my dad will play a Kendrick Lamar album, and my dad is dead.
  • The work has to be good enough to enter the critical conversation, whatever that conversation happens to be. But once something is safely inside the walls of that discussion, the relative merits of its content matter much less.
  • In 1936, a quarterly magazine called The Colophon polled its subscribers (of whom there were roughly two thousand, although who knows how many actually voted) about what contemporary writers they believed would be viewed as canonical at the turn of the twenty-first century. The winner was Sinclair Lewis, who had won the Nobel Prize for literature just five years earlier. Others on the list include Willa Cather, Eugene O’Neill, George Santayana, and Robert Frost. It’s a decent overview of the period. Of course, what’s more fascinating is who was left off: James Joyce, F. Scott Fitzgerald, and Ernest Hemingway.
  • To matter forever, you need to matter to those who don’t care. And if that strikes you as sad, be sad.
  • Even though every concrete signifier suggests my understanding of rock music is airtight and stable, I live my life with an omnipresent sensation of low-level anxiety about all the things I don’t know about music.
  • Take, for example, the childhood question of why the sky is blue. This was another problem tackled by Aristotle. In his systematic essay “On Colors,” Aristotle came up with an explanation for why the sky is blue: He argued that all air is very slightly blue, but that this blueness isn’t perceptible to the human eye unless there are many, many layers of air placed on top of each other (similar, according to his logic, to the way a teaspoon of water looks clear but a deep well of water looks black). Based on nothing beyond his own powers of deduction, it was a genius conclusion. It explains why the sky is blue. But the assumption was totally wrong. The sky is blue because of the way sunlight is scattered by the molecules of the atmosphere. And unlike Aristotle, the person who realized this truth didn’t care why it was true, which allowed him to be right forever. There will never be a new explanation for why the sky is blue.
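  • A gloss the passage itself doesn’t spell out, added here only for reference: the modern explanation is Rayleigh scattering, in which the intensity of sunlight scattered by air molecules scales roughly as I(λ) ∝ 1/λ⁴, so blue light (λ ≈ 450 nm) is scattered about (700/450)⁴ ≈ 5.9 times more strongly than red light (λ ≈ 700 nm)—which is why the sky away from the sun looks blue.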
  • There’s something a little insulting about the term “normal science,” in the same way it’s insulting to describe a woman’s outfit as “basic.”
  • Like most people who enjoy dark rooms and Sleep’s Jerusalem, I dig the simulation argument. It is, as far as I can tell, the most reasonable scientific proposition no one completely believes.
  • There’s always an entrenched psychological hurdle with this hypothesis—it’s just impossible for any person to circumvent the sense that what appears to be happening is really happening, and that the combination of strangeness and comfort within this experience makes the sensation of “being alive” too uncanny to be anything but genuine. But this sensation can’t be trusted (in fact, it might be baked into the simulation). And what’s most compelling about this concept is how rational it starts to seem, the longer you think about it.
  • At the time, Greene was discussing a collection of (roughly) twenty numbers that seem to dictate how the universe works. These are constants like “the mass of an electron” and “the strength of gravity,” all of which have been precisely measured and never change. These twenty numbers appear inconceivably fine-tuned—in fact, if these numbers didn’t have the exact value that they do, nothing in the universe would exist. They are so perfect that it almost appears as if someone set these numbers. But who could have done that? Some people would say God. But the simulation hypothesis presents a secular answer: that these numbers were set by the simulator.
  • But you can’t say, “I have a conspiracy theory.” Because if you do, it will be assumed that even you don’t entirely believe the conspiracy you’re theorizing about.
  • There’s also the question of motive: Fomenko’s revisionist timeline places the center of all “real history” inside Russia, which is probably why the only people who take it seriously are Russian (most notably grandmaster chess champion Garry Kasparov, who wrote a long essay in support of the theory titled “Mathematics of the Past”).
  • But it still must be asked: Discounting those events that occurred within your own lifetime, what do you know about human history that was not communicated to you by someone else?
  • When D. T. Max published his posthumous biography of David Foster Wallace, it was depressing to discover that many of the most memorable, electrifying anecdotes from Wallace’s nonfiction were total fabrications.
  • There’s a game I like to play with people when we’re at the bar, especially if they’re educated and drunk. The game has no name, but the rules are simple: The player tries to answer as many of the following questions as possible, without getting one wrong, without using the same answer twice, and without looking at a phone. The first question is, “Name any historical figure who was alive in the twenty-first century.” (No one has ever gotten this one wrong.) The second question is, “Name any historical figure who was alive in the twentieth century.” (No one has ever gotten this one wrong, either.) The third question is, “Name any historical figure who was alive in the nineteenth century.” The fourth question is, “Name any historical figure who was alive in the eighteenth century.” You continue moving backward through time, in centurial increments, until the player fails. It’s mildly shocking how often highly intelligent people can’t get past the sixteenth century; if they make it down to the twelfth century, it usually means they either know a lot about explorers or a shitload about popes.
  • So apply this philosophy to ourselves, and to our own version of televised culture: If we consider all possible criteria, what were the most accidentally realistic TV shows of all time? Which American TV programs—if watched by a curious person in a distant future—would latently represent how day-to-day American society actually was?
  • The unspoken goal of Mad Men was to depict how the sixties “really” were. And to the present-day Mad Men viewer, that’s precisely how the show came across. The goal was achieved. But Mad Men defines the difference between ancillary verisimilitude and premeditated reconstruction. Mad Men cannot show us what life was like in the sixties. Mad Men can only show how life in the sixties came to be interpreted in the twenty-first century.
  • It took decades for screenwriters to realize that no adults have ever walked into a tavern and said, “I’ll have a beer,” without noting what specific brand of beer they wanted.
  • When the rom-com series Catastrophe debuted on Amazon, a close friend tried to explain why the program seemed unusually true to him. “This is the first show I can ever remember,” he said, “where the characters laugh at each other’s jokes in a non-obnoxious way.” This seemingly simple idea was, in fact, pretty novel—prior to Catastrophe, individuals on sitcoms constantly made hilarious remarks that no one seemed to notice were hilarious.
  • The protagonist in Entourage was supposed to be a version of Entourage producer Mark Wahlberg, had Wahlberg experienced Leonardo DiCaprio’s career.
  • Part of the pleasure these programs provide is an opportunity to make these Xerox associations—and once the connections calcify in viewers’ heads, they can effortlessly inject living public figures into fake story lines.
  • But as a reality hunter with a reality hunger, my thinking occupies the dark years in between. Throughout the 1970s and ’80s, watching TV was just what people did when there was nothing else to do. The idea of “appointment television” would have been considered absurd—if you missed a show, you missed it. It was not something to worry about.
  • Roseanne was the first American TV show comfortable with the statistical reality that most Americans are fat. And it placed these fat people in a messy house, with most of the key interpersonal conversations happening in the kitchen or the garage or the laundry room.
  • Football is so popular that people (myself included) have private conversations about how many people would have to die on the field before we’d seriously consider giving it up.
  • Its fanbase resembles that of contemporary boxing—rich people watching poor people play a game they would never play themselves.
  • The answer to this question is both obvious and depressing: Something becomes truly popular when it becomes interesting to those who don’t particularly care.
  • During the first week of 2015, I interviewed Los Angeles Lakers guard Kobe Bryant in a pancake house. It was a short conversation, but we covered a lot of ground—his lack of close friends, the rape accusation leveled against him in 2003, his self-perceived similarities to Mozart. It was the best interview experience I’d ever had with an athlete. At one point, we were talking about film, and I asked Kobe if he’d seen the movie Whiplash. “Of course,” he said. “That’s me.” The trajectory of the conversation switched after he said this, so I was never able to ask a follow-up; I was never able to ask if he meant that he saw himself as the film’s protagonist, its antagonist, or a human incarnation of the entire movie. All three possibilities seemed plausible.
  • It’s wholly possible that the nature of electronic gaming has instilled an expectation of success in young people that makes physical sports less desirable.
  • Ohio is a wonderful place to ponder the state of American democracy, because you’re constantly being reminded that America is where you are. Ohio is a scale model of the entire country, jammed into 43,000 square miles.
  • Obviously, no one thinks like this now. In fact, they don’t even think that this was how they thought at the time: Huge swaths of the populace have retroactively convinced themselves that their reaction to the 2000 election was far more extreme than all evidence suggests. When Bush took office in January, it felt perfunctory. That September, the world changed completely. America adopted a level of political polarization that had not existed since Reconstruction, and that polarization now feels like the normal way to think about society.
  • This logic leads to a strange question: If and when the United States does ultimately collapse, will that breakdown be a consequence of the Constitution itself? If it can be reasonably argued that it’s impossible to create a document that can withstand the evolution of any society for five hundred or a thousand or five thousand years, doesn’t that mean present-day America’s pathological adherence to the document we happened to inherit will eventually wreck everything?
  • I think it’s more likely that if we look back with regret at our dedication to the Constitution, it will be with respect to the structural provisions, rather than the liberty and equality ones.
  • The Declaration of Independence predates the Constitution by eleven years and doesn’t have any legislative power. Still, it’s central to everything we think about the US, particularly one sentence from its second paragraph that many Americans assume is actually in the Constitution itself: “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty, and the Pursuit of Happiness.”
  • It’s not merely that Obama was the first black president. It’s that he broke this barrier with such deftness and sagacity that it instantaneously seemed insane no black person had ever been elected president before. In fact, he broke the barrier so fluidly that a few of the polled historians suggested his blackness will eventually be a footnote to his presidency, in the same way that John F. Kennedy’s Catholicism has become a factoid referenced only by Catholics.
  • Whether it’s Avengers: Age of Ultron, The Matrix, the entire Terminator franchise, or even a film as technologically primitive as WarGames, a predictable theme inexorably emerges: The moment machines become self-aware, they will try to destroy people. What’s latently disturbing about this plot device is the cynicism of the logic.