Too Smart for Your Own Good

Beliefs (or lack thereof)

What is religion? It’s community, history, cultural identity; it’s a way to make sense of the world. Some of us use religion as our primary source of answers; others use mysticism; others, science; and still others use a combination of methods. This is not an argument for one way over the others, but rather a look at beliefs through several different lenses:

We all try to make sense of the world. Our methods may differ, but we are all seeking to understand the same universe:

People find themselves in a mysterious and mysteriously ordered universe. They find themselves equipped with sort of intense moral instincts. They have religious experiences, and they develop systems that explain those.

—Ross Douthat, author of Bad Religion, speaking on Real Time with Bill Maher (2012)

Religion plays a comforting role for some people:

[P]eople who lose personal control take comfort in religion, because it suggests to them that the world is under God’s control and, therefore, predictable and nonrandom.

—Zuckerman et al., “The Relation Between Intelligence and Religiosity” (2013)

But the comfort of belief is not confined to religious doctrine:

Human beings are believing animals, period. […] Even secular liberals have their [beliefs…] What is the idea of universal human rights if not a metaphysical principle? Can you find universal human rights under a microscope? Is it in the laws of physics?

—Ross Douthat, author of Bad Religion, speaking on Real Time with Bill Maher (2012)

It is becoming more difficult to understand a world defined by modern technology:

The modern technologies of the day are a bit of a black box for the average person. […] the average person I meet on the street doesn’t feel any kind of connection to the technologies that are defining their world and shaping the fabric of society.

—Steve Jurvetson, billionaire tech investor, “Accelerating Rich-Poor Gap,” Solve for X (2013)

But humans are resilient creatures: Make the world incomprehensible, and they will find a way to comprehend it:

The explosion in communication technologies over the past decades has re-oriented society and put more psychological strain on us all to find our identities and meaning. For some people, the way to ease this strain is to actually reject complexity and ambiguity for absolutist beliefs and traditional ideals.

—Mark Manson, author, “The Rise of Fundamentalist Belief” (2013)

While we are getting better at adapting to the ever-changing world…:

“The rate at which we can adapt is increasing,” said Teller. “A thousand years ago, it probably would have taken two or three generations to adapt to something new.” By 1900, the time it took to adapt got down to one generation. “We might be so adaptable now,” said Teller, “that it only takes ten to fifteen years to get used to something new.”

—Thomas Friedman quoting Astro Teller, CEO of Google’s X Research & Development Laboratory, in Thank You for Being Late (2016)

…the world is changing at an ever-increasing rate:

An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense “intuitive linear” view. So we won’t experience 100 years of progress in the 21st century — it will be more like 20,000 years of progress (at today’s rate).

—Ray Kurzweil, computer scientist, “The Law of Accelerating Returns” (2001)

Technology may soon make everyone feel as if they’ve lost control:

“If the technology platform for society can now turn over in five to seven years, but it takes ten to fifteen years to adapt to it,” Teller explained, “we will all feel out of control, because we can’t adapt to the world as fast as it’s changing.”

—Astro Teller, CEO of Google X, quoted in Thank You for Being Late (2016)

And while some people think the world is still divided into those who “understand” and those who don’t…:

[Disconnection from technology is] a different kind of estrangement. It’s almost like a cognitive separation—those who know and those who don’t know about the world they live in.

—Steve Jurvetson, billionaire tech investor, “Accelerating Rich-Poor Gap,” Solve for X (2013)

…technological advancement will eventually humble even our brightest minds:

“None of us have the capacity to deeply comprehend more than one of these fields [genomic cloning, medical robotics, artificial intelligence]—the sum of human knowledge has far outstripped any individual’s capacity to learn—and even experts in these fields can’t predict what will happen in the next decade or century.”

—Astro Teller, CEO of Google X, quoted in Thank You for Being Late (2016)

For, at the end of the day, we are all humans (no matter how intelligent). And that is the thesis of this entire “Too Smart for Your Own Good” series:

You are never too smart to be humble.

Intelligence does not make a person immune to faulty logic, insensitivity, poor timing, or technological change. There are biological limitations to being human. However we choose to make sense of the world—whatever strategies for life we employ—we are making a personal choice. So remember: “Judge not, lest ye be judged.”


Additional Reading:

Urban, Tim. “Religion for the Nonreligious.” waitbutwhy.com (2014). (Source)

Sharov, Alexei A., and Richard Gordon. “Life before earth.” arXiv preprint arXiv:1304.3381 (2013). (Source) (Summary)

Zuckerman, Miron, Jordan Silberman, and Judith A. Hall. “The Relation Between Intelligence and Religiosity: A Meta-Analysis and Some Proposed Explanations.” Personality and Social Psychology Review 17.4 (2013): 325-354. (Source)

This is the fifth installment of a series titled “Too Smart for Your Own Good.”

Too Smart for Your Own Good

The Benefits of Doubt


This is a story of two brainiacs—both pioneers in their chosen fields, both keenly aware of their own intelligence, and both recipients of the Nobel Prize. Despite their similarities, the stories of William Shockley’s eventual demise and Daniel Kahneman’s ultimate redemption diverge in each genius’s relationship with self-doubt. While doubt is often cast as the villain of our lives, this is a different tale, one where doubt is the protagonist, the saving grace. This is a story of two men whose success and failure hinged on the benefits of doubt.

William Shockley grew up in California during the 1920s, receiving a B.S. from the California Institute of Technology in 1932. Quantum physics was still a fresh discovery, and Shockley reportedly “absorbed most of it with astonishing ease.” In 1936, he earned his Ph.D. from M.I.T. and began his innovative career at Bell Laboratories, one of the most prolific R&D enterprises in the world. As a research director at Bell, Shockley helped invent the transistor (as in “transistor radio”), for which he received the Nobel Prize in Physics in 1956—an achievement that would lead to his eventual undoing.

That same year, Shockley moved back to Mountain View, California, to start Shockley Semiconductor Laboratory, the first technology company in what would become Silicon Valley. As a researcher and a manager, William Shockley was brilliant and innovative but domineering. His arrogance at Shockley Labs was a point of contention among the employees and a problem that would only be exacerbated after the Nobel Prize. Winning the prize seemed to eradicate any residual self-doubt left from his youth. Shockley became deaf to outside opinion, blind to reason, and unforgivingly egotistical. In 1957, less than a year after Shockley became a Nobel Laureate, eight of his best and brightest left the company to form Fairchild Semiconductor.

The Shockley schism was in large part due to William Shockley’s unwillingness to continue research into silicon-based semiconductors despite his employees’ belief that silicon would be the future. His hubris would be his end, as Shockley Labs never recovered from missing the silicon boom. The cohort who left would later create numerous technology firms in Silicon Valley (including the tech giant Intel), giving birth to what has become one of the most innovative regions in the world.

William Shockley himself faded into obscurity, estranged from his children, his reputation tarnished after years of publicly touting eugenics. Along with Henry Ford, Pablo Picasso, Albert Einstein, and Mahatma Gandhi, Time Magazine would name William Shockley one of the “100 Most Important People of the 20th Century.” Yet he died bitter and disgraced in 1989, still headstrong, self-doubt still absent from his life. For Shockley, arrogance and intellect seemed to be inextricably linked; however, such a fate is not inevitable.

Daniel Kahneman’s story is different. It’s one where self-aware intellect meets a healthy dose of self-doubt. Born in 1934, Kahneman grew up a Jew in France during the Nazi occupation. His family was displaced several times, and his experiences with the Nazis would show him firsthand the complexities and peculiarities of the human mind—a foreshadowing of his illustrious career in psychology.

As an adolescent after World War II, Kahneman moved to Palestine, where in the eighth grade he finally found like-minded friends. Acknowledging both his intellect and its dangers, he writes, “It was good for me not to be exceptional anymore.” While Kahneman knew he was smart, he always saw his deficits (sometimes to a fault). As Michael Lewis, author of Moneyball and The Undoing Project, puts it, “Everything [Daniel Kahneman] thinks is interesting. He just doesn’t believe it.” And this self-doubt would lead directly to his success.

After earning degrees from Hebrew University and U.C. Berkeley, Kahneman saw his work in psychology take off following his first collaboration with Amos Tversky. Between 1971 and 1981, Tversky and Kahneman would publish five journal articles that would be cited over 1,000 times. Their work on cognitive biases—largely fueled by their ability to doubt their own minds—has been instrumental in upending the long-standing belief that humans are purely rational creatures.

In 2002, Kahneman was awarded the Nobel Prize in Economics for showing that even the brightest among us make mental mistakes every day—prospect theory, cognitive biases, and heuristics. As with William Shockley, winning the Nobel Prize would be a turning point in Kahneman’s life; unlike Shockley, the prize made Kahneman better, not worse. Michael Lewis again comments, “[T]he person who we know post-Nobel Prize is entirely different from the person who got the Nobel Prize […] he is much less gloomy, much less consumed with doubts.” In a stroke of irony, it would take what is arguably the most prestigious award in the world to finally provide Kahneman with the validation that his thoughts on doubt were worthwhile.

Kahneman’s career has been largely focused on the benefits of doubt—the idea that humans may be mistaken in our confidence and intuition, and need to question our assuredness. While his legacy is still to be seen, Kahneman’s work may prove to change how humans think about thinking forever. His impact may even put him on Time’s list of “100 Most Important People of the 21st Century.”

The stories of Shockley and Kahneman serve as both a warning and a call to action. Doubt should not be seen as a curse, but rather a necessity—a blessing, even. It is when we are not in doubt that we should take notice. Arrogance at any level can blind us. We don’t have to be savants to be egotistical or to have a “big head” (yet the smartest among us are often guilty of this). Only through our doubts can we expect to learn from others, question our assumptions, and, ultimately, be successful long-term. As soon as we stop doubting, stop questioning, we stop growing.

This is the fourth installment of a series titled “Too Smart for Your Own Good.”

Too Smart for Your Own Good

Ignorance Is Bliss-ish

Life is like pouring concrete. (Bear with me here.) The world provides an endless supply of mystery—raw concrete mix—and over time this concrete pours out into our lives, moving from the unknown (concrete mixer) to the known (exposed, wet concrete). We learn new things, have new experiences, and make new discoveries. In this process, we shape our new knowledge into a unique worldview—a concrete foundation. And just like concrete, our worldviews harden over time.

Children are exposed to so much newness that the concrete flows as if through a hole in a dam. They mold the onslaught of new information as well as they can, trying to make room for the next layers. As we get older, however, the concrete that once flowed freely begins to slow. For adults, finding new knowledge comes with a cost—our time and energy. Building upon our worldview foundation requires actively seeking the unknown—journeying to the land of new information at the risk (or benefit) of altering our concrete structure.

There are only two ways to change a concrete worldview: addition or subtraction. Either we (a) add more concrete—more knowledge—that can be molded around the existing structure, or (b) take a jackhammer to our concrete edifice in an attempt to reconfigure its appearance. The former option is more delicate, incremental, and self-directed. The latter is less controlled, more dramatic, and often occurs by outside influence. Experiences that fly in the face of our worldview are processed in different ways by different people. And while the truth has a way of eroding our most egregious misconceptions, some people’s structures are more protected than others’.

The world is essentially made up of three types of people: 1. Never Quitters, 2. Bitter Quitters, and 3. Happy Quitters. This is to say that there are people who persist, people who give up in discouragement, and people who are content watching rather than participating. Let’s examine each of these three groups using a methodology that is neither scientific nor rigorous—the clichés they most represent.

Never Quitters are the persistent “lifelong learners” who continually pour and shape concrete onto their worldview foundation. They are so named for the motivational cliché, “Never give up.” Bitter Quitters are ex-Never Quitters who, in their search of the unknown, stumbled upon an undesirable reality and decided to go no further. This group is summarized by the phrase, “Better the devil you know, than the devil you don’t.” And the final group—the Happy Quitters—may know of a world beyond their bubble, but choose not to explore it, because… well… why would they? They’re content with their lives. Theirs is a comfortable bubble, like being tucked under warm, cozy blankets in a frigid bedroom. And for this group, “Ignorance is bliss.”

In a world where, as YaleGlobal Online reports, “the gap between job requirements and available skills is widening,” and technological advancement is outpacing society’s ability to keep up, it seems “obvious” to the Never Quitters that continual learning is imperative. To a Never Quitter, bowing out of the world of learning seems like an admission of defeat. “Now, more than ever, is a time for continual learning, up-skilling, and growing to stay relevant in an ever-changing landscape,” the Never Quitter may say. But they are missing a crucial point.

There is value in the known, the familiar, and the comfortable (especially if such a worldview has a solid foundation). The known allows for a state of low anxiety and contentment. Besides, we all become either a Bitter or a Happy Quitter eventually. The 97-year-old lifelong learner may decide that he’s perfectly content not understanding how Twitter works. It’s a common mistake of the “Too Smart for Your Own Good” cohort to think that everyone would benefit from journeying down the Path of the Mental Unknown. They fail to acknowledge the personal benefits of a comfortable, insulated worldview, which Quitters regularly enjoy. In order to coexist, we are not required to agree on how best to live life, but we do need to understand one another’s perspectives. How each of us shapes the concrete that life provides is completely up to us.

This is the third installment of a series titled “Too Smart for Your Own Good.”


For more information on lifelong learners in America, consider reading the following report: Pew Research Center, March 2016, “Lifelong Learning and Technology.” (Source)

Too Smart for Your Own Good

The Problem with Logic

There is an art to getting things done. The problem with logic is that it’s only half of the equation (and “half” is being generous). Logic can lead to a false sense of understanding. “This is the logical explanation,” we may say. There’s a logical choice, a logical path, a “right way.” But logical thinking often neglects to consider the other half of the equation—the emotional half. And emotions are a much more powerful force.

A considerable amount of what occurs in the world occurs because of emotion alone (without even the slightest consideration for reason). Forgetting about the emotional side can handicap us in the real world of “getting things done.” An idea may be the most logical, but that does not mean it will always triumph. The real world is not a meritocracy; just ask an economist.

In recent years, economists have recommended that NFL teams spend fewer resources trying to draft superstar athletes. Instead, they recommend trading top draft picks for multiple lower picks (spreading the risk of picking a top-round dud). Despite evidence that such changes would translate into more wins per season (about 1.5 more), team owners and general managers have generally ignored the advice. Why? Because there is more to life than logic. Teams are illogically overconfident that their picks are better than the competition’s, and owners enjoy having big-name players on the roster. It’s not logical; it’s emotional. Historical precedent, personal ideologies, social allegiances—these are what constitute the illogical side of life, the emotional side. Economists work in the world of logical suggestions, but the real power lies in the hands of others.

My favorite definition of “power” is “the ability to get things done.” It avoids the negative connotations we usually attach to the word. There is no mention of coercion or corruption, no distinction between strong and weak, no manipulation tactics or financial sanctions. Power can just as easily relate to a healthy marriage as it can to a government regime. We are required to “get things done” every day, and our ability to do so benefits not only us, but our friends, family, and colleagues. Power, therefore, is something we should wish for everyone.

To “empower” a person is to give them the ability to help themselves. We live in a world where emotions reign supreme. Events often happen on emotion alone (without logical consideration), but rarely the other way around. Logic, then, is the empowering piece of the puzzle; it is the prerequisite for significant improvement. But unlike the power of emotional whims, logic alone cannot get things done.

Good leadership, parenting, governance—they all require emotional intelligence as well as critical thinking. This tension is the premise underlying television shows like Hugh Laurie’s House M.D. and Benedict Cumberbatch’s Sherlock—the lone genius struggling in a world of social interaction and emotions, knowing these inevitabilities are as much a solution as they are a problem, a strength and a weakness. Destruction is easy with emotions alone, but building something meaningful requires both heart and mind.

Last week we discussed the frustrations of the workplace—the unspoken, sometimes unknowable rules (the emotional side). But more frustrating than office politics is actual politics. It can be ugly at times, but politics is the ultimate “art” of getting things done, where the complexities of social interactions far outweigh logical idealism or truth. People who vote with their heads struggle with this concept. They cannot understand voting with the heart. And that’s the classic downfall of a logic-heavy worldview: It leads to the dismissal of the human condition and of the social and emotional sides of us all.

There’s an old journalism maxim, “if it bleeds, it leads,” which sums up the emotion-heavy side of society. Emotional hyperbole is popular; it sells news. We like stories about people (human interest pieces), and if there happens to be a murder involved, all the better! Humans are drawn to other humans, often in ways that defy reason. A string of bad relationship choices, the flock of rioting fans after a win—none of it makes logical sense, but it happens nonetheless. We must attempt to see the forest, not just the trees.

Understanding how society works, with all its human behavioral quirks, can buffer the logic problem. There is a scene in the 2000 historical epic Gladiator where Oliver Reed’s character—a wise, embattled ex-gladiator—gives some advice to the young warrior played by Russell Crowe. He tells him, “I was not the best because I killed quickly. I was the best because the crowd loved me.” Oliver Reed’s gladiator understood how the system actually worked, and that understanding (not his skill in the Colosseum) is what set him apart. Logic is necessary to our success, but it must be balanced with the understanding, compassion, and social skills that human emotions provide. Such a balancing act is a real art form.

This is the second installment of a series titled “Too Smart for Your Own Good.”

Too Smart for Your Own Good

Being Right at the Wrong Time


Being right can be embarrassing. Maybe not today, maybe not tomorrow, but at some point we realize that saying that thing or correcting that person was not a good idea. We realize that we were wrong in our rightness. There is a right time and a right place for correctness, but knowing when not to speak is the key. And where does this principle seem to cause the most confusion? A place where we’re told that accuracy and timeliness are top priorities—the workplace.

Office environments are the perfect setting to discuss the potential woes of being right. In 2014 Lauris Beinerts’ parody video “The Expert” made waves across the Internet with its cheeky look at the stereotypical office idiot. The short film takes place in a nondescript conference room where Anderson, “the expert,” attempts to explain to his four colleagues why the task at hand—drawing seven red lines, all perpendicular to one another and some with green ink—is fundamentally impossible. As the meeting unfolds, it becomes clear that only Anderson sees the logical discrepancies of the absurd request. Everyone else? Idiots.

The popularity of Beinerts’ video is due to the ubiquity of the situation it depicts. We can all relate to the frustration of Anderson-The-Expert. We have all thought during a meeting, “Why are we still talking about this? Isn’t it obvious?” But maybe there are other times, times when, without realizing it, we are the office dunces. Rather than lambasting the bumbling co-worker, we should consider the flaws of being right. Perhaps Anderson is actually the office idiot.

Once his co-worker says something obviously wrong, Anderson has a decision to make—say something or stay quiet. If the point of the meeting is to correct misconceptions, then correct away! That’s being right in the right place at the right time. However, meetings are often called for one reason when, in fact, they have a different, unspoken purpose. This is where the confusion comes in.

It’s important to align the purpose for being right with the overall purpose of the situation. Anderson might think he’s being helpful in his explanation that red ink is not, in fact, green ink; however, if there are more pressing matters to attend to, then perhaps Anderson’s comments are wasting productive time. Or maybe Anderson’s boss knows full well that the details of the project are irrelevant to the final product and hopes Anderson will simply agree and let the matter rest. Whatever the case may be, just having correct information does not necessitate sharing it.

Of course, for jobs dealing with life or death (e.g., structural engineering, law enforcement, surgery), being right is crucial. There’s no time to waste wondering whether the truth should or shouldn’t be said. Most jobs, however, don’t carry those kinds of stakes. And for those jobs, timing is everything.

Whether it’s launching a product, delivering a joke, meeting a soulmate, or buying a house, timing can make or break a situation. Being right is no different. There’s an internet meme with Jeff Bridges sitting back as The Dude from The Big Lebowski, which reads, “You’re not wrong / You’re just an asshole.” This strikes at the heart of the timing issue. We must know when to say something and when to shut up.

For my wedding, my father gave me a newspaper clipping with some marital advice. It read, “Before you speak, ask yourself, ‘Is it true? Is it kind? Is it necessary?’” Being right is only the first part. Knowing when to be silent is far more powerful, and far wiser, than anything we may have to say.

This is the first installment of a series titled “Too Smart for Your Own Good.”
