RULE I: Do not carelessly denigrate social institutions or creative achievement.
Main theme: Psychology. Important point: We must perceive and act in a manner that meets our biological and psychological needs—but, since none of us lives or can live in isolation, we must meet them in a manner approved of by others. This means that the solutions we apply to our fundamental biological problems must also be acceptable and implementable socially. From the subtopic: What should we point to?
The Utility of the Fool
It is useful to take your place at the bottom of a hierarchy. It can aid in the development of gratitude and humility. Gratitude: There are people whose expertise exceeds your own, and you should be wisely pleased about that. There are many valuable niches to fill, given the many complex and serious problems we must solve. The fact that there are people who fill those niches with trustworthy skill and experience is something for which to be truly thankful. Humility: It is better to presume ignorance and invite learning than to assume sufficient knowledge and risk the consequent blindness. It is much better to make friends with what you do not know than with what you do know, as there is an infinite supply of the former but a finite stock of the latter. When you are tightly boxed in or cornered—all too often by your own stubborn and fixed adherence to some unconsciously worshipped assumptions—all there is to help you is what you have not yet learned. It is necessary and helpful to be, and in some ways to remain, a beginner. For this reason, the Tarot deck beloved by intuitives, romantics, fortune-tellers, and scoundrels alike contains within it the Fool as a positive card, an illustrated variant of which opens this chapter. The Fool is a young, handsome man, eyes lifted upward, journeying in the mountains, sun shining brightly upon him—about to carelessly step over a cliff (or is he?). His strength, however, is precisely his willingness to risk such a drop; to risk being once again at the bottom. No one unwilling to be a foolish beginner can learn. It was for this reason, among others, that Carl Jung regarded the Fool as the archetypal precursor to the figure of the equally archetypal Redeemer, the perfected individual. The beginner, the fool, is continually required to be patient and tolerant—with himself and, equally, with others. His displays of ignorance, inexperience, and lack of skill may still sometimes be rightly attributed to irresponsibility and condemned, justly, by others. But the insufficiency of the fool is often better regarded as an inevitable consequence of each individual’s essential vulnerability, rather than as a true moral failing. Much that is great starts small, ignorant, and useless. This lesson permeates popular as well as classical or traditional culture. Consider, for example, the Disney heroes Pinocchio and Simba, as well as J. K. Rowling’s magical Harry Potter. Pinocchio begins as a woodenheaded marionette, the puppet of everyone’s decisions but his own. The Lion King has his origin as a naive cub, the unwitting pawn of a treacherous and malevolent uncle. The student of wizarding is an unloved orphan, with a dusty cupboard for a bedroom, and Voldemort—who might as well be Satan himself—for his archenemy. Great mythologized heroes often come into the world, likewise, in the most meager of circumstances (as the child of an Israelite slave, for example, or newborn in a lowly manger) and in great danger (consider the Pharaoh’s decision to slay all the firstborn male babies of the Israelites, and Herod’s comparable edict, much later). But today’s beginner is tomorrow’s master. Thus, it is necessary even for the most accomplished (but who wishes to accomplish still more) to retain identification with the as yet unsuccessful; to appreciate the striving toward competence; to carefully and with true humility subordinate him or herself to the current game; and to develop the knowledge, self-control, and discipline necessary to make the next move. 
I visited a restaurant in Toronto with my wife, son, and daughter while writing this. As I made my way to my party’s table, a young waiter asked if he might say a few words to me. He told me that he had been watching my videos, listening to my podcasts, and reading my book, and that he had, in consequence, changed his attitude toward his comparatively lower-status (but still useful and necessary) job. He had ceased criticizing what he was doing or himself for doing it, deciding instead to be grateful and seek out whatever opportunities presented themselves right there before him. He made up his mind to become more diligent and reliable and to see what would happen if he worked as hard at it as he could. He told me, with an uncontrived smile, that he had been promoted three times in six months. The young man had come to realize that every place he might find himself in had more potential than he might first see (particularly when his vision was impaired by the resentment and cynicism he felt from being near the bottom). After all, it is not as if a restaurant is a simple place—and this was part of an extensive national organization, a large, high-quality chain. To do a good job in such a place, servers must get along with the cooks, who are by universal recognition a formidably troublesome and tricky lot. They must also be polite and engaging with customers. They have to pay attention constantly. They must adjust to highly varying workloads—the rushes and dead times that inevitably accompany the life of a server. They have to show up on time, sober and awake. They must treat their superiors with the proper respect and do the same for those—such as the dishwashers—below them in the structure of authority. And if they do all these things, and happen to be working in a functional institution, they will soon render themselves difficult to replace. Customers, colleagues, and superiors alike will begin to react to them in an increasingly positive manner. Doors that would otherwise remain closed to them—even invisible—will be opened. Furthermore, the skills they acquire will prove eminently portable, whether they continue to rise in the hierarchy of restaurateurs, decide instead to further their education, or change their career trajectory completely (in which case they will leave with laudatory praise from their previous employers and vastly increased chances of discovering the next opportunity). As might be expected, the young man who had something to say to me was thrilled with what had happened to him. His status concerns had been solidly and realistically addressed by his rapid career advance, and the additional money he was making did not hurt, either. He had accepted, and therefore transcended, his role as a beginner. He had ceased being casually cynical about the place he occupied in the world and the people who surrounded him, and accepted the structure and the position he was offered. He started to see possibility and opportunity, where before he was blinded, essentially, by his pride. He stopped denigrating the social institution he found himself part of and began to play his part properly. And that increment in humility paid off in spades.
The Necessity of Equals
It is good to be a beginner, but it is a good of a different sort to be an equal among equals. It is said, with much truth, that genuine communication can take place only between peers. This is because it is very difficult to move information up a hierarchy. Those well positioned (and this is a great danger of moving up) have used their current competence—their cherished opinions, their present knowledge, their current skills—to stake a moral claim to their status. In consequence, they have little motivation to admit to error, to learn or change—and plenty of reason not to. If a subordinate exposes the ignorance of someone with greater status, he risks humiliating that person, questioning the validity of the latter’s claim to influence and status, and revealing him as incompetent, outdated, or false. For this reason, it is very wise to approach your boss, for example, carefully and privately with a problem (and perhaps best to have a solution at hand—and not one proffered too incautiously). Barriers exist to the flow of genuine information down a hierarchy, as well. For example, the resentment people lower in the chain of command might feel about their hypothetically lesser position can make them loath to act productively on information from above—or, in the worst case, can motivate them to work at counterpurposes to what they have learned, out of sheer spite. In addition, those who are inexperienced or less educated, or who newly occupy a subordinate position and therefore lack knowledge of their surroundings, can be more easily influenced by relative position and the exercise of power, instead of quality of argumentation and observation of competence. Peers, by contrast, must in the main be convinced. Their attention must be carefully reciprocated. To be surrounded by peers is to exist in a state of equality, and to manifest the give-and-take necessary to maintain that equality. It is therefore good to be in the middle of a hierarchy. This is partly why friendships are so important, and why they form so early in life. A two-year-old, typically, is self-concerned, although also capable of simple reciprocal actions. The same Scarlett whom I talked about earlier—my granddaughter—would happily hand me one of her favorite stuffed toys, attached to a pacifier, when I asked her to. Then I would hand it, or toss it, back (sometimes she would toss it to me, too—or at least relatively near me). She loved this game. We played it with a spoon, as well—an implement she was just beginning to master. She played the same way with her mother and her grandmother—with anyone who happened to be within playing distance, if she was familiar enough with them not to be shy. This was the beginning of the behaviors that transform themselves into full-fledged sharing among older children. My daughter, Mikhaila, Scarlett’s mother, took her child to the outdoor recreational space on top of their downtown condo a few days before I wrote this. A number of other children were playing there, most of them older, and there were plenty of toys. Scarlett spent her time hoarding as many of the playthings as possible near her mother’s chair, and was distinctly unimpressed if other children came along to purloin one for themselves. She even took a ball directly from another child to add to her collection. This is typical behavior for children two and younger. Their ability to reciprocate, while hardly absent (and able to manifest itself in truly endearing ways), is developmentally limited. 
By three years of age, however, most children are capable of truly sharing. They can delay gratification long enough to take their turn while playing a game that everyone cannot play simultaneously. They can begin to understand the point of a game played by several people and follow the rules, although they may not be able to give a coherent verbal account of what those rules are. They start to form friendships upon repeated exposure to children with whom they have successfully negotiated reciprocal play relationships. Some of these friendships turn into the first intense relationships that children have outside their family. It is in the context of such relationships, which tend strongly to form between equals in age (or at least equals in developmental stage), that a child learns to bond tightly to a peer and starts to learn how to treat another person properly while requiring the same in return. This mutual bonding is vitally important. A child without at least one special, close friend is much more likely to suffer later psychological problems, whether of the depressive/anxious or antisocial sort, while children with fewer friends are also more likely to be unemployed and unmarried as adults. There is no evidence that the importance of friendship declines in any manner with age.8 All causes of mortality appear to be reduced among adults with high-quality social networks, even when general health status is taken into consideration. This remains true among the elderly in the case of diseases such as hypertension, diabetes, emphysema, and arthritis, and for younger and older adults alike in the case of heart attacks. Interestingly enough, there is some evidence that it is the provision of social support, as much or more than its receipt, that provides these protective benefits (and, somewhat unsurprisingly, that those who give more tend to receive more). Thus, it truly seems that it is better to give than to receive. Peers distribute both the burdens and joys of life. Recently, when my wife, Tammy, and I suffered serious health problems, we were fortunate enough to have family members (my in-laws, sister and brother; my own mother and sister; our children) and close friends stay with us and help for substantial periods of time. They were willing to put their own lives on hold to aid us while we were in crisis. Before that, when my book 12 Rules for Life became a success, and during the extensive speaking tour that followed, Tammy and I were close to people with whom we could share our good fortune. These were friends and family members genuinely pleased with what was happening and following the events of our lives avidly, and who were willing to discuss what could have been the overwhelming public response. This greatly heightened the significance and meaning of everything we were doing and reduced the isolation that such a dramatic shift in life circumstances, for better or worse, is likely to produce. The relationships established with colleagues of similar status at work constitute another important source of peer regulation, in addition to friendship. To maintain good relationships with your colleagues means, among other things, to give credit where credit is due; to take your fair share of the jobs no one wants but still must be done; to deliver on time and in a high-quality manner when teamed with other people; to show up when expected; and, in general, to be trusted to do somewhat more than your job formally requires. 
The approval or disapproval of your colleagues rewards and enforces this continual reciprocity, and that—like the reciprocity that is necessarily part of friendship—helps maintain stable psychological function. It is much better to be someone who can be relied upon, not least so that during times of personal trouble the people you have worked beside are willing and able to step in and help. Through friendship and collegial relationships we modify our selfish proclivities, learning not to always put ourselves first. Less obviously, but just as importantly, we may also learn to overcome our naive and too empathic proclivities (our tendency to sacrifice ourselves unsuitably and unjustly to predatory others) when our peers advise and encourage us to stand up for ourselves. In consequence, if we are fortunate, we begin to practice true reciprocity, and we gain at least some of the advantage spoken about.
Top Dog
It is a good thing to be an authority. People are fragile. Because of that, life is difficult and suffering common. Ameliorating that suffering—ensuring that everyone has food, clean water, sanitary facilities, and a place to take shelter, for starters—takes initiative, effort, and ability. If there is a problem to be solved, and many people involve themselves in the solution, then a hierarchy must and will arise, as those who can, do, and those who cannot, follow as best they can, often learning to be competent in the process. If the problem is real, then the people who are best at solving the problem at hand should rise to the top. That is not power. It is the authority that properly accompanies ability. Now, it is self-evidently appropriate to grant power to competent authorities, if they are solving necessary problems; and it is equally appropriate to be one of those competent authorities, if possible, when there is a perplexing problem at hand. This might be regarded as a philosophy of responsibility. A responsible person decides to make a problem his or her problem, and then works diligently—even ambitiously—for its solution, with other people, in the most efficient manner possible (efficient, because there are other problems to solve, and efficiency allows for the conservation of resources that might then be devoted importantly elsewhere). Ambition is often—and often purposefully—misidentified with the desire for power, and damned with faint praise, and denigrated, and punished. And ambition is sometimes exactly that wish for undue influence on others. But there is a crucial difference between sometimes and always. Authority is not mere power, and it is extremely unhelpful, even dangerous, to confuse the two. When people exert power over others, they compel them, forcefully. They apply the threat of privation or punishment so their subordinates have little choice but to act in a manner contrary to their personal needs, desires, and values. When people wield authority, by contrast, they do so because of their competence—a competence that is spontaneously recognized and appreciated by others, and generally followed willingly, with a certain relief, and with the sense that justice is being served. Those who are power-hungry—tyrannical and cruel, even psychopathic—desire control over others so that every selfish whim of hedonism can be immediately gratified; so that envy can destroy its target; so that resentment can find its expression. But good people are ambitious (and diligent, honest, and focused along with it) instead because they are possessed by the desire to solve genuine, serious problems. That variant of ambition needs to be encouraged in every possible manner. It is for this reason, among many others, that the increasingly reflexive identification of the striving of boys and men for victory with the “patriarchal tyranny” that hypothetically characterizes our modern, productive, and comparatively free societies is so stunningly counterproductive (and, it must be said, cruel: there is almost nothing worse than treating someone striving for competence as a tyrant in training). “Victory,” in one of its primary and most socially important aspects, is the overcoming of obstacles for the broader public good. Someone who is sophisticated as a winner wins in a manner that improves the game itself, for all the players.
To adopt an attitude of naive or willfully blind cynicism about this, or to deny outright that it is true, is to position yourself—perhaps purposefully, as people have many dark motives—as an enemy of the practical amelioration of suffering itself. I can think of few more sadistic attitudes. Now, power may accompany authority, and perhaps it must. However, and more important, genuine authority constrains the arbitrary exercise of power. This constraint manifests itself when the authoritative agent cares, and takes responsibility, for those over whom the exertion of power is possible. The oldest child can take accountability for his younger siblings, instead of domineering over and teasing and torturing them, and can learn in that manner how to exercise authority and limit the misuse of power. Even the youngest can exercise appropriate authority over the family dog. To adopt authority is to learn that power requires concern and competence—and that it comes at a genuine cost. Someone newly promoted to a management position soon learns that managers are frequently more stressed by their multiple subordinates than subordinates are stressed by their single manager. Such experience moderates what might otherwise become romantic but dangerous fantasies about the attractiveness of power, and helps quell the desire for its infinite extension. And, in the real world, those who occupy positions of authority in functional hierarchies are generally struck to the core by the responsibility they bear for the people they supervise, employ, and mentor. Not everyone feels this burden, of course. A person who has become established as an authority can forget his origins and come to develop a counterproductive contempt for the person who is just starting out. This is a mistake, not least because it means that the established person cannot risk doing something new (as it would mean adopting the role of despised fool). It is also because arrogance bars the path to learning. Shortsighted, willfully blind, and narrowly selfish tyrants certainly exist, but they are by no means in the majority, at least in functional societies. Otherwise nothing would work. The authority who remembers his or her sojourn as voluntary beginner, by contrast, can retain their identification with the newcomer and the promise of potential, and use that memory as the source of personal information necessary to constrain the hunger for power. One of the things that has constantly amazed me is the delight that decent people take in the ability to provide opportunities to those over whom they currently exercise authority. I have experienced this repeatedly: personally, as a university professor and researcher (and observed many other people in my situation doing the same); and in the business and other professional settings I have become familiar with. There is great intrinsic pleasure in helping already competent and admirable young people become highly skilled, socially valuable, autonomous, responsible professionals. It is not unlike the pleasure taken in raising children, and it is one of the primary motivators of valid ambition. Thus, the position of top dog, when occupied properly, has as one of its fundamental attractions the opportunity to identify deserving individuals at or near the beginning of their professional life, and provide them with the means of productive advancement.
Important points about breaking rules
The actions and attitudes of J. K. Rowling’s heroes and heroines from the Harry Potter series provide popular examples of precisely these points about breaking rules. Harry Potter, Ron Weasley, and Hermione Granger are typified in large part by the willingness and ability to follow rules (indicating their expertise as apprentices) and, simultaneously, to break them.
Have respect for the rules, except when following those rules means disregarding or ignoring or remaining blind to an even higher moral principle.
An ancient document known as the Codex Bezae,[12] a noncanonical variant of part of the New Testament, offers an interpolation just after the section of the Gospel of Luke presented above, shedding profound light on the same issue. It offers deeper insight into the complex and paradoxical relationship between respect for the rules and creative moral action that is necessary and desirable, despite manifesting itself in apparent opposition to those rules. It contains an account of Christ addressing someone who, like Him, has broken a sacred rule: “On that same day, observing one working on the Sabbath, [Jesus] said to him, O Man, if indeed thou knowest what thou doest, thou art blest; but if thou knowest not, thou art accursed, and a transgressor of the Law.” What does this statement mean? It sums up the meaning of Rule I perfectly. If you understand the rules—their necessity, their sacredness, the chaos they keep at bay, how they unite the communities that follow them, the price paid for their establishment, and the danger of breaking them—but you are willing to fully shoulder the responsibility of making an exception, because you see that as serving a higher good (and if you are a person with sufficient character to manage that distinction), then you have served the spirit, rather than the mere law, and that is an elevated moral act. But if you refuse to realize the importance of the rules you are violating and act out of self-centered convenience, then you are appropriately and inevitably damned. The carelessness you exhibit with regard to your own tradition will undo you and perhaps those around you fully and painfully across time.
RULE II: Imagine who you could be, and then aim single-mindedly at that.
Main theme: Psychology. Subtheme: Personal Development.
1. How to Act
People exchange information about how to act in many ways. They observe each other and imitate what they see. When they imitate, they use their bodies to represent the bodies of others. But this imitation is not mindless, automatized mimicry. It is instead the ability to identify regularities or patterns in the behavior of other people, and then to imitate those patterns. When a young girl plays at being a mother, for example, she does not duplicate, gesture for gesture, what she has previously observed of her mother’s actions. Instead she acts “as if” she were a mother. If you ask the girl what she is doing, she will tell you that she is pretending to be a mother, but if you get her to describe what that means, particularly if she is a young child, her description will be far less complete than her actions would indicate. This means she can act out more than she can say—just as we all can. If you observed many little girls, acting out many mothers, you could derive a very good idea of what “mother” meant, in its purest form, even if you had never seen an actual mother. If you were good with words, then perhaps you could describe the essential elements of maternal behavior and transmit them. You might do that best in the form of a story. It is easier and more direct to represent a behavioral pattern with behavior than with words.
2. Aim at something.
Pick the best target you can currently conceptualize. Stumble toward it. Notice your errors and misconceptions along the way, face them, and correct them. Get your story straight. Past, present, future—they all matter. You need to map your path. You need to know where you were, so that you do not repeat the mistakes of the past. You need to know where you are, or you will not be able to draw a line from your starting point to your destination. You need to know where you are going, or you will drown in uncertainty, unpredictability, and chaos, and starve for hope and inspiration. For better or worse, you are on a journey. You are having an adventure—and your map better be accurate. Voluntarily confront what stands in your way. The way—that is the path of life, the meaningful path of life, the straight and narrow path that constitutes the very border between order and chaos, and the traversing of which brings them into balance. Aim at something profound and noble and lofty. If you can find a better path along the way, once you have started moving forward, then switch course. Be careful, though; it is not easy to discriminate between changing paths and simply giving up. (One hint: if the new path you see forward, after learning what you needed to learn along your current way, appears more challenging, then you can be reasonably sure that you are not deluding or betraying yourself when you change your mind.) In this manner, you will zigzag forward. It is not the most efficient way to travel, but there is no real alternative, given that your goals will inevitably change while you pursue them, as you learn what you need to learn while you are disciplining yourself. You will then find yourself turning across time, incrementally and gracefully, to aim ever more accurately at that tiny pinpoint, the X that marks the spot, the bull’s-eye, and the center of the cross; to aim at the highest value of which you can conceive.
You will pursue a target that is both moving and receding: moving, because you do not have the wisdom to aim in the proper direction when you first take aim; receding, because no matter how close you come to perfecting what you are currently practicing, new vistas of possible perfection will open up in front of you. Discipline and transformation will nonetheless lead you inexorably forward. With will and luck, you will find a story that is meaningful and productive, improves itself with time, and perhaps even provides you with more than a few moments of satisfaction and joy. With will and luck, you will be the hero of that story, the disciplined sojourner, the creative transformer, and the benefactor of your family and broader society. Imagine who you could be, and then aim single-mindedly at that.
RULE III: Do not hide unwanted things in the fog.
Those Damned Plates
I love my father-in-law. I respect him, too. He is extremely stable emotionally—one of those tough or fortunate people (perhaps a little of both) who can let the trials and tribulations of life roll off him and keep moving forward with little complaint and plenty of competence. He is an old guy now, Dell Roberts—eighty-eight. He has had a knee replaced, and is planning to get the remaining one done. He has had stents inserted into his coronary arteries and a heart valve replaced. He suffers from drop foot and sometimes slips and falls because of it. But he was still curling a year ago, pushing the heavy granite rock down the ice with a stick specifically designed for people who can no longer crouch down as easily as they once could. When his wife, Beth, now deceased, developed dementia at a relatively young age, he took care of her in as uncomplaining and unresentful a manner as anyone could imagine. It was impressive. I am by no means convinced that I could have fared as well. He cared for her right to the point where it became impossible for him to lift her out of whatever chair she had settled into. This was long after she had lost the ability to speak. But it was obvious by the way her eyes lit up when he entered the room that she still loved him—and the feeling was mutual. I would not describe him as someone who is prone to avoidance when the going gets tough. Quite the contrary. When Dell was a much younger man, he was for several decades a real estate dealer in Fairview, Alberta—the small town where I grew up (we lived right across the street from the Roberts family, in fact). During that time, he habitually went home for lunch, in accordance with the general custom. Beth typically prepared him soup (probably Campbell’s, which everyone ate at that time—“M’m! M’m! Good!”), and a sandwich. One day, without warning, he snapped at his wife: “Why in the world do we always eat off these tiny plates? I hate eating off these tiny plates!” She had been serving the sandwiches on bread-and-butter plates, which average about six or seven inches in diameter, instead of full-size dinner plates of ten to twelve inches. She related this story to her daughters, soon after, in a state of mild shock. This story has been retold to much laughter at family gatherings many times since. After all, she had been serving him lunch on those plates for at least twenty years by the time he finally said anything. She had no idea that he was annoyed by her table settings. He had never objected. And there is something inexhaustibly amusing about that. Now, it is possible that he was irritated by something else altogether that day and did not really care about the plates. And in one sense, it is a trivial issue. But seen another way, it is not trivial at all, for two reasons. First, if something happens every day, it is important, and lunch was happening every day. In consequence, if there was something about it that was chronically bothersome, even in a minor sort of way, it needed to be attended to. Second, it is very common to allow so-called minor irritations (which are not minor, as I said, if they happen constantly) to continue for years without comment or resolution. Here is the problem: Collect a hundred, or a thousand, of those, and your life is miserable and your marriage doomed. Do not pretend you are happy with something if you are not, and if a reasonable solution might, in principle, be negotiated. Have the damn fight. Unpleasant as that might be in the moment, it is one less straw on the camel’s back. 
And that is particularly true for those daily events that everyone is prone to regard as trivial—even the plates on which you eat your lunch. Life is what repeats, and it is worth getting what repeats right.
Just Not Worth the Fight
Here is a more serious story of the same type. I had a client who had come to see me about her plans to move to private practice after many years as an accountant with a large corporation. She was well respected in her profession, and was a competent, kind, and careful person. But she was also very unhappy. I presumed initially that her unhappiness stemmed from anxiety about her career transition. But she managed that move without a hitch during the time we continued our sessions, while other issues rose to the forefront. Her problem was not her career change. It was her marriage. She described her husband as extraordinarily self-centered and simultaneously overly concerned with how he appeared in the eyes of others. It was a contradictory combination, in some manner, although it is common enough to see this touching of opposites in a personality: If you lean too far in one direction, something else in you leans equally far in the other. So, despite the husband’s narcissism (at least from his wife’s perspective), he was in thrall to the opinions of everyone he met—excepting the members of his own family. He also drank too much—a habit which exaggerated his temperamental defects. My client was not comfortable in her own home. She did not feel there was anything truly of her within the apartment she shared with her husband (the couple had no children). Her situation provided a good example of how what is outside can profoundly reflect what is inside (which is why I suggest to people who are in psychological trouble that they might begin their recovery by cleaning up—and then beautifying, if possible—their rooms). All their household furnishings, which she described as showy, ornate, and uncomfortable, had been chosen by her husband. Furthermore, he avidly collected 1960s and ’70s pop art, and the walls of the house were crowded with these items, which he had spent time seeking out in galleries and otherwise gathering for many years, often while she sat waiting outside in the car. She told me that she did not care about the furnishings and the excess of decorative objects, but that was not really true. What was true was that she did not care for them—not a bit. Neither the showiness nor the furnishings nor the plethora of art works that made up her husband’s collection appealed to her taste. She tended toward a minimalist aesthetic (or perhaps that preference was a consequence of her husband’s decorative excesses). It was never quite clear what she might have preferred, and perhaps that was part of the problem: because she did not know what she liked (and was equally vague about her dislikes), she was not in the strongest position to put forward her own opinions. It is difficult to win an argument, or even begin one, if you have not carefully articulated what you want (or do not) and need (or do not). However, she certainly did not enjoy feeling like a stranger in her own home. For that reason, she never had friends over to visit, which was also a nontrivial problem, contributing as it did to her feelings of isolation. But the furnishings and paintings continued to accrue, one shopping expedition at a time, in Canada and abroad, and with each purchase there was less of her in the house and in the marriage, and increasingly more of her husband. Nonetheless, my client never went to war. She never had a fit of anger. She never put her fist through a particularly objectionable canvas hanging on the living room wall.
In all the decades of her married life, she never had an outburst of genuine rage; she never directly and conclusively confronted the fact that she hated her home and her subordination to her husband’s taste. Instead, she let him have his way, repeatedly, increment by increment, because she claimed that such trivialities were not worth fighting for. And with each defeat, the next disagreement became more necessary—although less likely, because she understood that a serious discussion, once initiated, risked expanding to include all the things that were troublesome about her marriage, and that a real, no-holds-barred battle would therefore likely ensue. Then everything wrong might spill out and have to be faced and dealt with, by one means or another. So, she kept silent. But she was chronically repressed and constantly resentful, and felt that she had wasted much of the opportunity of her life. It is a mistake to consider the furnishings and the pop art paintings as simple material objects. They were more truly and importantly containers of information, so to speak, about the state of the marriage, and were certainly experienced as such by my client. Every single object of art was the concrete realization of a victory (Pyrrhic though it may have been) and a defeat (or, at least, a negotiation that did not occur and, therefore, a fight that was over before it started). And there were dozens or perhaps hundreds of these: each a weapon in an unspoken, destructive, and decades-long war. Unsurprisingly, given the circumstances, the couple split up—after thirty years of marriage. I believe the husband retained all the furniture and art. Here is a thought, a terrifying and dispiriting thought, to motivate improvement in your marriage—to scare you into the appalling difficulties of true negotiation. Every little problem you have every morning, afternoon, or evening with your spouse will be repeated for each of the fifteen thousand days that will make up a forty-year marriage. Every trivial but chronic disagreement about cooking, dishes, housecleaning, responsibility for finances, or frequency of intimate contact will be duplicated, over and over, unless you successfully address it. Perhaps you think (moment to moment, at least) that it is best to avoid confrontation and drift along in apparent but false peace. Make no mistake about it, however: you age as you drift, just as rapidly as you age as you strive. But you have no direction when you drift, and the probability that you will obtain what you need and want by drifting aimlessly is very low. Things fall apart of their own accord, but the sins of men speed their deterioration: that is wisdom from the ages. It may well be that conscious apprehension of the horror of the same small hell forever repeated is precisely what is necessary to force you to confront the problems in your marriage and negotiate in good and desperate faith to solve them. However, it is the easiest of matters, particularly in the short term, to ignore the prick of conscience and let the small defeats slide, day after day. This is not a good strategy. Only careful aim and wakeful striving and commitment can eliminate the oft-incremental calamity of willful blindness, stem the entropic tide, and keep catastrophe—familial and social alike—at bay.
RULE IV: Note that opportunity lurks where responsibility has been abdicated.
Make yourself invaluable.
In my dual role as clinical psychologist and professor, I have coached many people in the development of their careers. Sometimes those I am coaching consult me because their coworkers, subordinates, or bosses will not do their jobs properly. They are supervised by, working alongside, or managing people who are narcissistic, incompetent, malevolent, or tyrannical. Such things happen and must be dealt with in whatever reasonable manner will bring them to a halt. I do not encourage people to martyr themselves. It is a bad idea to sacrifice yourself uncomplainingly so that someone else can take the credit. Nonetheless, under such circumstances—if you are a wise and attentive person—you might still notice that your unproductive coworkers are leaving a plethora of valuable tasks undone. You might then ask yourself, “What would happen if I took responsibility for doing them?” It is a daunting question. What is left undone is often risky, difficult, and necessary. But that also means—does it not?—that it is worthwhile and significant. And you may have the eyes to see that there is a problem, despite your all-too-frequent blindness. How do you know that it is not, therefore, your problem? Why do you notice this issue and not some other? This is a question worth considering in depth. If you want to become invaluable in a workplace—in any community—just do the useful things no one else is doing. Arrive earlier and leave later than your compatriots (but do not deny yourself your life).[27] Organize what you can see is dangerously disorganized. Work, when you are working, instead of looking like you are working. And finally, learn more about the business—or your competitors—than you already know. Doing so will make you invaluable—a veritable lynchpin. People will notice that and begin to appreciate your hard-earned merits. You might object, “Well, I just could not manage to take on something that important.” What if you began to build yourself into a person who could? You could start by trying to solve a small problem—something that is bothering you, that you think you could fix. You could start by confronting a dragon of just the size that you are likely to defeat. A tiny serpent might not have had the time to hoard a lot of gold, but there might still be some treasure to be won, along with a reasonable probability of succeeding in such a quest (and not too much chance of a fiery or toothsome death). Under reasonable circumstances, picking up the excess responsibility is an opportunity to become truly invaluable. And then, if you want to negotiate for a raise, or more autonomy—or more free time, for that matter—you can go to your boss and say, “Here are ten things that were crying out to be done, each of them vital, and I am now doing all of them. If you help me out a bit, I will continue. I might even improve. And everything, including your life, will improve along with me.” And then, if your boss has any sense—and sometimes bosses do—then your negotiation will be successful. That is how such things work. And do not forget that there is no shortage of genuinely good people who are thrilled if they can give someone useful and trustworthy a hand up. It is one of the truly altruistic pleasures of life, and its depth is not to be underestimated, or to be disregarded with the cheap cynicism that masks itself as world-weary wisdom. It appears that the meaning that most effectively sustains life is to be found in the adoption of responsibility. 
When people look back on what they have accomplished, they think, if they are fortunate: “Well, I did that, and it was valuable. It was not easy. But it was worth it.” It is a strange and paradoxical fact that there is a reciprocal relationship between the worth of something and the difficulty of accomplishing it. Imagine the following conversation: “Do you want difficulty?” “No, I want ease.” “In your experience, has doing something easy been worthwhile?” “Well, no, not very often.” “Then perhaps you really want something difficult.” I think that is the secret to the reason for Being itself: difficult is necessary. It is for this reason that we voluntarily and happily place limitations on ourselves. Every time we play a game, for example, we accept a set of arbitrary restrictions. We narrow and limit ourselves, and explore the possibilities thereby revealed. That is what makes the game. But it does not work without the arbitrary rules. You take them on voluntarily, absurdly, as in chess: “I can only move this knight in an L. How ridiculous. But how fun!” Because it is not fun, oddly enough, if you can move any piece anywhere. It is not a game anymore if you can make any old move at all. Accept some limitations, however, and the game begins. Accept them, more broadly speaking, as a necessary part of Being and a desirable part of life. Assume you can transcend them by accepting them. And then you can play the limited game properly. And this is all not merely of psychological import, and it is by no means just a game. People need meaning, but problems also need solving. It is very salutary, from the psychological perspective, to find something of significance—something worth sacrificing for (or to), something worth confronting and taking on. But the suffering and malevolence that characterize life are real, with the terrible consequences of the real—and our ability to solve problems, by confronting them and taking them on, is also real. By taking responsibility, we can find a meaningful path, improve our personal lot psychologically, and make what is intolerably wrong genuinely better. Thus, we can have our cake and eat it, too.
RULE V: Do not do what you hate.
Pathological Order in its Day-to-Day Guise
I once had a client who was subject to a barrage of constant idiocy as part of her work in a giant corporation. She was a sensible, honest person who had withstood and managed a difficult life and who genuinely wished to contribute and work in a manner commensurate with her good sense and honesty. She became subject while employed in the corporate environment to a long, in-person and email-mediated dispute about whether the term “flip chart” (a common phrase, referring to a large pad of paper sheets, typically supported by a tripod) was in fact a term of abuse. For those of you who still find it difficult to believe that conversations such as this occupy the hours of corporate workers, try a quick Google search. “Flip chart derogatory” will suffice. You will see immediately that concern about this issue genuinely and rather widely exists. Many meetings were held by her superiors at work to discuss this issue. “Flip” was apparently at one time a derogatory term for Filipino (I could find little evidence for its use now). Even though the former slur has nothing whatsoever to do with “flip chart,” the administrators of her firm felt that their time was well spent discussing the hypothetically prejudicial nature of the phrase and formulating a replacement term, the use of which eventually became mandatory among employees. This was all despite the fact that no employee of Filipino nationality or descent had ever complained about the corporation’s use of the term. According to the Global Language Monitor (languagemonitor.com), which monitors but does not approve politically correct word usage, the proper term is now “writing block,” despite the fact that a flip chart is in no way a “block.” In any case, the corporation in question settled on “easel pad,” which seems somewhat more descriptively accurate—not that this comparatively elegant solution detracts from the foolishness in question. After all, we are still left with “flip-flopped,” “flippant,” “flip-flops,” “flippers,” and so on, and at least the first two of those sound more derogatory on first exposure than “flip chart,” if we are going to concern ourselves with such things. Now, you might wonder: “What difference does this minor change in terminology really make? It is a trivial problem. Why would someone become concerned about the fact that such change is being discussed? Why not ignore it, as it is best to ignore so much folly, and concentrate on something of more importance?” Because, of course, you could claim that paying attention to someone attending to such issues is as much a waste of time as attending to the discussion in the first place. And I would say that is precisely the conundrum Rule V is trying to address. When do you stop participating in a worrisome process that you see, or think you see, unfolding in front of you? My client first wrote me about the fact that not only was the string of communication discussing the use of “flip chart” well received by her coworkers, but that a contest of sorts immediately emerged to identify and communicate additional words that might also be offensive. “Blackboard” was mentioned, as was “master key” (the former perhaps because referring to anything as “black”—even if it is black—is somehow racist in our hypersensitive times; the latter because of its hypothetical relationship to terminology historically associated with slavery).
My client tried to make sense of what she was witnessing: “Such discussions give people the superficial sense of being good, noble, compassionate, openhearted, and wise. So, if for the sake of argument anyone disagrees, how could that person join the discussion without being considered anticompassionate, narrow-minded, racist, and wicked?” She was also perturbed because no one at her workplace was apparently bothered that any given group of people might endow themselves with the authority to ban words (and to disdain or even discipline those who continued to use them) without perceiving any ethical overreach on their part, and without perceiving the danger of such censorship, which could easily extend, say, to personal opinions, topics of conversation—or, for that matter, books. Finally, she believed that the entire discussion constituted a prime example of “diversity,” “inclusivity,” and “equity”—terms that had become veritable mantras for the departments of Human Resources or Learning and Development (the latter of which she worked for). She regarded them as “engines of corporate indoctrination and ideological propaganda” and as part of the manner in which the political correctness that characterizes, above all, many university programs extends its reach into the broader culture. More importantly, however, she asked me in one of her letters, “Is this a case where enough is enough? When and where do we stop? If a tiny minority of people even hypothetically finds some words offensive, then what? Do we continue to ban words endlessly?” What my client was perceiving—at least as far as she was concerned—was not a single event, hypothetically capable of heading those involved in it down a dangerous path, but a clearly identifiable and causally related variety or sequence of events, all heading in the same direction. Those events seemed to form a coherent pattern, associated with an ideology that was directional in its intent, explicitly and implicitly. Furthermore, the effect of that directionality had been manifesting itself, by all appearances, for a reasonable amount of time, not only in the corporate world my client inhabited, but in the broader world of social and political institutions surrounding the corporation for which she worked. Although rather isolated in the department she happened to work in (the very epicenter of the ideological blitz of the corporation in question), she could see around her evidence that the processes disturbing her were also having a detrimental effect on other people. And then there was the effect on her conscience. It is important to understand that these issues were not minor philosophical concepts to her. They were bothering her deeply and upsetting her life. It is, of course, the case that being required to do stupid, hateful things is demoralizing. Someone assigned a pointless or even counterproductive task will deflate, if they have any sense, and find within themselves very little motivation to carry out the assignment. Why? Because every fiber of their genuine being fights against that necessity. We do the things we do because we think those things important, compared to all the other things that could be important. We regard what we value as worthy of sacrifice and pursuit. That worthiness motivates us to act, despite the fact that action is difficult and dangerous.
When we are called upon to do things that we find hateful and stupid, we are simultaneously forced to act contrary to the structure of values motivating us to move forward stalwartly and protecting us from dissolution into confusion and terror. “To thine own self be true,”[33] as Polonius has it, in Shakespeare’s Hamlet. That “self”—that integrated psyche—is in truth the ark that shelters us when the storms gather and the water rises. To act in violation of its precepts—its fundamental beliefs—is to run our own ship onto the shoals of destruction. To act in violation of the precepts of that fundamental self is to cheat in the game we play with ourselves, to suffer the emptiness of betrayal, and to perceive abstractly and then experience in embodied form the loss that is inevitably to come. What price did my client pay for her initial subjugation to the arbitrary dictates of her managers? She was an immigrant from a former Soviet bloc country and had experienced more than a sufficient taste of authoritarian ideology. In consequence, her inability to determine how she might object to what was happening left her feeling both weak and complicit. Furthermore, no sensible person could possibly remain motivated to put forth effort anywhere such as her workplace had become, where absurdities of a conceptual sort were not only continually occurring but encouraged or, even worse, required. Such “action” makes a mockery of productive work itself—even the very idea of productive work (and that is in fact part of the true motivation for such behavior: those jealous of genuine competence and productivity have every reason to undermine and denigrate even the concept of both). So, what did she do about the demoralizing state in which she found herself? My client did not feel sufficiently confident in her position or in the ability of her managers to engage in a genuine conversation with them about her objections, although it was clear from my conversations with her that she wished very much to escape from the situation. In consequence, she began to develop what might be considered a rearguard action. She was already involved in developing in-house education projects for the company, as we mentioned. It was possible for her, therefore, to begin to branch out, offering her services as a speaker at a variety of corporate conferences. Although she never directly confronted the flip chart issue (and may have been wise to avoid doing so), she began to speak out against the kind of pseudoscience that characterizes many of the ideas that corporate managers, particularly in Human Resources departments, regard as valid. She presented a number of talks, for example, criticizing the widespread fad of “learning styles”—a theory predicated on the idea that there are between four and eight different modalities that individuals prefer and that aid them if used when they are trying to master new ideas. These include, for example, visual, auditory, verbal, physical, and logical, among others. The problem with the learning styles theory? Most basically: there is simply no evidence whatsoever for its validity. 
First, although students may express a preference for information being delivered in one form over another, practically delivering it in that form does not improve their academic performance.[34] Second (and this makes sense, given the first problem), there is no evidence that teachers can accurately assess the “learning style” of their students.[35] So, although it was not possible for my client to directly confront the particular foolishness that was disturbing her, after long strategizing and much work she did manage to push back very effectively against the ignorance that characterized what passed for psychological knowledge among a substantial subset of her coworkers (as well as those who worked in other companies, where the same things were taking place). She had also done some work as a journalist for one of the major newspapers in Albania, her country of origin, and began to make continuing to do so a higher priority. This did not pay well, but she developed a stellar professional reputation there, and fought hard in print for what she believed in, warning the citizens of her once-Communist-dominated state of the move toward totalitarian opinion beginning to make itself attractive to people in the West. What price did she pay for her decision to stand up and fight? To begin with, she had to face her fear of reprisal, as well as the fact that such fear—in combination with the profound distaste she felt for the ideological maneuvers characterizing her workplace—was destroying her interest in her office profession, as well as making her feel inadequate and cowardly. Then, she had to broaden her professional activities: first, taking the risk of offering herself as a speaker at corporate conventions (and people are generally very loath to talk publicly—it is a common fear, often severe enough to interfere with career progression[36]); second, mastering the literature, enabling her to speak in a credible and informed manner; and third, presenting material that, given its critical nature, was bound to offend a reasonable proportion of those in the audience (precisely those who had accepted and who were propagating the theories that she was now discrediting). This all meant the facing of her fear—of inaction, as well as action. These moves challenged her deeply—but the consequence was an expansion of personality and competence, as well as the knowledge that she was making a genuine social contribution. I believe that the good that people do, small though it may appear, has more to do with the good that manifests broadly in the world than people think, and I believe the same about evil. We are each more responsible for the state of the world than we believe, or would feel comfortable believing. Without careful attention, culture itself tilts toward corruption. Tyranny grows slowly, and asks us to retreat in comparatively tiny steps. But each retreat increases the possibility of the next retreat. Each betrayal of conscience, each act of silence (despite the resentment we feel when silenced), and each rationalization weakens resistance and increases the probability of the next restrictive move forward. This is particularly the case when those pushing forward delight in the power they have now acquired—and such people are always to be found. Better to stand forward, awake, when the costs are relatively low—and, perhaps, when the potential rewards have not yet vanished. Better to stand forward before the ability to do so has been irretrievably compromised. 
Unfortunately, people often act in spite of their conscience—even if they know it—and hell tends to arrive step by step, one betrayal after another. And it should be remembered that it is rare for people to stand up against what they know to be wrong even when the consequences for doing so are comparatively slight. And this is something to deeply consider, if you are concerned with leading a moral and careful life: if you do not object when the transgressions against your conscience are minor, why presume that you will not willfully participate when the transgressions get truly out of hand? Part of moving Beyond Order is knowing when you have such a reason to refuse. Part of moving Beyond Order is understanding that your conscience has a primary claim on your action, which supersedes your conventional social duty. If you decide to stand up and refuse a command, if you do something of which others disapprove but you firmly believe to be correct, you must be in a position to trust yourself. This means that you must have attempted to live an honest, meaningful, productive life (of precisely the sort that might characterize someone else you would tend to trust). If you have acted honorably, so that you are a trustworthy person, it will be your decision to refuse to comply or to act in a manner contrary to public expectation that will help society itself maintain its footing. By doing so you can be part of the force of truth that brings corruption and tyranny to a halt. The sovereign individual, awake and attending to his or her conscience, is the force that prevents the group, as the necessary structure guiding normative social relations, from becoming blind and deadly. I do not want to end this section on a falsely optimistic note. I know from further correspondence with my client that she shifted her employment from one large organization to another several times in the years that followed. In one case, she found a good position, where it was possible to engage in productive, sensible, meaningful work. However, although successful there, she was laid off during a corporate reorganization, and has since found the other companies she has worked for to be as thoroughly possessed by the current linguistic and identity-politics fads as her original place of employment. Some dragons are everywhere, and they are not easy to defeat. But her attempts to fight back—her work debunking pseudoscientific theories; her work as a journalist—helped buttress her against depression and bolster her self-regard.
Fortify your position.
When culture disintegrates—because it refuses to be aware of its own pathology; because the visionary hero is absent—it descends into the chaos that underlies everything. Under such conditions, the individual can dive voluntarily as deeply as he or she dares into the depths and rediscover the eternal principles renewing vision and life. The alternative is despair, corruption, and nihilism—thoughtless subjugation to the false words of totalitarian utopianism and life as a miserable, lying, and resentful slave. If you wish instead to be engaged in a great enterprise—even if you regard yourself as a mere cog—you are required not to do things you hate. You must fortify your position, regardless of its meanness and littleness, confront the organizational mendacity undermining your spirit, face the chaos that ensues, rescue your near-dead father from the depths, and live a genuine and truthful life. Otherwise, nature hides her face, society stultifies, and you remain a marionette, with your strings pulled by demonic forces operating behind the scenes—and one more thing: it is your fault. No one is destined in the deterministic sense to remain a puppet. We are not helpless. Even in the rubble of the most broken-down lives, useful weapons might still be found. Likewise, even the giant most formidable in appearance may not be as omnipotent as it proclaims or appears. Allow for the possibility that you may be able to fight back; that you may be able to resist and maintain your soul—and perhaps even your job. (But a better job may also beckon if you can tolerate the idea of the transformation.) If you are willing to conceptualize yourself as someone who could—and, perhaps more importantly, should—stand fast, you may begin to perceive the weapons at your disposal. If what you are doing is causing you to lash out at others impulsively; if what you are doing is destroying your motivation to move forward; if your actions and inactions are making you contemptuous of yourself and, worse, of the world; if the manner in which you conduct your life is making it difficult for you to wake happily in the morning; if you are plagued by a deep sense of self-betrayal—perhaps you are choosing to ignore that still small voice, inclined as you may be to consider it something only attended to by the weak and naive. If you are at work, and called upon to do what makes you contemptuous of yourself—weak and ashamed, likely to lash out at those you love, unwilling to perform productively, and sick of your life—it is possible that it is time to meditate, consider, strategize, and place yourself in a position where you are capable of saying no.[32] Perhaps you will garner additional respect from the people you are opposing on moral grounds, even though you may still pay a high price for your actions. Perhaps they will even come to rethink their stance—if not now, with time (as their own consciences might be plaguing them in that same still small manner).
Practicalities
Perhaps you should also be positioning yourself for a lateral move—into another job, for example, noting as you may, “This occupation is deadening my soul, and that is truly not for me. It is time to take the painstaking steps necessary to organize my CV, and to engage in the difficult, demanding, and often unrewarding search for a new job” (but you have to be successful only once). Maybe you can find something that pays better and is more interesting, and where you are working with people who not only fail to kill your spirit, but positively rejuvenate it. Maybe following the dictates of conscience is in fact the best possible plan that you have—at minimum; otherwise, you have to live with your sense of self-betrayal and the knowledge that you put up with what you truly could not tolerate. Nothing about that is good. I might get fired. Well, prepare now to seek out and ready yourself for another job, hopefully better (or prepare yourself to go over your manager’s head with a well-prepared and articulate argument). And do not begin by presuming that leaving your job, even involuntarily, is necessarily for the worst. I am afraid to move. Well, of course you are, but afraid compared to what? Afraid in comparison to continuing in a job where the center of your being is at stake; where you become weaker, more contemptible, more bitter, and more prone to pressure and tyranny over the years? There are few choices in life where there is no risk on either side, and it is often necessary to contemplate the risks of staying as thoroughly as the risks of moving. I have seen many people move, sometimes after several years of strategizing, and end up in better shape, psychologically and pragmatically, after their time in the desert. Perhaps no one else would want me. Well, the rejection rate for new job applications is extraordinarily high. I tell my clients to assume 50:1, so their expectations are set properly. You are going to be passed over, in many cases, for many positions for which you are qualified. But that is rarely personal. It is, instead, a condition of existence, an inevitable consequence of somewhat arbitrary subjection to the ambivalent conditions of worth characterizing society. It is the consequence of the fact that CVs are easy to disseminate and difficult to process; that many jobs have unannounced internal candidates (and so are just going through the motions); and that some companies keep a rolling stock of applicants, in case they need to hire quickly. That is an actuarial problem, a statistical problem, a baseline problem—and not necessarily an indication that there is something specifically flawed about you. You must incorporate all that sustainingly pessimistic realism into your expectations, so that you do not become unreasonably downhearted. One hundred and fifty applications, carefully chosen; three to five interviews thereby acquired. That could be a mission of a year or more. That is much less than a lifetime of misery and downward trajectory. But it is not nothing. You need to fortify yourself for it, plan, and garner support from people who understand what you are up to and are realistically apprised of the difficulty and the options.
[32] Perhaps not just once, because that makes your reaction too impulsive; perhaps not just twice, because that still may not constitute sufficient evidence to risk undertaking what might be a genuine war; but definitively three times, when a pattern has been clearly established.
Now it may also be that you are lagging in the development of your skills and could improve your performance at work so that your chances of being hired elsewhere are heightened. But there is no loss in that. You cannot effectively pronounce “no” in the presence of corrupt power when your options to move are nonexistent. In consequence, you have a moral obligation to place yourself in a position of comparative strength, and to do then what is necessary to capitalize on that strength. You may also have to think through worst-case situations and to discuss them with those who will be affected by your decisions. But it is once again worth realizing that staying where you should not be may be the true worst-case situation: one that drags on and kills you slowly over decades. That is not a good death, even though it is slow, and there is very little in it that does not speak of the hopelessness that makes people age quickly and long for the cessation of career and, worse, life. That is no improvement. As the old and cruel cliché goes: If you must cut off a cat’s tail, do not do it half an inch at a time. You may well be in for a few painful years of belated recognition of insufficiency, and required to send out four or five or ten job applications a week, knowing full well that the majority will be rejected with less than a second look. But you need to win the lottery only once, and a few years of difficulty with hope beat an entire dejected lifetime of a degenerating and oppressed career. And let us be clear: It is not a simple matter of hating your job because it requires you to wake up too early in the morning, or to drag yourself to work when it is too hot or cold or windy or dry or when you are feeling low and want to curl up in bed. It is not a matter of frustration generated when you are called on to do things that are menial but necessary, such as emptying garbage cans, sweeping floors, cleaning bathrooms, or in any other manner taking your lowly but well-deserved place at the bottom of the hierarchy of competence—even of seniority. Resentment generated by such necessary work is most often merely ingratitude, inability to accept a lowly place at the beginning, unwillingness to adopt the position of the fool, or arrogance and lack of discipline. Refusal of the call of conscience is by no means the same thing as irritation about undesirably low status. That rejection—that betrayal of soul—is truly submission to the requirement to perform demonstrably counterproductive, absurd, or pointless work; to treat others unjustly and to lie about it; to engage in deceit, to betray your future self; to put up with unnecessary torture and abuse (and to silently watch others suffer the same treatment). That rejection is the turning of a blind eye, and the agreement to say and do things that betray your deepest values and make you a cheat at your own game. And there is no doubt that the road to hell, personally and socially, is paved not so much with good intentions as with the adoption of attitudes and undertaking of actions that inescapably disturb your conscience. Do not do what you hate.
RULE VI: Abandon ideology.
Main theme: Psychology. Subtheme: Politics. Chapter with commentary on ideologies such as socialism, communism, and national socialism, and on the ideologies of figures such as Karl Marx, Vladimir Ilyich Ulyanov (better known as Vladimir Lenin), and Sigmund Freud.
RULE VII: Work as hard as you possibly can on at least one thing and see what happens.
The Value of Heat and Pressure
When coal is subjected to intense heat and pressure, far below the Earth’s surface, its atoms rearrange themselves into the perfect repeating crystalline alignment characterizing a diamond. The carbon that makes up coal also becomes maximally durable in its diamond form (as diamond is the hardest of all substances). Finally, it becomes capable of reflecting light. This combination of durability and glitter gives a diamond the qualities that justify its use as a symbol of value. That which is valuable is pure, properly aligned, and glitters with light—and this is true for the person just as it is for the gem. Light, of course, signifies the shining brilliance of heightened and focused consciousness. Human beings are conscious during the day, when it is light. Much of that consciousness is visual and therefore dependent on light. To be illumined or enlightened is to be exceptionally awake and aware—to attain a state of being commonly associated with divinity. To wear a diamond is to become associated with the radiance of the Sun, like the king or queen whose profile is stamped on the sunlike disc of the gold coin, a near-universal standard of worth. Heat and pressure transform the base matter of common coal into the crystalline perfection and rare value of the diamond. The same can be said of a person. We know that the multiple forces operating in the human soul are often not aligned with one another. We do the things we wish we would not do and do not do the things we know we should do. We want to be thin, but we sit on the couch eating Cheetos and despairing. We are directionless, confused, and paralyzed by indecision. We are pulled in all directions by temptations, despite our stated will, and we waste time, procrastinate, and feel terrible about it, but we do not change. It was for such reasons that archaic people found it easy to believe that the human soul was haunted by ghosts—possessed by ancestral spirits, demons, and gods—none of whom necessarily had the best interests of the person at heart. Since the time of the psychoanalysts, these contrary forces, these obsessive and sometimes malevolent spirits, have been conceptualized psychologically as impulses, emotions, or motivational states—or as complexes, which act like partial personalities united within the person by memory but not by intent. Our neurological structure is indeed hierarchical. The powerful instinctual servants at the bottom, governing thirst, hunger, rage, sadness, elation, and lust, can easily ascend and become our masters, and just as easily wage war with one another. The resilience and strength of a united spirit is not easy to attain. A house divided against itself, proverbially, cannot stand. Likewise, a poorly integrated person cannot hold himself together when challenged. He loses union at the highest level of psychological organization. He loses the properly balanced admixture of properties that is another feature of the well-tempered soul, and cannot hold his self together. We know this when we say “He lost it” or “He just fell apart.” Before he picks up the pieces and rearranges them, such a person is likely to fall prey to domination by one or more partial personalities. This might be a spirit of rage, or anxiety, or pain, leaping in to occupy the person when his temper is lost. You can see this occurring most clearly in the case of a two-year-old having a tantrum. He has lost himself temporarily, and is for the moment pure emotion. 
This is an occurrence that is often deeply upsetting to the two-year-old himself, and one of an intensity that would be terrifying to beholders if manifested by an adult. The archaic motivational systems governing anger merely push the toddler’s developing personality aside, and have their way with his mind and actions. This is a true and unfortunate defeat for the still-fragile centralizing ego, struggling against powerful forces toward psychological and social integration. Lack of internal union also makes itself known in the increased suffering, magnification of anxiety, absence of motivation, and lack of pleasure that accompany indecision and uncertainty. The inability to decide among ten things, even when they are desirable, is equivalent to torment by all of them. Without clear, well-defined, and noncontradictory goals, the sense of positive engagement that makes life worthwhile is very difficult to obtain. Clear goals limit and simplify the world, as well, reducing uncertainty, anxiety, shame, and the self-devouring physiological forces unleashed by stress. The poorly integrated person is thus volatile and directionless—and this is only the beginning. Sufficient volatility and lack of direction can rapidly conspire to produce the helplessness and depression characteristic of prolonged futility. This is not merely a psychological state. The physical consequences of depression, often preceded by excess secretion of the stress hormone cortisol, are essentially indistinguishable from rapid aging (weight gain, cardiovascular problems, diabetes, cancer, and Alzheimer’s). The social consequences are just as serious as the biological. A person who is not well put together overreacts to the slightest hint of frustration or failure. He cannot enter into productive negotiations, even with himself, because he cannot tolerate the uncertainty of discussing potential alternative futures. He cannot be pleased, because he cannot get what he wants, and he cannot get what he wants because he will not choose one thing instead of another. He can also be brought to a halt by the weakest of arguments. One of his multiple, warring subpersonalities will latch on to such arguments, often contrary to his best interest, and use them, in the form of doubts, to buttress its contrarian position. A deeply conflicted person can therefore be stopped, metaphorically, with the pressure of a single finger exerted on his chest (even though he may lash out against such an obstacle). To move forward with resolve, it is necessary to be organized—to be directed toward something singular and identifiable. Aim. Point. All this is part of maturation and discipline, and something to be properly valued. If you aim at nothing, you become plagued by everything. If you aim at nothing, you have nowhere to go, nothing to do, and nothing of high value in your life, as value requires the ranking of options and sacrifice of the lower to the higher. Do you really want to be anything you could be? Is that not too much? Might it not be better to be something specific (and then, perhaps, to add to that)? Would that not come as a relief—even though it is also a sacrifice?
RULE VIII: Try to make one room in your home as beautiful as possible.
Cleaning your room is not enough.
I have become known for encouraging people to clean up their rooms. Perhaps that is because I am serious about that prosaic piece of advice, and because I know that it is a much more difficult task than it appears. I have been unsuccessfully cleaning up my room, by the way—my home office (which I had generally kept in relatively pristine condition)—for about three years now. My life was thrown into such chaos over that period by the multitude of changes I experienced—political controversies, transformation of career, endless travel, mountains of mail, the sequence of illnesses—that I simply became overwhelmed. The disorganization was heightened by the fact that my wife and I had just finished having much of our house renovated, and everything we could not find a proper place for ended up in my office. There is a meme floating around the internet, accusing me of hypocrisy on account of this: a still taken from a video I shot in my office, with a fair bit of mess in the background (and I cannot say that I look much better myself). Who am I to tell people to clean up their rooms before attempting to fix the rest of the world when, apparently, I cannot do it myself? And there is something directly synchronistic and meaningful about that objection, because I was not in proper order at that moment myself, and my condition undoubtedly found its reflection in the state of my office. More piled up every day, as I traveled, and everything collected around me. I plead exceptional circumstances, and I put many other things in order during the time my office was degenerating, but I still have a moral obligation to get back in there and put it right. And the problem is not just that I want to clean up the mess. I also want to make it beautiful: my room, my house, and then, perhaps, in whatever way I can manage, the community. God knows it is crying out for it. Making something beautiful is difficult, but it is amazingly worthwhile. If you learn to make something in your life truly beautiful—even one thing—then you have established a relationship with beauty. From there you can begin to expand that relationship out into other elements of your life and the world. That is an invitation to the divine. That is the reconnection with the immortality of childhood, and the true beauty and majesty of the Being you can no longer see. You must be daring to try that. If you study art (and literature and the humanities), you do it so that you can familiarize yourself with the collected wisdom of our civilization. This is a very good idea—a veritable necessity—because people have been working out how to live for a long time. What they have produced is strange but also rich beyond comparison, so why not use it as a guide? Your vision will be grander and your plans more comprehensive. You will consider other people more intelligently and completely. You will take care of yourself more effectively. You will understand the present more profoundly, rooted as it is in the past, and you will come to conclusions much more carefully. You will come to treat the future, as well, as a more concrete reality (because you will have developed some true sense of time) and be less likely to sacrifice it to impulsive pleasure. You will develop some depth, gravitas, and true thoughtfulness. You will speak more precisely, and other people will become more likely to listen to and cooperate productively with you, as you will with them.
You will become more your own person, and less a dull and hapless tool of peer pressure, vogue, fad, and ideology. Buy a piece of art. Find one that speaks to you and make the purchase. If it is a genuine artistic production, it will invade your life and change it. A real piece of art is a window into the transcendent, and you need that in your life, because you are finite and limited and bounded by your ignorance. Unless you can make a connection to the transcendent, you will not have the strength to prevail when the challenges of life become daunting. You need to establish a link with what is beyond you, as a man overboard in high seas requires a life preserver, and the invitation of beauty into your life is one means by which that may be accomplished. It is for such reasons that we need to understand the role of art, and stop thinking about it as an option, or a luxury, or worse, an affectation. Art is the bedrock of culture itself. It is the foundation of the process by which we unite ourselves psychologically, and come to establish productive peace with others. As it is said, “Man shall not live by bread alone” (Matthew 4:4). That is exactly right. We live by beauty. We live by literature. We live by art. We cannot live without some connection to the divine—and beauty is divine—because in its absence life is too short, too dismal, and too tragic. And we must be sharp and awake and prepared so that we can survive properly, and orient the world properly, and not destroy things, including ourselves—and beauty can help us appreciate the wonder of Being and motivate us to seek gratitude when we might otherwise be prone to destructive resentment.
The Land You Know, the Land You Do Not Know, and the Land You Cannot Even Imagine
You inhabit the land you know, pragmatically and conceptually. But imagine what lies just outside of that. There exists an immense space of things you do not know, but which other people might comprehend, at least in part. Then, outside of what anyone knows, there is the space of things that no one at all knows. Your world is known territory, surrounded by the relatively unknown, surrounded by the absolutely unknown—surrounded, even more distantly, by the absolutely unknowable. Together, that is the canonical, archetypal landscape. The unknown manifests itself to you in the midst of the known. That revelation—sometimes exciting, but often quite painful—is the source of new knowledge. But a fundamental question remains: How is that knowledge generated? What is comprehended and understandable does not just leap in one fell swoop from the absolutely unknown to the thoroughly and self-evidently articulated. Knowledge must pass through many stages of analysis—a multitude of transformations—before it becomes, let us say, commonplace. The first stage is that of pure action—reflex action, at the most basic of levels.[50] If something surprises you, you react to it first with your body. You crouch defensively, or freeze, or run away in panic. Those are all forms of representation and categorization, in nascent form. Crouch means predatory attack. Freeze means predatory threat. Panic means terror necessitating escape. The world of possibility begins to actualize itself with such instinctual, embodied action, unconscious and uncontrollable. The first realization of possibility, of potential, is not conceptual. It is embodied, but it is still representational. (It is no longer the thing in itself we referred to earlier, but the transmutation of that thing into a commensurate physical response. That is a representation.) Maybe you are at home, at night. Assume you are alone. It is dark and late. An unexpected noise startles you, and you freeze. That is the first transmutation: unknown noise (a pattern) to frozen position. Then your heart rate rises, in preparation for (unspecified) action.[51] That is the second transmutation. You are preparing to move. Next, your imagination populates the darkness with whatever might be making the noise.[52] That is the third transmutation, part of a complete and practical sequence: embodied responses (freezing and heart-rate increase) and then imagistic, imaginative representation. The latter is part of exploration, which you might extend by overcoming your terror and the freezing associated with it (assuming nothing else too unexpected happens) and investigating the locale, once a part of your friendly house, from where the noise appeared to emanate. You have now engaged in active exploration—a precursor to direct perception (hopefully nothing too dramatic); then to explicit knowledge of the source; and then back to routine and complacent peace, if the noise proves to be nothing of significance. That is how information moves from the unknown to the known. (Except that sometimes the noise does not prove insignificant. Then there is trouble.) Artists are the people who stand on the frontier of the transformation of the unknown into knowledge. They make their voluntary foray out into the unknown, and they take a piece of it and transform it into an image. Maybe they do it through choreography and dance—by representing the manifestation of the world in physical display, communicable, although not in words, to others. 
Maybe they do it by acting, which is a sophisticated form of embodiment and imitation, or by painting or sculpting. Perhaps they manage it through screenwriting, or by penning a novel. After all that come the intellectuals, with philosophy and criticism, abstracting and articulating the work’s representations and rules. Consider the role that creative people play in cities. They are typically starving a bit, because it is virtually impossible to be commercially successful as an artist, and that hunger is partly what motivates them (do not underestimate the utility of necessity). In their poverty, they explore the city, and they discover some ratty, quasi-criminal area that has seen better days. They visit, look, and poke about, and they think, “You know, with a little work, this area could be cool.” Then they move in, piece together some galleries, and put up some art. They do not make any money, but they civilize the space a bit. In doing so, they elevate and transform what is too dangerous into something cutting edge. Then a coffee shop pops up, and maybe an unconventional clothing store. The next thing you know, the gentrifiers move in. They are creative types, too, but more conservative (less desperate, perhaps; more risk averse, at least—so they are not the first ones on the edge of the frontier). Then the developers show up. And then the chain stores appear, and the middle or upper class establishes itself. Then the artists have to move, because they can no longer afford the rent. That is a loss for the avant-garde, but it is okay, even though it is harsh, because with all that stability and predictability the artists should not be there anymore. They need to rejuvenate some other area. They need another vista to conquer. That is their natural environment. That edge, where artists are always transforming chaos into order, can be a very rough and dangerous place. Living there, an artist constantly risks falling fully into the chaos, instead of transforming it. But artists have always lived there, on the border of human understanding. Art bears the same relationship to society that the dream bears to mental life. You are very creative when you are dreaming. That is why, when you remember a dream, you think, “Where in the world did that come from?” It is very strange and incomprehensible that something can happen in your head, and you have no idea how it got there or what it means. It is a miracle: nature’s voice manifesting itself in your psyche. And it happens every night. Like art, the dream mediates between order and chaos. So, it is half chaos. That is why it is not comprehensible. It is a vision, not a fully fledged articulated production. Those who actualize those half-born visions into artistic productions are those who begin to transform what we do not understand into what we can at least start to see. That is the role of the artist, occupying the vanguard. That is their biological niche. They are the initial civilizing agents. The artists do not understand full well what they are doing. They cannot, if they are doing something genuinely new. Otherwise, they could just say what they mean and have done with it. They would not require expression in dance, music, and image. But they are guided by feel, by intuition—by their facility with the detection of patterns—and that is all embodied, rather than articulated, at least in its initial stages. 
When creating, the artists are struggling, contending, and wrestling with a problem— maybe even a problem they do not fully understand—and striving to bring something new into clear focus. Otherwise they are mere propagandists, reversing the artistic process, attempting to transform something they can already articulate into image and art for the purpose of rhetorical and ideological victory. That is a great sin, harnessing the higher for the purposes of the lower. It is a totalitarian tactic, the subordination of art and literature to politics (or the purposeful blurring of the distinction between them). Artists must be contending with something they do not understand, or they are not artists. Instead, they are posers, or romantics (often romantic failures), or narcissists, or actors (and not in the creative sense). They are likely, when genuine, to be idiosyncratically and peculiarly obsessed by their intuition—possessed by it, willing to pursue it even in the face of opposition and the overwhelming likelihood of rejection, criticism, and practical and financial failure. When they are successful they make the world more understandable (sometimes replacing something more “understood,” but now anachronistic, with something new and better). They move the unknown closer to the conscious, social, and articulated world. And then people gaze at those artworks, watch the dramas, and listen to the stories, and they start to become informed by them, but they do not know how or why. And people find great value in it—more value, perhaps, than in anything else. There is good reason that the most expensive artifacts in the world—those that are literally, or close to literally, priceless—are great works of art. I once visited the Metropolitan Museum of Art in New York. It contained a collection of great and famous Renaissance paintings—each worth hundreds of millions of dollars, assuming they were ever made available for purchase. The area containing them was a shrine, a place of the divine—for believers and atheists alike. It was in the most expensive and prestigious of museums, located on real estate of the highest quality and desirability, in what might well be the most active and exciting city in the world. The collection had been put together over a great expanse of time, and with much difficulty. The gallery was packed with people, many of whom had voyaged there as part of what must be most properly regarded as a pilgrimage. I asked myself, “What are these people up to, coming to this place, so carefully curated, traveling these great distances, looking at these paintings? And what do they believe they are up to?” One painting featured the Immaculate Conception of Mary, brilliantly composed. The Mother of God was rising to heaven, in a beatific state, encapsulated in a mandorla of clouds, embedded with the faces of putti. Many of the people gathered were gazing, enraptured, at the work. I thought, “They do not know what that painting means. They do not understand the symbolic meaning of the mandorla, or the significance of the putti, or the idea of the glorification of the Mother of God. And God, after all, is dead—or, so goes the story. Why does the painting nonetheless retain its value? Why is it in this room, in this building, with these other paintings, in this city—carefully guarded, not to be touched? Why is this painting—and all these others—beyond price and desired by those who already have everything? 
Why are these creations stored so carefully in a modern shrine, and visited by people from all over the world, as if it were a duty—even as if it were desirable or necessary?” We treat these objects as if they are sacred. At least that is what our actions in their vicinity suggest. We gaze at them in ignorance and wonder, and remember what we have forgotten; perceiving, ever so dimly, what we can no longer see (what we are perhaps no longer willing to see). The unknown shines through the productions of great artists in partially articulated form. The awe-inspiring ineffable begins to be realized but retains a terrifying abundance of its transcendent power. That is the role of art, and that is the role of artists. It is no wonder we keep their dangerous, magical productions locked up, framed, and apart from everything else. And if a great piece is damaged anywhere, the news spreads worldwide. We feel a tremor run through the bedrock of our culture. The dream upon which our reality depends shakes and moves. We find ourselves unnerved.
RULE IX: If old memories still upset you, write them down carefully and completely.
Main theme: Psychology. Subtheme: Psychotherapy.
RULE X: Plan and work diligently to maintain the romance in your relationship.
Negotiation, Tyranny, or Slavery
Negotiation is exceptionally difficult. We already discussed the problems associated with determining what you want and then mustering up the courage to tell someone exactly that. And there are the tricks that people use, too, to avoid negotiation. Perhaps you ask your partner what he or she wants—perhaps during a difficult situation. “I don’t know” is a common answer (you get that from children, too, and even more often from adolescents). It is not acceptable, however, in a discussion that cannot in good faith be avoided. Sometimes “I don’t know” truly means what it is supposed to mean—the person who utters the phrase is at a genuine loss—but often it means, instead: “I don’t want to talk about it, so go away and leave me alone.” Irritation or outright anger, sufficient to deter the questioner, often accompanies this response. That brings the discussion to a halt, and it can stay halted forever. Maybe that has happened once or twice or a dozen times too often, so you—the questioner, in this instance—have had enough of your partner’s refusal, or you have decided that you are done being cowardly or a victim of your own misplaced compassion and you are not about to take “I don’t know” for an answer. In consequence, you persist in pursuing your target. “Well, guess,” you might say. “Throw something on the table, for God’s sake. I do not care what it is. Even if it is wrong, it is at least a start.” “I don’t know” means not only “Go away and leave me alone.” It also frequently means “Why don’t you go away, do all the work necessary to figure out what is wrong, and come back and tell me—if you’re so smart,” or “It is intolerably rude of you to refuse to allow me to remain in my willful or dangerous ignorance, given that it obviously bothers me so much to think about my problems.” It is not rude, though—or even if it is, you still need to know what your partner wants, and so does he or she, and how in the world are either or both of you going to figure it out if you cannot even get the conversation off the ground? It is not rude. It is a cruel act of love. Persistence under such conditions is a necessity, a terrible necessity, akin to surgery. It is difficult and painful because it takes courage and even some foolhardiness to continue a discussion when you have been told in no uncertain terms by your partner to go the hell away (or worse). It is a good thing, however—an admirable act—because a person bothered by something they do not wish to talk about is very likely to be split internally over the issue at hand. The part that wants to avoid is the part that gets angry. There is a part that wants to talk, too, and to settle the issue. But doing so is going to be cognitively demanding, ethically challenging, and emotionally stressful. In addition, it is going to require trust, and people test trust, not least by manifesting anger when approached about something touchy just to determine if the person daring the approach cares sufficiently to overcome a serious barrier or two or three or ten to get to the horrible bottom of things. And avoidance followed by anger is not the only trick in the book. The next serious hurdle is tears. Tears are easily mistaken for the distress due to sadness, and they are very effective at bringing tenderhearted people to a dead halt as a consequence of their misplaced compassion. (Why misplaced?
Because if you leave the person alone because of their tears, they quit suffering right then, but continue with their unresolved problem until they solve it, which might be never.) Tears, however, are just as often anger (perhaps more often) as they are sadness or distress. If the person you are chasing down and cornering is red-faced, for example, in addition to their tears, then he or she is probably angry, not hurt (that is not inevitably the case, but it is a reasonably common sign). Tears are an effective defense mechanism, as it takes a heart of stone to withstand them, but they tend to be the last-ditch attempt at avoidance. If you can get past tears, you can have a real conversation, but it takes a very determined interlocutor to avoid the insult and hurt generated by anger (defense one) and the pity and compassion evoked by tears (defense two). It requires someone who has integrated their shadow (their stubbornness, harshness, and capacity for necessary emotionless implacability) and can use it for long-term benefit. Do not foolishly confuse “nice” with “good.” Remember the options previously discussed: negotiation, tyranny, or slavery. Of those, negotiation is the least awful, even though it is no joke to negotiate, and it is perhaps the most difficult of the three, in the short term, because you have to fight it out, now, and God only knows how deep you are going to have to go, how much diseased tissue you will have to remove. For all you know, you are fighting with the spirit of your wife’s grandmother, who was treated terribly by her alcoholic husband, and the consequences of that unresolved abuse and distrust between the sexes are echoing down the generations. Children are amazing mimics. They learn much of what they know implicitly long before they can use language, and they imitate the bad along with the good. It is for this reason that it has been said that the sins of the fathers will be visited on the children to the third and fourth generation (Numbers 14:18). Hope, of course, can drive us through the pain of negotiation, but hope is not enough. You need desperation, as well, and that is part of the utility of “till death do us part.” You are stuck with each other, if you are serious—and if you are not serious, you are still a child. That is the point of the vow: the possibility of mutual salvation, or the closest you can manage here on Earth. In a truly mature marriage, if your health holds out, you are there for the aforementioned sixty years, like Moses in the desert searching for the Promised Land, and there is plenty of trouble that must be worked through—all of it—before peace might be established. So, you grow up when you marry, and you aim for peace as if your soul depends upon it (and perhaps that is more serious than your life depending on it), and you make it work or you suffer miserably. You will be tempted by avoidance, anger, and tears, or enticed to employ the trapdoor of divorce so that you will not have to face what must be faced. But your failure will haunt you while you are enraged, weeping, or in the process of separating, as it will in the next relationship you stumble into, with all your unsolved problems intact and your negotiating skills not improved a whit. You can keep the possibility of escape in the back of your mind. You can avoid the commitment of permanence. But then you cannot achieve the transformation, which might well demand everything you can possibly muster. 
The difficulty, however, that is implicit in the negotiation carries with it a tremendous promise, which is part of a radically successful life: You could have a marriage that works. You could make it work. That is an achievement—a tangible, challenging, exceptional, and unlikely achievement. There are not many genuine achievements of that magnitude in life; a number as small as four is a reasonable estimate. Maybe, if you strive for it, you have established a solid marriage. That is achievement one. Because of that, you have founded a solid and reliable, honest and playful home into which you could dare bring children. Then you can have kids, and with a solid marriage that can work out for you. That is achievement two. Then you have brought upon yourself more of the responsibility that will demand the best from you. Then you will have new relationships of the highest quality, if you are fortunate and careful. Then you will have grandchildren so that you are surrounded by new life when yours begins to slip away. In our culture, we live as if we are going to die at thirty. But we do not. We live a very long time, but it is also all over in a flash, and it should be that you have accomplished what human beings accomplish when they live a full life, and marriage and children and grandchildren and all the trouble and heartbreak that accompanies all of that is far more than half of life. Miss it at your great peril. You meet people, usually young, unwise but laden with the unearned cynicism that substitutes for wisdom in youth, and they say, categorically—even pridefully—“I do not want children.” Plenty of nineteen-year-olds say that, and that is acceptable, in some sense, because they are nineteen, and they have time, and what do they know at nineteen, anyway? And some twenty-seven-year-olds say that, but not so many, particularly if they are female and the least bit honest with themselves. And some forty-five-year-olds say the same thing, in the past tense, and some of them, perhaps, are telling the truth; but most are celebrating closing the barn door after the cattle have bolted. No one will speak the truth about this. To note outright that we lie to young women, in particular, about what they are most likely to want in life is taboo in our culture, with its incomprehensibly strange insistence that the primary satisfaction in the typical person’s life is to be found in career (a rarity in itself, as most people have jobs, not careers). But it is an uncommon woman, in my clinical and general professional experience, regardless of brilliance or talent, training, discipline, parental desire, youthful delusion, or cultural brainwashing who would not perform whatever sacrifice necessary to bring a child into the world by the time she is twenty-nine, or thirty-five, or worse, forty. Here is a pathway to misery I would strongly recommend avoiding, aimed primarily at the women who read this book (although wise boyfriends and husbands should take equal note). Decide that you want children when you are twenty-nine or thirty, and then be unable to have them: I would not recommend that. You will not recover. We are too fragile to play around with what life might offer us. Everyone thinks, when they are young and do not know any better, “Well, pregnancy can be taken for granted.” That is only true if you absolutely do not want and should not have a child, and you have sex in the backseat of a car when you are fifteen. Then, for sure, you will find yourself in trouble. 
But a successful pregnancy is not a foregone conclusion, not by any stretch of the imagination. You can push trying for children to the older end of that spectrum—and many people are encouraged or encourage themselves to do exactly that—but up to 30 percent of couples experience trouble becoming pregnant. You encounter something similar—that is, the incaution about what life will and will not offer— when people whose marriages have stagnated begin to develop the delusion that a romantic affair will address their unmet needs. When I had clients considering such a move—or perhaps involved in an affair, currently—I tried to bring them back down to earth. “Let us think it through, all the way. Not just for this week, or this month. You are fifty. You have this twenty-four-year-old, and she is willing to break up your marriage. What is she thinking? Who must she be? What does she know?” “Well, I am really attracted to her.” “Yes, but she has a personality disorder. Seriously, because what the hell is she doing with you, and why is she willing to break up this marriage?” “Well, she does not care if I stay married.” “Oh, I see. So, she does not want to have an actual relationship with someone, with any degree of long-term permanency. Somehow that is going to work out well for you, is it? Just think about that. It is going to be a little rough on your wife. A lot of lies are going to go along with that. You have children—how are they going to respond when all this comes out, as it most certainly will? And what do you think about the ten years in court that are now beckoning, that are going to cost you a third of a million dollars and put you in a custody battle that will occupy all your time and attention?” I have seen people who were in custody battles who would seriously have preferred cancer. It is no joke to have your arm caught in the dangerous machinery of the courts. You spend much of the time truly wishing you were dead. So that is your “affair,” for God’s sake. It is even more delusional than that, because, of course, if you are married to someone, you often see them at their worst, because you have to share the genuine difficulties of life with them. You save the easy parts for your adulterous partner: no responsibility, just expensive restaurants, exciting nights of rule breaking, careful preparation for romance, and the general absence of reality that accompanies the privilege of making one person pay for the real troubles of existence while the other benefits unrealistically from their absence. You do not have a life with someone when you have an affair with them. You have an endless array of desserts (at least in the beginning), and all you have to do is scoop the whipped cream off the top of each of them and devour it. That is it. You see each other under the best possible conditions, with nothing but sex in your minds and nothing else interfering with your lives. As soon as it transforms from that into a relationship that has any permanency, a huge part of the affair immediately turns right back into whatever it was that was bothering you about your marriage. An affair is not helpful, and people end up horribly hurt. Particularly children—and it is to them you owe primary allegiance. I am not trying to be unreasonably categorical about marriage and family. You cannot expect every social institution to work out for everyone. 
Sometimes, you have married someone who is a psychopathic brute, a congenital and incorrigible liar, a criminal, an alcoholic, a sadist (and maybe all five at once). Then you must escape. But that is not a trapdoor. That is a catastrophe, like a hurricane, and you should move out of its path. You might be tempted to conclude: “Well, how about we live together, instead of getting married? We will try each other out. It is the sensible thing to do.” But what exactly does it mean, when you invite someone to live with you, instead of committing yourself to each other? And let us be appropriately harsh and realistic about our appraisal, instead of pretending we are taking a used car for a test jaunt. Here is what it means: “You will do, for now, and I presume you feel the same way about me. Otherwise we would just get married. But in the name of a common sense that neither of us possesses we are going to reserve the right to swap each other out for a better option at any point.” And if you do not think that is what living together means—as a fully articulated ethical statement—see if you can formulate something more plausible. You might think, “Look, Doc, that is pretty cynical.” So why not consider the stats, instead of the opinion of arguably but not truly old-fashioned me? The breakup rate among people who are not married but are living together—so, married in everything but the formal sense—is substantially higher than the divorce rate among married couples.[76] And even if you do get married and make an honest person, so to speak, of the individual with whom you cohabited, you are still much more rather than less likely to get divorced than you would be had you never lived together initially.[77] So the idea of trying each other out? Sounds enticing, but does not work. It is of course possible that people who are more likely to get divorced, for reasons of temperament, are also more likely to live together, before or without marriage, rather than, or in addition to, the possibility that living together just does not work. It is no simple matter to disentangle the two causal factors. But it does not matter, practically. Cohabitation without the promise of permanent commitment, socially announced, ceremonially established, seriously considered, does not produce more robust marriages. And there is nothing good about that—particularly for children, who do much worse in single-parent (generally male-absent) families.[78] Period. So, I just do not see it as a justifiable social alternative. And I say that as someone who lived with my wife before I married her. I am not innocent in this regard. But that does not mean I was right. And there is something else, and it is far from trivial. You just do not have that many chances in life to have an intimate relationship work out properly. Maybe it takes you two or three years to meet the potential Mr. or Ms. Right, and another two or three to determine if they are in fact who you think they are. That is five years. You get old a lot faster than you think you will, no matter how old you are now, and most of what you could do with your family—with marriage, children, and so forth—is from twentysomething to about thirty-five. How many good five-year chances do you therefore have? Three? Four, if you are fortunate? This means that your options decrease as you wait, rather than increase. If you are a widower, or a widow, and you must hit the dating scene when you are forty or fifty, so be it. You have been struck by tragedy, and that is life.
But I have watched friends do it, and it is not a fate I would casually wish on anyone I loved. Let us continue to be reasonable about this: All sixteen- to eighteen-year-olds have much in common. They are unformed. They are malleable. That is not an insult. It is just a fact. It is also why they can go off to college and make a lifelong friend (no cynicism whatsoever intended) from a roommate within a single semester. By the time you are in your midforties, however—if you have lived at all—you have become somewhat of a singular and unique person. I have known people I met at that time of my life for a decade or more whom I still seem to consider new acquaintances. That is a pure function of the complexity of increasing age. And that is mere friendship, not love—not a joint life and perhaps even the bringing together of two disparate families. And so you have your marriage and your children, and that is working out well because you are stubborn and sufficiently terrified of the hell that awaits anyone who fails to negotiate for peace and make the sacrifices necessary to establish it. You are undoubtedly more prepared now for your career—or more likely, your job. That is the third of the four achievements you might manage, with good fortune and an undaunted spirit, in the brief flash of your existence. You have learned how to establish productive harmony in the close confines of your most intimate and private relationships, and some of that wisdom spills over into your workplace. You are a mentor for younger people, a helpful peer and reliable subordinate, and instead of the hash you could so easily make of the place you inhabit, you improve it. And if everyone did that the world would be a much less tragic and unhappy place. Maybe it would even be a self-evidently good place. And perhaps you learn how to make good use of your time away from family and work—your leisure—and you make that meaningful and productive. And that is the fourth of the four achievements—and one, like the others, that can grow. Perhaps you get better and better at such things so that you can work on solving more and more difficult problems, and become a credit, in your own way, to the spirit of humanity itself. And that is life. Back to marriage. How do you plan and diligently maintain the romance in your relationship? Well, you have to decide: “Do you want some romance in your life or not?” If you really think about it, without resentment—without the joy of depriving your partner, now alienated, of the pleasure that might come with such an attempt—the answer is generally yes. Sexual romance: the adventure, pleasure, intimacy, and excitement people fantasize about experiencing, when they are feeling in need of a touch of the divine. You want that. The joys of life are rare and precious, and you do not want to forsake them without due cause. How are you going to accomplish that? With luck, it will happen between you and someone you like; with better luck, and sufficient commitment, it will happen between you and someone you love. Little about this is easy. If you set up a household with someone, you are going to have to do an awful lot of negotiation to keep both “like” and “love” alive.
RULE XI: Do not allow yourself to become resentful, deceitful, or arrogant.
Resentment
Why do you and others fall prey to resentment—that terrible hybrid emotional state, an admixture of anger and self-pity, tinged, to various degrees, with narcissism and the desire for revenge? Once you understand the world as a dramatic forum, and you have identified the major players, the reasons become clear. You are resentful because of the absolute unknown and its terrors, because nature conspires against you, because you are a victim of the tyrannical element of culture, and because of the malevolence of yourself and other individuals. That is reason enough. It does not make your resentment appropriate, but it certainly makes the emotion understandable. Note: "Conspiracy" and "Victim" are the keywords here.
Deceit and Arrogance
There appear to be two broad forms of deceit: sins of commission, the things you do knowing full well they are wrong; and sins of omission, which are things you merely let slide—you know you should look at, do, or say something, but you do not. Maybe your business partner is a little bit crooked with the books, and you decide that you are just not going to audit them; or you turn a blind eye to your own misbehavior; or you fail to investigate the misdeeds of a child, adolescent, or your partner in your household. Instead, you just let it go. What motivates these kinds of deceit? We lie, outright—the sin of commission—knowing full well that we are doing so, to make things easier for us, in theory, regardless of the effect upon other people. We try to tip the world in our own personal favor. We try to gain an edge. We endeavor to avoid a just punishment that is coming our way—often by passing it to others. We commit the sin of omission, alternatively (and perhaps more subtly), in the belief that what we are avoiding will just go away, which it seldom does. We sacrifice the future to the present, frequently suffering the slings and arrows of outraged conscience for doing so, but continuing, rigidly and stubbornly, in any case.
RULE XII: Be grateful in spite of your suffering.
Down can define up.
I have been searching for decades for certainty. It has not been solely a matter of thinking, in the creative sense, but of thinking and then attempting to undermine and destroy those thoughts, followed by careful consideration and conservation of those that survive. It is identification of a path forward through a swampy passage, searching for stones to stand on safely below the murky surface. However, even though I regard the inevitability of suffering and its exaggeration by malevolence as unshakable existential truths, I believe even more deeply that people have the ability to transcend their suffering, psychologically and practically, and to constrain their own malevolence, as well as the evils that characterize the social and the natural worlds. Human beings have the capacity to courageously confront their suffering—to transcend it psychologically, as well as to ameliorate it practically. This is the most fundamental twin axiom of psychotherapy, regardless of school of thought, as well as key to the mystery of human success and progress across history itself. If you confront the limitations of life courageously, that provides you with a certain psychological purpose that serves as an antidote to the suffering. The fact of your voluntary focus on the abyss, so to speak, indicates to yourself at the deepest of levels that you are capable of taking on without avoidance the difficulties of existence and the responsibility attendant upon that. That mere act of courage is deeply reassuring at the most fundamental levels of psychological being. It indicates your capability and competence to those deep, ancient, and somewhat independent biological and psychological alarm systems that register the danger of the world. But the utility of such confrontation is by no means merely psychological, as important as that is. It is the appropriate pragmatic approach as well: If you act nobly—a word that is very rarely used now, unfortunately—in the face of suffering, you can work practically and effectively to ameliorate and rectify your own and other people’s misery, as such. You can make the material world—the real world—better (or at least stop it from getting worse). The same goes for malevolence: you can constrain that within yourself. When you are about to say something, your conscience might (often does) inform you, noting, “That is not true.” It might present itself as an actual voice (internal, of course) or a feeling of shame, guilt, weakness, or other inner disunity—the physiological consequence of the duality of psyche you are manifesting. You then have the opportunity to cease uttering those words. If you cannot tell the truth, you can at least not consciously lie.[85] That is part of the constraint of malevolence. That is something within our grasp. Beginning to cease knowingly lying is a major step in the right direction. We can constrain our suffering, and we can face it psychologically. That makes us courageous. Then we can ameliorate it practically, because that is what we do when we care for ourselves and other people. There seems to be almost no limit to that. You can genuinely and competently come to care for yourself and your family. You can then extend that out into the broader community. Some people become unbelievably good at that. People who work in palliative care constitute a prime example. They work continually, caring for people who are suffering and dying, and they lose some of those people every day. 
But they manage to get out of bed every morning, go to work, and face all that pain, tragedy, and death. They make a difference under virtually impossible circumstances. It is for such reasons and because of such examples—watching people confront the existential catastrophe of life forthrightly and effectively—that I am more optimistic than pessimistic, and that I believe that optimism is, fundamentally, more reliable than pessimism. To come to such a conclusion, and then to find it unshakable, is a good example of how and why it may be necessary to encounter the darkness before you can see the light. It is easy to be optimistic and naive. It is easy for optimism to be undermined and demolished, however, if it is naive, and for cynicism to arise in its place. But the act of peering into the darkness as deeply as possible reveals a light that appears unquenchable, and that is a profound surprise, as well as a great relief. The same holds true for the issue of gratitude. I do not believe you can be appropriately grateful or thankful for what good you have and for what evil has not befallen you until you have some profound and even terrifying sense of the weight of existence. You cannot properly appreciate what you have unless you have some sense not only of how terrible things could be, but of how terrible it is likely for things to be, given how easy it is for things to be so. This is something that is very much worth knowing. Otherwise you might find yourself tempted to ask, “Why would I ever look into the darkness?” But we seem positively drawn to look. We are fascinated by evil. We watch dramatic representations of serial killers, psychopaths, and the kings of organized crime, gang members, rapists, contract killers, and spies. We voluntarily frighten and disgust ourselves with thrillers and horror films—and it is more than prurient curiosity. It is the development of some understanding of the essentially moral structure of human existence, of our suspension between the poles of good and evil. The development of that understanding is necessary; it places a down below us and an up above us, and orients us in perception, motivation, and action. It protects us, as well. If you fail to understand evil, then you have laid yourself bare to it. You are susceptible to its effects, or to its will. If you ever encounter someone who is malevolent, they have control over you in precise proportion to the extent that you are unwilling or unable to understand them. Thus, you look in dark places to protect yourself, in case the darkness ever appears, as well as to find the light. There is real utility in that.
Tuesday, January 10, 2023
Beyond Order (12 more rules for life) by Jordan Peterson
Saturday, January 7, 2023
Meet Isaiah Ongera (from Kenya, Jan 2023)
My name is Isaiah Ongera Mandi. I was named after my great grandfather, following his death. I was born on 15th March 2003 in Bigege, Nyamira County, Kenya, so I am 20 years old. I come from a family of four, where I am the second born after my sister. I currently live with my parents at their homestead. My parents are self-employed, working our small piece of land and practicing mixed farming. There are many attributes of myself that I admire. I am a creative, open-minded person, which helps me open my mind to see the bigger picture of my life. The attribute I most admire in myself is my aspiration to solve problems confronting the globe... which encouraged me to undertake an information technology course after completing my form four studies. I try to work hard, and do any work with as much diligence as I can to get the best results. I do well around a group of people and have excellent social skills; however, I find it easier to concentrate on work when I am alone. I am a person who is positive about every aspect of life, which makes me believe that life is an art and that I am the art of my life. There are many things I like to do, to see, and to experience. I like to investigate, read, write, and do research on things related to technology. Some of my strengths are my calmness, my independence, and my ability not to give up in whatever I am doing, in spite of the challenges and difficulties that come along the way. The ability to keep a smile on my face no matter what I am going through is the second gift I am most proud of. I enjoy connecting with people and I am open to interesting work offers from clients in India.
Tags: Kenya
Abandon Ideology (A lesson in Politics and Psychology)
Tags: Book Summary, Psychology, Rule 6: Abandon Ideology
Chapter 6 from the book by Jordan B. Peterson: Beyond Order: 12 More Rules for Life (2021)
After I published my last book, my wife, Tammy, and I embarked on a lengthy speaking tour throughout the English-speaking world and a good part of Europe, particularly in the north. Most of the theaters I spoke at were old and beautiful, and it was a delight to be in buildings with such rich architectural and cultural histories, where so many of the bands we loved had played, and where other performing artists had had their great moments. We booked 160 theaters—generally with a capacity of about 2,500 to 3,000 people (although there were smaller venues in Europe, and larger in Australia). I was—and am—struck to the core by the fact that there was such an extensive audience for my lectures—and that we found that audience seemingly everywhere. The same surprise extends to my YouTube and podcast appearances—on my own channels, in interviews on others, and in the innumerable clips that people have voluntarily cut from my longer talks and discussions with journalists. These have been watched or listened to hundreds of millions of times. And finally, there is the aforementioned book, which will have sold something like four million copies in English by the time the present volume is published, and which will be translated into fifty additional languages, assuming matters continue as they are now. It is not at all easy to know what to think about finding myself with an audience like that. What is going on? Any sensible person would be taken aback—to put it mildly—by all of this. It seems that my work must be addressing something that is missing in many people’s lives. Now, as I mentioned previously, I am relying for much of my content on the ideas of great psychologists and other thinkers, and that should count for something. But I have also been continually considering what else more specific (if anything) might be attracting people’s attention, and have been relying on two sources of information to try to determine exactly that. The first is the response I get directly from individuals themselves, when I meet them in the immediate aftermath of one of my lectures or when they stop me on the street, in airports, cafés, or other public places. In one midwestern American city (I think it might have been Louisville), a young man met me after my lecture and said, “Quick story. Two years ago, I was released from prison. Homeless. Broke. I started listening to your lectures. Now I have a full-time job, and I own my apartment, and my wife and I just had our first child—a daughter. Thank you.” And the “thank you” was accompanied with direct eye contact and a firm handshake, and the story was told in the voice of conviction. And people tell me very similar stories on the street, often in tears, although the one I just related was perhaps a bit more extreme than the average tale. They share very private good news (the kind you share only with people to whom you can safely tell such things). And I feel greatly privileged to be one of those people, although it is emotionally demanding to be the recipient of continual personal revelations, regardless (or maybe even because) of the fact that they are so positive. I find it heart-wrenching to see how little encouragement and guidance so many people have received, and how much good can emerge when just a little more is provided. “I knew you could do it” is a good start, and goes a long way toward ameliorating some of the unnecessary pain in the world. 
So, that is one form of story that I hear, continually, in many variants. When we meet, one on one, people also tell me that they enjoy my lectures and what I have written because what I say and write provides them with the words they need to express things they already know, but are unable to articulate. It is helpful for everyone to be able to represent explicitly what they already implicitly understand. I am frequently plagued with doubts about the role that I am playing, so the fact that people find my words exist in accordance with their deep but heretofore unrealized or unexpressed beliefs is reassuring, helping me maintain faith in what I have learned and thought about and have now shared so publicly. Helping people bridge the gap between what they profoundly intuit but cannot articulate seems to be a reasonable and valuable function for a public intellectual. And then there is the final piece of information bearing on whatever it is that I am accomplishing. I have garnered it as a direct consequence of the live lectures that I have had so many opportunities to deliver. It is a privilege and a gift to be able to talk repeatedly to large groups of people. It provides a real-time opportunity to judge the zeitgeist, the spirit of the times. It also allows me to formulate and immediately test new ideas for their communicability and their ability to grip attention and, thereby, to judge their quality—at least in part. This occurs during the talk when I attend to how the audience responds. In 12 Rules for Life, Rule 9: Assume that the person you are listening to might know something you do not, I suggest that when speaking to a large group you should nonetheless always be attending to specific individuals—the crowd is somewhat of an illusion. However, you can augment your individual- focused visual attention by simultaneously listening to the entire group, so that you hear them rustling around, laughing, coughing, or whatever they happen to be doing, while you concentrate on perceiving specific individuals. What you want to see from the person you are facing is rapt attention. What you want to hear from the crowd is dead silence. You want to hear nothing. Achieving that means your listeners are not distracted by everything they could be thinking about while in attendance. If you are an audience member at a performance, and you are not completely enthralled by the content, you become preoccupied by some slight physical discomforts, and shift from place to place. You become aware of your own thoughts. You begin to think about what you need to do tomorrow. You whisper something to the person beside you. That all adds up to discontent in the audience, and audible noise. But if you, as speaker, are positioned properly on stage, physically and spiritually, then everybody’s attention will be focused with laser-like intensity on whatever you are saying, and no one will make a sound. In this manner, you can tell what ideas have power. While watching and listening in the way I just described to all the gatherings I have spoken to, I became increasingly aware that the mention of one topic in particular brought every audience (and I mean that without exception) to a dead-quiet halt: responsibility—the very topic we made central in this book in Rule IV: Notice that opportunity lurks where responsibility has been abdicated. That response was fascinating—and not at all predictable. Responsibility is not an easy sell. Parents have been striving forever to make their kids responsible. 
Society attempts the same thing, with its educational institutions, apprenticeships, volunteer organizations, and clubs. You might even consider the inculcation of responsibility the fundamental purpose of society. But something has gone wrong. We have committed an error, or a series of errors. We have spent too much time, for example (much of the last fifty years), clamoring about rights, and we are no longer asking enough of the young people we are socializing. We have been telling them for decades to demand what they are owed by society. We have been implying that the important meanings of their lives will be given to them because of such demands, when we should have been doing the opposite: letting them know that the meaning that sustains life in all its tragedy and disappointment is to be found in shouldering a noble burden. Because we have not been doing this, they have grown up looking in the wrong places. And this has left them vulnerable: vulnerable to easy answers and susceptible to the deadening force of resentment. What about the unfolding of history has left us in this position? How has this vulnerability, this susceptibility, come about?
Perhaps he is only sleeping.
In the last quarter of the nineteenth century, the German philosopher Friedrich Nietzsche famously announced “God is dead.” This utterance has become so famous that you can even see it scribbled on the walls of public bathrooms, where it often takes the following form: “God is dead” —Nietzsche. “Nietzsche is dead” —God. Nietzsche did not make this claim in a narcissistic or triumphant manner. The great thinker’s opinion stemmed from his fear that all the Judeo-Christian values serving as the foundation of Western civilization had been made dangerously subject to casual rational criticism, and that the most important axiom upon which they were predicated—the existence of a transcendent, all-powerful deity—had been fatally challenged. Nietzsche concluded from this that everything would soon fall apart, in a manner catastrophic both psychologically and socially. It does not require a particularly careful reader to note that Nietzsche described God, in The Gay Science, as the “holiest and mightiest of all that the world has yet owned,” and modern human beings as “the murderers of all murderers.” These are not the sorts of descriptions you might expect from a triumphant rationalist celebrating the demise of superstition. It was instead a statement of absolute despair. In his other works, particularly in The Will to Power, Nietzsche describes what would occur in the next century and beyond because of this murderous act. He prophesied (and that is the correct word for it) that two major consequences would arise—apparent opposites, although each linked inextricably and causally together—and both associated with the death of traditional ritual, story, and belief. As the purpose of human life became uncertain outside the purposeful structure of monotheistic thought and the meaningful world it proposed, we would experience an existentially devastating rise in nihilism, Nietzsche believed. Alternatively, he suggested, people would turn to identification with rigid, totalitarian ideology: the substitute of human ideas for the transcendent Father of All Creation. The doubt that undermines and the certainty that crushes: Nietzsche’s prognostication for the two alternatives that would arise in the aftermath of the death of God. The incomparable Russian novelist Fyodor Dostoyevsky addressed the same question as Nietzsche—at about the same time—in his masterwork The Possessed (alternatively known as Demons or The Devils). The protagonist in that novel, Nikolai Stavrogin, is wed to the same ideals that eventually birthed revolutionary communism, although he lives his fictional life decades before the full-fledged turmoil began in what became the Soviet Union. The appearance of these ideals was not a positive development, in Dostoevsky’s view. He could see that the adoption of a rigid, comprehensive utopian ideology, predicated on a few apparently self-evident axioms, presented a political and spiritual danger with the potential to far exceed in brutality all that had occurred in the religious, monarchical, or even pagan past. Dostoyevsky, like Nietzsche, foresaw that all of this was coming almost fifty years (!) before the Leninist Revolution in Russia. That incomprehensible level of prophetic capacity remains a stellar example of how the artist and his intuition brings to light the future far before others see it. 
Nietzsche and Dostoevsky both foresaw that communism would appear dreadfully attractive—an apparently rational, coherent, and moral alternative to religion or nihilism—and that the consequences would be lethal. The former wrote, in his inimitably harsh, ironic, and brilliant manner, “In fact, I even wish a few experiments might be made to show that in socialistic society life denies itself, and itself cuts away its own roots. The earth is big enough and man is still unexhausted enough for a practical lesson of this sort and demonstratio ad absurdum—even if it were accomplished only by a vast expenditure of lives—to seem worthwhile to me.” The socialism Nietzsche referred to was not the relatively mild version later popular in Britain, Scandinavia, and Canada, with its sometimes genuine emphasis on the improvement of working-class life, but the full-blown collectivism of Russia, China, and a host of smaller countries. Whether we have truly learned the “practical lesson”—the demonstration of the absurdity of the doctrine—as a consequence of Nietzsche’s predicted “vast expenditure of lives” remains to be seen. Nietzsche appears to have unquestioningly adopted the idea that the world was both objective and valueless in the manner posited by the emergent physical sciences. This left him with a single remaining escape from nihilism and totalitarianism: the emergence of the individual strong enough to create his own values, project them onto valueless reality, and then abide by them. He posited that a new kind of man—the Übermensch (the higher person or superman)—would be necessary in the aftermath of the death of God, so that society would not drift toward the opposing rocky shoals of despair and oversystematized political theorizing. Individuals who take this route, this alternative to nihilism and totalitarianism, must therefore produce their own cosmology of values. However, the psychoanalysts Freud and Jung put paid to that notion, demonstrating that we are not sufficiently in possession of ourselves to create values by conscious choice. Furthermore, there is little evidence that any of us have the genius to create ourselves ex nihilo—from nothing—particularly given the extreme limitations of our experience, the biases of our perceptions, and the short span of our lives. We have a nature—or, too often, it has us—and only a fool would now dare to claim that we have sufficient mastery of ourselves to create, rather than discover, what we value. We have the capacity for spontaneous revelatory experience—artistic, inventive, and religious. We discover new things about ourselves constantly, to our delight—and also to our dismay, as we are so often overcome by our emotions and motivations. We contend with our nature. We negotiate with it. But it is not at all obvious that the individual will ever be capable of bringing the new values that Nietzsche so fervently longed for into being. There are other problems with Nietzsche’s argument, as well. If each of us lives by our own created and projected values, what remains to unite us? This is a philosophical problem of central importance. How could a society of Übermenschen possibly avoid being at constant odds with one another, unless there was something comparable about their created values? Finally, it is by no means obvious that any such supermen have ever come into existence. 
Instead, over the last century and a half, with the modern crisis of meaning and the rise of totalitarian states such as Nazi Germany, the USSR, and Communist China, we appear to have found ourselves in exactly the nihilistic or ideologically possessed state that Nietzsche and Dostoevsky feared, accompanied by precisely the catastrophic sociological and psychological consequences they foretold. It is also by no means self-evident that value, subjective though it appears to be, is not an integral part of reality, despite the undeniable utility of the scientific method. The central scientific axiom left to us by the Enlightenment—that reality is the exclusive domain of the objective—poses a fatal challenge to the reality of religious experience, if the latter experience is fundamentally subjective (and it appears to be exactly that). But there is something complicating the situation that seems to lie between the subjective and the objective: What if there are experiences that typically manifest themselves to one person at a time (as seems to be the case with much of revelation), but appear to form a meaningful pattern when considered collectively? That indicates something is occurring that is not merely subjective, even though it cannot be easily pinned down with the existing methods of science. It could be, instead, that the value of something is sufficiently idiosyncratic—sufficiently dependent on the particularities of time, place, and the individual experiencing that thing—that it cannot be fixed and replicated in the manner required for it to exist as a scientific object. This does not mean, however, that value is not real: It means only that it is so complex that it cannot yet and may never fit itself within the scientific worldview. The world is a very strange place, and there are times when the metaphorical or narrative description characteristic of culture and the material representation so integral to science appear to touch, when everything comes together—when life and art reflect each other equally. The psyche—the soul—that produces or is the recipient of such experiences appears incontrovertibly real: the proof lying not least in our actions. We all axiomatically assume the reality of our individual existences and conscious experiences, and we extend the same courtesy to others (or else). It is by no means unreasonable to suggest that such existence and experience has a deep underlying biological and physical structure. Those with a psychoanalytic bent certainly assume so, as do many who study biological psychology, particularly if they focus on motivation and emotion. That structure, accepted as a given by scientists and by the general population in equal measure, appears to manifest religious experience as part of its basic function—and that religious function has enough commonality across people to make us at least understand what “religious experience” means—particularly if we have had a taste of it at some point in life. What does that imply? It might be that the true meaning of life is available for discovery, if it can be discovered at all, by each individual, alone—although in communication with others, past and present. It may well be, therefore, that the true meaning of life is not to be found in what is objective, but in what is subjective (but still universal). 
The existence of conscience, for example, provides some evidence for that, as does the fact that religious experiences can reliably be induced chemically, as well as through practices such as dancing, chanting, fasting, and meditating. Additionally, the fact that religious ideas are capable of uniting vast numbers of people under a single moral umbrella (although such ideas can divide across sects, as well) also indicates something universal calling from within. Why do we so easily assume that nothing about that is real, given its apparent commonality and necessity—given, as well, the near certainty that the capacity for valuing is an ancient evolved function, selected for by the very reality we are attempting to define and understand? We have seen the consequences of the totalitarian alternatives in which the collective is supposed to bear the burdens of life, lay out the proper pathway, and transform the terrible world into the promised utopia. The communists produced a worldview that was attractive to fair-minded people, as well as those who were envious and cruel. Perhaps communism may even have been a viable solution to the problems of the unequal distribution of wealth that characterized the industrial age, if all of the hypothetically oppressed were good people and all of the evil was to be found, as hypothesized, in their bourgeois overlords. Unfortunately for the communists, a substantial proportion of the oppressed were incapable, unconscientious, unintelligent, licentious, power mad, violent, resentful, and jealous, while a substantial proportion of the oppressors were educated, able, creative, intelligent, honest, and caring. When the frenzy of dekulakization swept through the newly established Soviet Union, it was vengeful and jealous murderers who were redistributing property, while it was competent and reliable farmers, for the most part, from whom it was violently taken. One unintended consequence of that “redistribution” of good fortune was the starvation of six million Ukrainians in the 1930s, in the midst of some of the most fertile land in the world. The other major villains of the twentieth century, Germany’s National Socialists, were, of course, also powerful and dangerous ideologues. It has been suggested that Hitler’s acolytes were inspired by Nietzsche’s philosophy. This claim may hold some truth in a perverse manner, as they were certainly trying to create their own values, although not as the individuals whose development the philosopher promoted. It is more reasonable to say that Nietzsche identified the cultural and historical conditions that made the rise to influence of ideas akin to those promoted by the Nazis extremely likely. The Nazis were trying to create a post-Christian, postreligious perfect man, the ideal Aryan, and certainly formulated that ideal in a manner not in accordance with the dictates of either Judaism or Christianity. Thus, the perfect Aryan could be and certainly was conceptualized by the Nazis as a “higher man.” This does not mean that the Nazi ideal as postulated bore any resemblance to the Nietzschean ideal. Quite the contrary: Nietzsche was a fervent admirer of individuality and would have considered the idea of the higher man as state creation both absurd and abhorrent.
The Fatal Attraction of the False Idol
Consider those who have not gone so far as to adopt the discredited ideologies of the Marxist-Leninists and the Nazis, but who still maintain faith in the commonplace isms characterizing the modern world: conservatism, socialism, feminism (and all manner of ethnic-and-gender-study isms), postmodernism, and environmentalism, among others. They are all monotheists, practically speaking—or polytheistic worshippers of a very small number of gods. These gods are the axioms and foundational beliefs that must be accepted, a priori, rather than proven, before the belief system can be adopted, and when accepted and applied to the world allow the illusion to prevail that knowledge has been produced. The process by which an ism system can be generated is simple in its initial stages but baroque enough in its application to mimic (and replace) actual productive theorizing. The ideologue begins by selecting a few abstractions in whose low-resolution representations hide large, undifferentiated chunks of the world. Some examples include “the economy,” “the nation,” “the environment,” “the patriarchy,” “the people,” “the rich,” “the poor,” “the oppressed,” and “the oppressors.” The use of single terms implicitly hypersimplifies what are in fact extraordinarily diverse and complex phenomena (that masked complexity is part of the reason that the terms come to carry so much emotional weight). There are many reasons, for example, why people are poor. Lack of money is the obvious cause—but that hypothetical obviousness is part of the problem with ideology. Lack of education, broken families, crime-ridden neighborhoods, alcoholism, drug abuse, criminality and corruption (and the political and economic exploitation that accompanies it), mental illness, lack of a life plan (or even failure to realize that formulating such a plan is possible or necessary), low conscientiousness, unfortunate geographical locale, shift in the economic landscape and the consequent disappearance of entire fields of endeavor, the marked proclivity for those who are rich to get richer still and the poor to get poorer, low creativity/entrepreneurial interest, lack of encouragement—these are but a few of the manifold problems that generate poverty, and the solution to each (assuming that a solution exists) is by no means obviously the same. Nor are the villains hiding behind each putative and differentiable cause the same villains (assuming that there are even villains to be found). All such problems require careful, particularized analysis, followed by the generation of multiple potential solutions, followed by the careful assessment of those solutions to ensure that they are having their desired effect. It is uncommon to see any serious social problem addressed so methodically. It is also rare that the solutions generated, even by methodical process, produce the intended outcome. The great difficulty of assessing problems in sufficient detail to understand what is causing them, followed by the equally great difficulty of generating and testing particularized solutions, is sufficient to deter even the stouthearted, let us say, from daring to tackle a true plague of mankind. Since the ideologue can place him or herself on the morally correct side of the equation without the genuine effort necessary to do so validly, it is much easier and more immediately gratifying to reduce the problem to something simple and accompany it with an evildoer, who can then be morally opposed. 
After breaking the world into large, undifferentiated pieces, describing the problem(s) that characterize each division, and identifying the appropriate villains, the ism theorist then generates a small number of explanatory principles or forces (which may indeed contribute in some part to the understanding or existence of those abstracted entities). Then he or she grants to that small number primary causal power, while ignoring others of equal or greater importance. It is most effective to utilize a major motivational system or large-scale sociological fact or conjecture for such purposes. It is also good to select those explanatory principles for an unstated negative, resentful, and destructive reason, and then make discussion of the latter and the reason for their existence taboo for the ideologue and his or her followers (to say nothing of the critics). Next, the faux theorist spins a post-hoc theory about how every phenomenon, no matter how complex, can be considered a secondary consequence of the new, totalizing system. Finally, a school of thought emerges to propagate the methods of this algorithmic reduction (particularly when the thinker is hoping to attain dominance in the conceptual and the real worlds), and those who refuse to adopt the algorithm or who criticize its use are tacitly or explicitly demonized. Incompetent and corrupt intellectuals thrive on such activity, such games. The first players of a given game of this sort are generally the brightest of the participants. They weave a story around their causal principle of choice, demonstrating how that hypothetically primary motivational force profoundly contributed to any given domain of human activity. Sometimes this is even helpful, as such activity may shed light on how a motivation heretofore taboo to discuss or consider might play a larger role in affecting human behavior and perception than was previously deemed acceptable (this is what happened, for example, with Freud, and his emphasis on sex). Their followers, desperate to join a potentially masterable new dominance hierarchy (the old one being cluttered by its current occupants), become enamored of that story. While doing so, being less bright than those they follow, they subtly shift “contributed to” or “affected” to “caused.” The originator(s), gratified by the emergence of followers, start to shift their story in that direction as well. Or they object, but it does not matter. The cult has already begun. This kind of theorizing is particularly attractive to people who are smart but lazy. Cynicism serves as an aid, too, as does arrogance. The new adherents will be taught that mastering such a game constitutes education, and will learn to criticize alternative theories, different methods, and increasingly, even the idea of fact itself. If an impenetrable vocabulary accompanies the theory, so much the better. It will then take potential critics some valuable time even to learn to decode the arguments. And there is a conspiratorial aspect that rapidly comes to pervade the school where such “education” occurs, and where such activity is increasingly all that is permitted: Do not criticize the theory—and do not get singled out. Do not become unpopular. Even: Do not receive a bad grade, or a poor review, for expressing a taboo opinion (and even when this does not occur in practice, the fear that it might keeps many students and professors, or employees and employers, in check). Freud, as we noted, attempted to reduce motivation to sexuality, to libido. 
The same can be done quite effectively by anyone sufficiently literate, intelligent, and verbally fluent. This is because “sexuality” (like any multifaceted single term) can be defined as tightly or as loosely as necessary by those who use it for comprehensively explanatory purposes. No matter how defined, sex is a crucially important biological phenomenon—key to complex life itself—and its influence may therefore be genuinely detected or plausibly invented in any important field of endeavor and then exaggerated (while other factors of significant import are diminished in importance). In this manner, the single explanatory principle can be expanded indefinitely, in keeping with the demands placed upon it. Marx did the same thing when he described man in a fundamentally economic, class-based manner, and history as the eternal battleground of bourgeoisie and proletariat. Everything can be explained by running it through a Marxist algorithm. The wealthy are wealthy because they exploit the poor. The poor are poor because they are exploited by the wealthy. All economic inequality is undesirable, unproductive, and a consequence of fundamental unfairness and corruption. There is, of course—as in the case of Freud—some value in Marx’s observations. Class is an important element of social hierarchies, and tends to maintain itself with a certain stability across time. Economic well-being, or the lack thereof, is of crucial significance. And the damnable fact of the Pareto distribution[42]—the tendency of those who have more to get more (which seems to apply in any economic system)—does mean that wealth accumulates in the hands of a minority of people. The people who make up that minority do change substantively, regardless of the aforementioned class stability, and that is a crucial point, but the fact that the comparatively rich are always a minority—and a small one, at that—seems dismally immutable. Regardless of its hypothetical virtues, however, the implementation of Marxism was a disaster everywhere it was attempted—and that has motivated attempts by its unrepentant would-be present-day adherents to clothe its ideas in new garb and continue forward, as if nothing of significance has changed. Thinkers powerfully influenced by Marx and overwhelmingly influential in much of the academy today (such as Michel Foucault and Jacques Derrida) modified the Marxist simplification essentially by replacing “economics” with “power”—as if power were the single motivating force behind all human behavior (as opposed, say, to competent authority, or reciprocity of attitude and action). Ideological reduction of that form is the hallmark of the most dangerous of pseudo-intellectuals. Ideologues are the intellectual equivalent of fundamentalists, unyielding and rigid. Their self-righteousness and moral claim to social engineering is every bit as deep and dangerous. It might even be worse: ideologues lay claim to rationality itself. So, they try to justify their claims as logical and thoughtful. At least the fundamentalists admit devotion to something they just believe arbitrarily. They are a lot more honest. Furthermore, fundamentalists are bound by a relationship with the transcendent. What this means is that God, the center of their moral universe, remains outside and above complete understanding, according to the fundamentalist’s own creed. Right-wing Jews, Islamic hard-liners, and ultra-conservative Christians must admit, if pushed, that God is essentially mysterious. 
This concession provides at least some boundary for their claims, as individuals, to righteousness and power (as the genuine fundamentalist at least remains subordinate to Something he cannot claim to totally understand, let alone master). For the ideologue, however, nothing remains outside understanding or mastery. An ideological theory explains everything: all the past, all the present, and all the future. This means that an ideologue can consider him or herself in possession of the complete truth (something forbidden to the self-consistent fundamentalist). There is no claim more totalitarian and no situation in which the worst excesses of pride are more likely to manifest themselves (and not only pride, but then deceit, once the ideology has failed to explain the world or predict its future). The moral of the story? Beware of intellectuals who make a monotheism out of their theories of motivation. Beware, in more technical terms, of blanket univariate (single variable) causes for diverse, complex problems. Of course, power plays a role in history, as does economics. But the same can be said of jealousy, love, hunger, sex, cooperation, revelation, anger, disgust, sadness, anxiety, religion, compassion, disease, technology, hatred, and chance—none of which can definitively be reduced to another. The attraction of doing so is, however, obvious: simplicity, ease, and the illusion of mastery (which can have exceptionally useful psychological and social consequences, particularly in the short term)—and, let us not forget, the frequent discovery of a villain, or set of villains, upon which the hidden motivations for the ideology can be vented.
Ressentiment
Ressentiment—hostile resentment—occurs when individual failure or insufficient status is blamed both on the system within which that failure or lowly status occurs and then, most particularly, on the people who have achieved success and high status within that system. The former, the system, is deemed by fiat to be unjust. The successful are deemed exploitative and corrupt, as they can be logically read as undeserving beneficiaries, as well as the voluntary, conscious, self-serving, and immoral supporters, if the system is unjust. Once this causal chain of thought has been accepted, all attacks on the successful can be construed as morally justified attempts at establishing justice—rather than, say, manifestations of envy and covetousness that might have traditionally been defined as shameful. There is another typical feature of ideological pursuit: the victims supported by ideologues are always innocent (and it is sometimes true that victims are innocent), and the perpetrators are always evil (evil perpetrators are also not in short supply). But the fact that there exist genuine victims and perpetrators provides no excuse to make low-resolution, blanket statements about the global locale of blameless victimization and evil perpetration—particularly of the type that does not take the presumed innocence of the accused firmly into account. No group guilt should be assumed —and certainly not of the multigenerational kind. It is a certain sign of the accuser’s evil intent, and a harbinger of social catastrophe. But the advantage is that the ideologue, at little practical costs, can construe him or herself both as nemesis of the oppressor and defender of the oppressed. Who needs the fine distinctions that determination of individual guilt or innocence demands when a prize such as that beckons? To take the path of ressentiment is to risk tremendous bitterness. This is in no small part a consequence of identifying the enemy without rather than within. If wealth is the problem at issue, for example, and the wealthy perceived as the reason for poverty and all the other problems of the world, then the wealthy become the enemy—indistinguishable, in some profound sense, from a degree of evil positively demonic in its psychological and social significance. If power is the problem, then those who have established any authority at all are the singular cause of the world’s suffering. If masculinity is the problem, then all males (or even the concept of male) must be attacked and vilified.33 Such division of the world into the devil without and the saint within justifies self-righteous hatred—necessitated by the morality of the ideological system itself. This is a terrible trap: Once the source of evil has been identified, it becomes the duty of the righteous to eradicate it. This is an invitation to both paranoia and persecution. A world where only you and people who think like you are good is also a world where you are surrounded by enemies bent on your destruction, who must be fought. It is much safer morally to look to yourself for the errors of the world, at least to the degree to which someone honest and free of willful blindness might consider. You are likely to be much more clear minded about what is what and who is who and where blame lies once you contemplate the log in your own eye, rather than the speck in your brother’s. It is probable that your own imperfections are evident and plentiful, and could profitably be addressed, as step one in your Redeemer’s quest to improve the world. 
To take the world’s sins onto yourself—to assume responsibility for the fact that things have not been set right in your own life and elsewhere—is part of the messianic path: part of the imitation of the hero, in the most profound of senses. This is a psychological or spiritual rather than a sociological or political issue. Consider the characters fabricated by second-rate crafters of fiction: they are simply divided into those who are good and those who are evil. By contrast, sophisticated writers put the divide inside the characters they create, so that each person becomes the locus of the eternal struggle between light and darkness. It is much more psychologically appropriate (and much less dangerous socially) to assume that you are the enemy—that it is your weaknesses and insufficiencies that are damaging the world—than to assume saintlike goodness on the part of you and your party, and to pursue the enemy you will then be inclined to see everywhere. It is impossible to fight patriarchy, reduce oppression, promote equality, transform capitalism, save the environment, eliminate competitiveness, reduce government, or to run every organization like a business. Such concepts are simply too low-resolution. The Monty Python comedy crew once offered satirical lessons for playing the flute: blow over one end and move your fingers up and down on the holes. True. But useless. The necessary detail is simply not there. Similarly, sophisticated large-scale processes and systems do not exist in a manner sufficiently real to render their comprehensive unitary transformation possible. The idea that they do is the product of twentieth-century cults. The beliefs of these cults are simultaneously naive and narcissistic, and the activism they promote is the resentful and lazy person’s substitute for actual accomplishment. The single axioms of the ideologically possessed are gods, served blindly by their proselytizers. Like God, however, ideology is dead. The bloody excesses of the twentieth century killed it. We should let it go, and begin to address and consider smaller, more precisely defined problems. We should conceptualize them at the scale at which we might begin to solve them, not by blaming others, but by trying to address them personally while simultaneously taking responsibility for the outcome. Have some humility. Clean up your bedroom. Take care of your family. Follow your conscience. Straighten up your life. Find something productive and interesting to do and commit to it. When you can do all that, find a bigger problem and try to solve that if you dare. If that works, too, move on to even more ambitious projects. And, as the necessary beginning to that process . . . abandon ideology.