Be patient toward all that is unsolved in your heart and try to love the questions themselves.... Live the questions now. Perhaps you will gradually, without noticing it, live along some distant day into the answer. —Rainer Maria Rilke, Letters to a Young Poet

Sherry walked into my office with her shoulders slumped, her chin nearly touching her chest. Even before we spoke a word, her body was telling me that she was afraid to face the world. I also noticed that her long sleeves only partially covered the scabs on her forearms. After sitting down, she told me in a high-pitched monotone that she couldn’t stop herself from picking at the skin on her arms and chest until she bled.

As far back as Sherry could remember, her mother had run a foster home, and their house was often packed with as many as fifteen strange, disruptive, frightened, and frightening kids who disappeared as suddenly as they arrived. Sherry had grown up taking care of these transient children, feeling that there was no room for her and her needs. “I know I wasn’t wanted,” she told me. “I’m not sure when I first realized that, but I’ve thought about things that my mother said to me, and the signs were always there. She’d tell me, ‘You know, I don’t think you belong in this family. I think they gave us the wrong baby.’ And she’d say it with a smile on her face. But, of course, people often pretend to joke when they say something serious.”

Over the years our research team has repeatedly found that chronic emotional abuse and neglect can be just as devastating as physical abuse and sexual molestation. Sherry turned out to be a living example of these findings: Not being seen, not being known, and having nowhere to turn to feel safe is devastating at any age, but it is particularly destructive for young children, who are still trying to find their place in the world.

Sherry had graduated from college, but she now worked in a joyless clerical job, lived alone with her cats, and had no close friends. When I asked her about men, she told me that her only “relationship” had been with a man who’d kidnapped her while she was on a college vacation in Florida. He’d held her captive and raped her repeatedly for five consecutive days. She remembered having been curled up, terrified and frozen for most of that time, until she realized she could try to get away. She escaped by simply walking out while he was in the bathroom. When she called her mother collect for help, her mother refused to take the call. Sherry finally managed to get home with assistance from a domestic violence shelter.

Sherry told me that she’d started to pick at her skin because it gave her some relief from feeling numb. The physical sensations made her feel more alive but also deeply ashamed—she knew she was addicted to these actions but could not stop them. She’d consulted many mental health professionals before me and had been questioned repeatedly about her “suicidal behavior.” She’d also been subjected to involuntary hospitalization by a psychiatrist who refused to treat her unless she could promise that she would never pick at herself again. However, in my experience, patients who, like Sherry, cut themselves or pick at their skin are seldom suicidal but are trying to make themselves feel better in the only way they know.

This is a difficult concept for many people to understand. As I discussed in the previous chapter, the most common response to distress is to seek out people we like and trust to help us and give us the courage to go on.
We may also calm down by engaging in a physical activity like biking or going to the gym. We start learning these ways of regulating our feelings from the first moment someone feeds us when we’re hungry, covers us when we’re cold, or rocks us when we’re hurt or scared. But if no one has ever looked at you with loving eyes or broken out in a smile when she sees you; if no one has rushed to help you (but instead said, “Stop crying, or I’ll give you something to cry about”), then you need to discover other ways of taking care of yourself. You are likely to experiment with anything—drugs, alcohol, binge eating, or cutting—that offers some kind of relief.

While Sherry dutifully came to every appointment and answered my questions with great sincerity, I did not feel we were making the sort of vital connection that is necessary for therapy to work. Struck by how frozen and uptight she was, I suggested that she see Liz, a massage therapist I had worked with previously. During their first meeting Liz positioned Sherry on the massage table, then moved to the end of the table and gently held Sherry’s feet. Lying there with her eyes closed, Sherry suddenly yelled in a panic: “Where are you?” Somehow Sherry had lost track of Liz, even though Liz was right there, with her hands on Sherry’s feet.

Sherry was one of the first patients who taught me about the extreme disconnection from the body that so many people with histories of trauma and neglect experience. I discovered that my professional training, with its focus on understanding and insight, had largely ignored the relevance of the living, breathing body, the foundation of our selves. Sherry knew that picking her skin was a destructive thing to do and that it was related to her mother’s neglect, but understanding the source of the impulse made no difference in helping her control it.

LOSING YOUR BODY
Once I was alerted to this, I was amazed to discover how many of my patients told me they could not feel whole areas of their bodies. Sometimes I’d ask them to close their eyes and tell me what I had put into their outstretched hands. Whether it was a car key, a quarter, or a can opener, they often could not even guess what they were holding—their sensory perceptions simply weren’t working. I talked this over with my friend Alexander McFarlane in Australia, who had observed the same phenomenon. In his laboratory in Adelaide he had studied the question: How do we know without looking at it that we’re holding a car key? Recognizing an object in the palm of your hand requires sensing its shape, weight, temperature, texture, and position. Each of those distinct sensory experiences is transmitted to a different part of the brain, which then needs to integrate them into a single perception. McFarlane found that people with PTSD often have trouble putting the picture together.

When our senses become muffled, we no longer feel fully alive. In an article called “What Is an Emotion?” (1884), William James, the father of American psychology, reported a striking case of “sensory insensibility” in a woman he interviewed: “I have... no human sensations,” she told him. “[I am] surrounded by all that can render life happy and agreeable, still to me the faculty of enjoyment and of feeling is wanting.... Each of my senses, each part of my proper self, is as it were separated from me and can no longer afford me any feeling; this impossibility seems to depend upon a void which I feel in the front of my head, and to be due to the diminution of the sensibility over the whole surface of my body, for it seems to me that I never actually reach the objects which I touch. All this would be a small matter enough, but for its frightful result, which is that of the impossibility of any other kind of feeling and of any sort of enjoyment, although I experience a need and desire of them that render my life an incomprehensible torture.”

This response to trauma raises an important question: How can traumatized people learn to integrate ordinary sensory experiences so that they can live with the natural flow of feeling and feel secure and complete in their bodies?

HOW DO WE KNOW WE’RE ALIVE?
Most early neuroimaging studies of traumatized people were like those we’ve seen in chapter 3; they focused on how subjects reacted to specific reminders of the trauma. Then, in 2004, my colleague Ruth Lanius, who scanned Stan and Ute Lawrence’s brains, posed a new question: What happens in the brains of trauma survivors when they are not thinking about the past? Her studies on the idling brain, the “default state network” (DSN), opened up a whole new chapter in understanding how trauma affects self-awareness, specifically sensory self-awareness.

Dr. Lanius recruited a group of sixteen “normal” Canadians to lie in a brain scanner while thinking about nothing in particular. This is not easy for anyone to do—as long as we are awake, our brains are churning—but she asked them to focus their attention on their breathing and try to empty their minds as much as possible. She then repeated the same experiment with eighteen people who had histories of severe, chronic childhood abuse.

What is your brain doing when you have nothing in particular on your mind? It turns out that you pay attention to yourself: The default state activates the brain areas that work together to create your sense of “self.” When Ruth looked at the scans of her normal subjects, she found activation of DSN regions that previous researchers had described. I like to call this the Mohawk of self-awareness, the midline structures of the brain, starting out right above our eyes, running through the center of the brain all the way to the back. All these midline structures are involved in our sense of self. The largest bright region at the back of the brain is the posterior cingulate, which gives us a physical sense of where we are - our internal GPS. It is strongly connected to the medial prefrontal cortex (MPFC), the watchtower I discussed in chapter 4. (This connection doesn't show up on the scan because the fMRI can't measure it.) It is also connected with brain areas that register sensations coming from the rest of the body: the insula, which relays messages from the viscera to the emotional centers; the parietal lobes, which integrate sensory information; and the anterior cingulate, which coordinates emotions and thinking. All of these areas contribute to consciousness.

[Figure: Locating the self. The Mohawk of self-awareness. Starting from the front of the brain (at right), this consists of: the orbital prefrontal cortex, the medial prefrontal cortex, the anterior cingulate, the posterior cingulate, and the insula. In individuals with histories of chronic trauma the same regions show sharply decreased activity, making it difficult to register internal states and to assess the personal relevance of incoming information.]

The contrast with the scans of the eighteen chronic PTSD patients with severe early-life trauma was startling. There was almost no activation of any of the self-sensing areas of the brain: The MPFC, the anterior cingulate, the parietal cortex, and the insula did not light up at all; the only area that showed a slight activation was the posterior cingulate, which is responsible for basic orientation in space. There could be only one explanation for such results: In response to the trauma itself, and in coping with the dread that persisted long afterward, these patients had learned to shut down the brain areas that transmit the visceral feelings and emotions that accompany and define terror.
Yet in everyday life, those same brain areas are responsible for registering the entire range of emotions and sensations that form the foundation of our self-awareness, our sense of who we are. What we witnessed here was a tragic adaptation: In an effort to shut off terrifying sensations, they also deadened their capacity to feel fully alive. The disappearance of medial prefrontal activation could explain why so many traumatized people lose their sense of purpose and direction. I used to be surprised by how often my patients asked me for advice about the most ordinary things, and then by how rarely they followed it. Now I understood that their relationship with their own inner reality was impaired. How could they make decisions, or put any plan into action, if they couldn’t define what they wanted or, to be more precise, what the sensations in their bodies, the basis of all emotions, were trying to tell them?

The lack of self-awareness in victims of chronic childhood trauma is sometimes so profound that they cannot recognize themselves in a mirror. Brain scans show that this is not the result of mere inattention: The structures in charge of self-recognition may be knocked out along with the structures related to self-experience. When Ruth Lanius showed me her study, a phrase from my classical high school education came back to me. The mathematician Archimedes, teaching about the lever, is supposed to have said: “Give me a place to stand and I will move the world.” Or, as the great twentieth-century body therapist Moshe Feldenkrais put it: “You can't do what you want till you know what you’re doing.” The implications are clear: to feel present you have to know where you are and be aware of what is going on with you. If the self-sensing system breaks down, we need to find ways to reactivate it.

THE SELF-SENSING SYSTEM
It was fascinating to see how much Sherry benefited from her massage therapy. She felt more relaxed and adventurous in her day-to-day life and she was also more relaxed and open with me. She became truly involved in her therapy and was genuinely curious about her behavior, thoughts, and feelings. She stopped picking at her skin, and when summer came she started to spend evenings sitting outside on her stoop, chatting with her neighbors. She even joined a church choir, a wonderful experience of group synchrony. It was at about this time that I met Antonio Damasio at a small think tank that Dan Schacter, the chair of the psychology department at Harvard, had organized. In a series of brilliant scientific articles and books Damasio clarified the relationship among body states, emotions, and survival. A neurologist who has treated hundreds of people with various forms of brain damage, he became fascinated with consciousness and with identifying the areas of the brain necessary for knowing what you feel. He has devoted his career to mapping out what is responsible for our experience of “self.” The Feeling of What Happens is, for me, his most important book, and reading it was a revelation.5 Damasio starts by pointing out the deep divide between our sense of self and the sensory life of our bodies. As he poetically explains, “Sometimes we use our minds not to discover facts, but to hide them.... One of the things the screen hides most effectively is the body, our own body, by which I mean the ins of it, its interiors. Like a veil thrown over the skin to secure its modesty, the screen partially removes from the mind the inner states of the body, those that constitute the flow of life as it wanders in the journey of each day.” He goes on to describe how this “screen” can work in our favor by enabling us to attend to pressing problems in the outside world. Yet it has a cost: "It tends to prevent us from sensing the possible origin and nature of what we call self." Building on the century-old work of William James, Damasio argues that the core of our self-awareness rests on the physical sensations that convey the inner states of the body: [P]rimordial feelings provide a direct experience of one’s own living body, wordless, unadorned, and connected to nothing but sheer existence. These primordial feelings reflect the current state of the body along varied dimensions,... along the scale that ranges from pleasure to pain, and they originate at the level of the brain stem rather than the cerebral cortex. All feelings of emotion are complex musical variations on primordial feelings. Our sensory world takes shape even before we are born. In the womb we feel amniotic fluid against our skin, we hear the faint sounds of rushing blood and a digestive tract at work, we pitch and roll with our mother’s movements. After birth, physical sensation defines our relationship to ourselves and to our surroundings. We start off being our wetness, hunger, satiation, and sleepiness. A cacophony of incomprehensible sounds and images presses in on our pristine nervous system. Even after we acquire consciousness and language, our bodily sensing system provides crucial feedback on our moment-to-moment condition. Its constant hum communicates changes in our viscera and in the muscles of our face, torso, and extremities that signal pain and comfort, as well as urges such as hunger and sexual arousal. What is taking place around us also affects our physical sensations. 
Seeing someone we recognize, hearing particular sounds—a piece of music, a siren—or sensing a shift in temperature all change our focus of attention and, without our being aware of it, prime our subsequent thoughts and actions. As we have seen, the job of the brain is to constantly monitor and evaluate what is going on within and around us. These evaluations are transmitted by chemical messages in the bloodstream and electrical messages in our nerves, causing subtle or dramatic changes throughout the body and brain. These shifts usually occur entirely without conscious input or awareness: The subcortical regions of the brain are astoundingly efficient in regulating our breathing, heartbeat, digestion, hormone secretion, and immune system. However, these systems can become overwhelmed if we are challenged by an ongoing threat, or even the perception of threat. This accounts for the wide array of physical problems researchers have documented in traumatized people.

Yet our conscious self also plays a vital role in maintaining our inner equilibrium: We need to register and act on our physical sensations to keep our bodies safe. Realizing we’re cold compels us to put on a sweater; feeling hungry or spacey tells us our blood sugar is low and spurs us to get a snack; the pressure of a full bladder sends us to the bathroom. Damasio points out that all of the brain structures that register background feelings are located near areas that control basic housekeeping functions, such as breathing, appetite, elimination, and sleep/wake cycles: “This is because the consequences of having emotion and attention are entirely related to the fundamental business of managing life within the organism. It is not possible to manage life and maintain homeostatic balance without data on the current state of the organism’s body.”9 Damasio calls these housekeeping areas of the brain the “proto-self,” because they create the “wordless knowledge” that underlies our conscious sense of self.

THE SELF UNDER THREAT
In 2000 Damasio and his colleagues published an article in the world’s foremost scientific publication, Science, which reported that reliving a strong negative emotion causes significant changes in the brain areas that receive nerve signals from the muscles, gut, and skin—areas that are crucial for regulating basic bodily functions. The team’s brain scans showed that recalling an emotional event from the past causes us to actually reexperience the visceral sensations felt during the original event. Each type of emotion produced a characteristic pattern, distinct from the others. For example, a particular part of the brain stem was “active in sadness and anger, but not in happiness or fear.”10 All of these brain regions are below the limbic system, to which emotions are traditionally assigned, yet we acknowledge their involvement every time we use one of the common expressions that link strong emotions with the body: “You make me sick”; “It made my skin crawl”; “I was all choked up”; “My heart sank”; “He makes me bristle.”

The elementary self system in the brain stem and limbic system is massively activated when people are faced with the threat of annihilation, which results in an overwhelming sense of fear and terror accompanied by intense physiological arousal. To people who are reliving a trauma, nothing makes sense; they are trapped in a life-or-death situation, a state of paralyzing fear or blind rage. Mind and body are constantly aroused, as if they are in imminent danger. They startle in response to the slightest noises and are frustrated by small irritations. Their sleep is chronically disturbed, and food often loses its sensual pleasures. This in turn can trigger desperate attempts to shut those feelings down by freezing and dissociation. How do people regain control when their animal brains are stuck in a fight for survival? If what goes on deep inside our animal brains dictates how we feel, and if our body sensations are orchestrated by subcortical (subconscious) brain structures, how much control over them can we actually have?

AGENCY: OWNING YOUR LIFE
“Agency” is the technical term for the feeling of being in charge of your life: knowing where you stand, knowing that you have a say in what happens to you, knowing that you have some ability to shape your circumstances. The veterans who put their fists through drywall at the VA were trying to assert their agency—to make something happen. But they ended up feeling even more out of control, and many of these once-confident men were trapped in a cycle between frantic activity and immobility.

Agency starts with what scientists call interoception, our awareness of our subtle sensory, body-based feelings: the greater that awareness, the greater our potential to control our lives. Knowing what we feel is the first step to knowing why we feel that way. If we are aware of the constant changes in our inner and outer environment, we can mobilize to manage them. But we can’t do this unless our watchtower, the MPFC, learns to observe what is going on inside us. This is why mindfulness practice, which strengthens the MPFC, is a cornerstone of recovery from trauma.12

After I saw the wonderful movie March of the Penguins, I found myself thinking about some of my patients. The penguins are stoic and endearing, and it’s tragic to learn how, from time immemorial, they have trudged seventy miles inland from the sea, endured indescribable hardships to reach their breeding grounds, lost numerous viable eggs to exposure, and then, almost starving, dragged themselves back to the ocean. If penguins had our frontal lobes, they would have used their little flippers to build igloos, devised a better division of labor, and reorganized their food supplies. Many of my patients have survived trauma through tremendous courage and persistence, only to get into the same kinds of trouble over and over again. Trauma has shut down their inner compass and robbed them of the imagination they need to create something better.

The neuroscience of selfhood and agency validates the kinds of somatic therapies that my friends Peter Levine13 and Pat Ogden14 have developed. I’ll discuss these and other sensorimotor approaches in more detail in part V, but in essence their aim is threefold: (1) to draw out the sensory information that is blocked and frozen by trauma; (2) to help patients befriend (rather than suppress) the energies released by that inner experience; (3) to complete the self-preserving physical actions that were thwarted when they were trapped, restrained, or immobilized by terror.

Our gut feelings signal what is safe, life sustaining, or threatening, even if we cannot quite explain why we feel a particular way. Our sensory interiority continuously sends us subtle messages about the needs of our organism. Gut feelings also help us to evaluate what is going on around us. They warn us that the guy who is approaching feels creepy, but they also convey that a room with western exposure surrounded by daylilies makes us feel serene. If you have a comfortable connection with your inner sensations—if you can trust them to give you accurate information—you will feel in charge of your body, your feelings, and your self. However, traumatized people chronically feel unsafe inside their bodies: The past is alive in the form of gnawing interior discomfort. Their bodies are constantly bombarded by visceral warning signs, and, in an attempt to control these processes, they often become expert at ignoring their gut feelings and in numbing awareness of what is played out inside. They learn to hide from their selves.
The more people try to push away and ignore internal warning signs, the more likely those signals are to take over, leaving them bewildered, confused, and ashamed. People who cannot comfortably notice what is going on inside become vulnerable to respond to any sensory shift either by shutting down or by going into a panic—they develop a fear of fear itself.

We now know that panic symptoms are maintained largely because the individual develops a fear of the bodily sensations associated with panic attacks. The attack may be triggered by something he or she knows is irrational, but fear of the sensations keeps them escalating into a full-body emergency. “Scared stiff” and “frozen in fear” (collapsing and going numb) describe precisely what terror and trauma feel like. They are its visceral foundation. The experience of fear derives from primitive responses to threat where escape is thwarted in some way. People’s lives will be held hostage to fear until that visceral experience changes.

The price for ignoring or distorting the body’s messages is being unable to detect what is truly dangerous or harmful for you and, just as bad, what is safe or nourishing. Self-regulation depends on having a friendly relationship with your body. Without it you have to rely on external regulation—from medication, drugs like alcohol, constant reassurance, or compulsive compliance with the wishes of others.

Many of my patients respond to stress not by noticing and naming it but by developing migraine headaches or asthma attacks.15 Sandy, a middle-aged visiting nurse, told me she’d felt terrified and lonely as a child, unseen by her alcoholic parents. She dealt with this by becoming deferential to everybody she depended on (including me, her therapist). Whenever her husband made an insensitive remark, she would come down with an asthma attack. By the time she noticed that she couldn’t breathe, it was too late for an inhaler to be effective, and she had to be taken to the emergency room.

Suppressing our inner cries for help does not stop our stress hormones from mobilizing the body. Even though Sandy had learned to ignore her relationship problems and block out her physical distress signals, they showed up in symptoms that demanded her attention. Her therapy focused on identifying the link between her physical sensations and her emotions, and I also encouraged her to enroll in a kickboxing program. She had no emergency room visits during the three years she was my patient.

Somatic symptoms for which no clear physical basis can be found are ubiquitous in traumatized children and adults. They can include chronic back and neck pain, fibromyalgia, migraines, digestive problems, spastic colon/irritable bowel syndrome, chronic fatigue, and some forms of asthma.16 Traumatized children have fifty times the rate of asthma as their nontraumatized peers. Studies have shown that many children and adults with fatal asthma attacks were not aware of having breathing problems before the attacks.

ALEXITHYMIA: NO WORDS FOR FEELINGS
I had a widowed aunt with a painful trauma history who became an honorary grandmother to our children. She came on frequent visits that were marked by much doing—making curtains, rearranging kitchen shelves, sewing children’s clothes—and very little talking. She was always eager to please, but it was difficult to figure out what she enjoyed. After several days of exchanging pleasantries, conversation would come to a halt, and I’d have to work hard to fill the long silences. On the last day of her visits I’d drive her to the airport, where she’d give me a stiff good-bye hug while tears streamed down her face. Without a trace of irony she’d then complain that the cold wind at Logan International Airport made her eyes water. Her body felt the sadness that her mind couldn’t register—she was leaving our young family, her closest living relatives. Psychiatrists call this phenomenon alexithymia—Greek for not having words for feelings. Many traumatized children and adults simply cannot describe what they are feeling because they cannot identify what their physical sensations mean. They may look furious but deny that they are angry; they may appear terrified but say that they are fine. Not being able to discern what is going on inside their bodies causes them to be out of touch with their needs, and they have trouble taking care of themselves, whether it involves eating the right amount at the right time or getting the sleep they need. Like my aunt, alexithymics substitute the language of action for that of emotion. When asked, “How would you feel if you saw a truck coming at you at eighty miles per hour?” most people would say, “I’d be terrified” or “I’d be frozen with fear.” An alexithymic might reply, “How would I feel? I don’t know. . . . I’d get out of the way.”18 They tend to register emotions as physical problems rather than as signals that something deserves their attention. Instead of feeling angry or sad, they experience muscle pain, bowel irregularities, or other symptoms for which no cause can be found. About three quarters of patients with anorexia nervosa, and more than half of all patients with bulimia, are bewildered by their emotional feelings and have great difficulty describing them.19 When researchers showed pictures of angry or distressed faces to people with alexithymia, they could not figure out what those people were feeling. One of the first people who taught me about alexithymia was the psychiatrist Henry Krystal, who worked with more than a thousand Holocaust survivors in his effort to understand massive psychic trauma. Krystal, himself a concentration camp survivor, found that many of his patients were professionally successful, but their intimate relationships were bleak and distant. Suppressing their feelings had made it possible to attend to the business of the world, but at a price. They learned to shut down their once overwhelming emotions, and, as a result, they no longer recognized what they were feeling. Few of them had any interest in therapy. Paul Frewen at the University of Western Ontario did a series of brain scans of people with PTSD who suffered from alexithymia. One of the participants told him: “I don’t know what I feel, it’s like my head and body aren’t connected. I’m living in a tunnel, a fog, no matter what happens it’s the same reaction—numbness, nothing. Having a bubble bath and being burned or raped is the same feeling. 
My brain doesn’t feel.” Frewen and his colleague Ruth Lanius found that the more people were out of touch with their feelings, the less activity they had in the self-sensing areas of the brain.

Because traumatized people often have trouble sensing what is going on in their bodies, they lack a nuanced response to frustration. They either react to stress by becoming “spaced out” or with excessive anger. Whatever their response, they often can’t tell what is upsetting them. This failure to be in touch with their bodies contributes to their well-documented lack of self-protection and high rates of revictimization23 and also to their remarkable difficulties feeling pleasure, sensuality, and having a sense of meaning.

People with alexithymia can get better only by learning to recognize the relationship between their physical sensations and their emotions, much as colorblind people can only enter the world of color by learning to distinguish and appreciate shades of gray. Like my aunt and Henry Krystal’s patients, they usually are reluctant to do that: Most seem to have made an unconscious decision that it is better to keep visiting doctors and treating ailments that don’t heal than to do the painful work of facing the demons of the past.

DEPERSONALIZATION
One step further down on the ladder to self-oblivion is depersonalization—losing your sense of yourself. Ute’s brain scan in chapter 4 is, in its very blankness, a vivid illustration of depersonalization. Depersonalization is common during traumatic experiences. I was once mugged late at night in a park close to my home and, floating above the scene, saw myself lying in the snow with a small head wound, surrounded by three knife-wielding teenagers. I dissociated the pain of their stab wounds on my hands and did not feel the slightest fear as I calmly negotiated for the return of my emptied wallet. I did not develop PTSD, partly, I think, because I was intensely curious about having an experience I had studied so closely in others, and partly because I had the delusion that I would be able to make a drawing of my muggers to show to the police. Of course, they were never caught, but my fantasy of revenge must have given me a satisfying sense of agency.

Traumatized people are not so fortunate and feel separated from their bodies. One particularly good description of depersonalization comes from the German psychoanalyst Paul Schilder, writing in Berlin in 1928:24 “To the depersonalized individual the world appears strange, peculiar, foreign, dream-like. Objects appear at times strangely diminished in size, at times flat. Sounds appear to come from a distance. . . . The emotions likewise undergo marked alteration. Patients complain that they are capable of experiencing neither pain nor pleasure. . . . They have become strangers to themselves.”

I was fascinated to learn that a group of neuroscientists at the University of Geneva25 had induced similar out-of-body experiences by delivering mild electric current to a specific spot in the brain, the temporal parietal junction. In one patient this produced a sensation that she was hanging from the ceiling, looking down at her body; in another it induced an eerie feeling that someone was standing behind her. This research confirms what our patients tell us: that the self can be detached from the body and live a phantom existence on its own. Similarly, Lanius and Frewen, as well as a group of researchers at the University of Groningen in the Netherlands, did brain scans on people who dissociated their terror and found that the fear centers of the brain simply shut down as they recalled the event.

BEFRIENDING THE BODY
Trauma victims cannot recover until they become familiar with and befriend the sensations in their bodies. Being frightened means that you live in a body that is always on guard. Angry people live in angry bodies. The bodies of child-abuse victims are tense and defensive until they find a way to relax and feel safe. In order to change, people need to become aware of their sensations and the way that their bodies interact with the world around them. Physical self-awareness is the first step in releasing the tyranny of the past.

How can people open up to and explore their internal world of sensations and emotions? In my practice I begin the process by helping my patients to first notice and then describe the feelings in their bodies—not emotions such as anger or anxiety or fear but the physical sensations beneath the emotions: pressure, heat, muscular tension, tingling, caving in, feeling hollow, and so on. I also work on identifying the sensations associated with relaxation or pleasure. I help them become aware of their breath, their gestures and movements. I ask them to pay attention to subtle shifts in their bodies, such as tightness in their chests or gnawing in their bellies, when they talk about negative events that they claim did not bother them.

Noticing sensations for the first time can be quite distressing, and it may precipitate flashbacks in which people curl up or assume defensive postures. These are somatic reenactments of the undigested trauma and most likely represent the postures they assumed when the trauma occurred. Images and physical sensations may deluge patients at this point, and the therapist must be familiar with ways to stem torrents of sensation and emotion to prevent them from becoming retraumatized by accessing the past. (Schoolteachers, nurses, and police officers are often very skilled at soothing terror reactions because many of them are confronted almost daily with out-of-control or painfully disorganized people.) All too often, however, drugs such as Abilify, Zyprexa, and Seroquel are prescribed instead of teaching people the skills to deal with such distressing physical reactions. Of course, medications only blunt sensations and do nothing to resolve them or transform them from toxic agents into allies.

The most natural way for human beings to calm themselves when they are upset is by clinging to another person. This means that patients who have been physically or sexually violated face a dilemma: They desperately crave touch while simultaneously being terrified of body contact. The mind needs to be reeducated to feel physical sensations, and the body needs to be helped to tolerate and enjoy the comforts of touch. Individuals who lack emotional awareness are able, with practice, to connect their physical sensations to psychological events. Then they can slowly reconnect with themselves.

CONNECTING WITH YOURSELF, CONNECTING WITH OTHERS
I’ll end this chapter with one final study that demonstrates the cost of losing your body. After Ruth Lanius and her group scanned the idling brain, they focused on another question from everyday life: What happens in chronically traumatized people when they make face-to-face contact?

Many patients who come to my office are unable to make eye contact. I immediately know how distressed they are by their difficulty meeting my gaze. It always turns out that they feel disgusting and that they can’t stand having me see how despicable they are. It never occurred to me that these intense feelings of shame would be reflected in abnormal brain activation. Ruth Lanius once again showed that mind and brain are indistinguishable—what happens in one is registered in the other.

Ruth bought an expensive device that presents a video character to a person lying in a scanner. (In this case, the cartoon resembled a kindly Richard Gere.) The figure can approach either head on (looking directly at the person) or at a forty-five-degree angle with an averted gaze. This made it possible to compare the effects of direct eye contact on brain activation with those of an averted gaze.

The most striking difference between normal controls and survivors of chronic trauma was in activation of the prefrontal cortex in response to a direct eye gaze. The prefrontal cortex (PFC) normally helps us to assess the person coming toward us, and our mirror neurons help to pick up his intentions. However, the subjects with PTSD did not activate any part of their frontal lobe, which means they could not muster any curiosity about the stranger. They just reacted with intense activation deep inside their emotional brains, in the primitive areas known as the periaqueductal gray, which generates startle, hypervigilance, cowering, and other self-protective behaviors. There was no activation of any part of the brain involved in social engagement. In response to being looked at they simply went into survival mode.

What does this mean for their ability to make friends and get along with others? What does it mean for their therapy? Can people with PTSD trust a therapist with their deepest fears? To have genuine relationships you have to be able to experience others as separate individuals, each with his or her particular motivations and intentions. While you need to be able to stand up for yourself, you also need to recognize that other people have their own agendas. Trauma can make all that hazy and gray.

Reference: Chapter 6 of the book by Bessel van der Kolk titled "The Body Keeps the Score" (2022)
Wednesday, June 1, 2022
Losing your body, losing your self (from the book: Body Keeps The Score)
Creating environment and kernel for sentence_transformers
Tags: Machine Learning, Natural Language Processing, Python, Technology

ENV.YML File
name: sentence_transformers
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pandas
  - openpyxl
  - ipykernel
  - jupyter
  - tensorflow
  - pip:
    - sentence-transformers

Installation Logs
(base) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter\sentence_encoding_transformers>conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.11.0
  latest version: 4.13.0
Please update conda by running
    $ conda update -n base conda

Downloading and Extracting Packages
qt-main-5.15.2       | 67.6 MB | #### | 100%
nltk-3.6.7           | 1.1 MB  | #### | 100%
gst-plugins-base-1.1 | 2.2 MB  | #### | 100%
pip-22.1.2           | 1.5 MB  | #### | 100%
libclang-13.0.1      | 23.7 MB | #### | 100%
gstreamer-1.18.5     | 2.1 MB  | #### | 100%
lz4-c-1.9.3          | 135 KB  | #### | 100%
toml-0.10.2          | 18 KB   | #### | 100%
pyqt-5.15.4          | 4.7 MB  | #### | 100%
libogg-1.3.4         | 34 KB   | #### | 100%
sip-6.5.1            | 417 KB  | #### | 100%
pcre-8.45            | 518 KB  | #### | 100%
zstd-1.5.2           | 1008 KB | #### | 100%
gettext-0.19.8.1     | 4.9 MB  | #### | 100%
pyqt5-sip-12.9.0     | 72 KB   | #### | 100%
sentence-transformer | 82 KB   | #### | 100%
libglib-2.70.2       | 3.1 MB  | #### | 100%
libvorbis-1.3.7      | 267 KB  | #### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: / Enabling notebook extension jupyter-js-widgets/extension...
      - Validating: ok
done
#
# To activate this environment, use
#
#     $ conda activate sentence_transformers
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter\sentence_encoding_transformers>conda activate sentence_transformers

(sentence_transformers) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter\sentence_encoding_transformers>python -m ipykernel install --user --name sentence_transformers
Installed kernelspec sentence_transformers in C:\Users\Ashish Jain\AppData\Roaming\jupyter\kernels\sentence_transformers

(sentence_transformers) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter\sentence_encoding_transformers>
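As a quick sanity check after the kernel is registered (this snippet is not part of the original log; the printed version numbers are machine-dependent), something like the following can be run inside the new sentence_transformers kernel to confirm the packages resolve:

# Hypothetical verification cell; assumes the kernel created above is selected in Jupyter.
import sentence_transformers
import tensorflow as tf

print("sentence-transformers version:", sentence_transformers.__version__)
print("tensorflow version:", tf.__version__)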
DistilBERT based Sentence Encoding Transformer
from sentence_transformers import SentenceTransformer

sbert_model = SentenceTransformer('bert-base-nli-mean-tokens')
# timeout Traceback (most recent call last)
# timeout: The read operation timed out
# ConnectionError: HTTPSConnectionPool(host='cdn-lfs.huggingface.co', port=443): Read timed out.

bert-base-nli-mean-tokens model size is roughly: 450 MB.

model = SentenceTransformer('distilbert-base-nli-mean-tokens')
Downloading: 100% 265M/265M [55:31<00:00, 75.6kB/s]
Downloading: 100% 53.0/53.0 [00:00<00:00, 1.41kB/s]
Downloading: 100% 112/112 [00:00<00:00, 2.99kB/s]
Downloading: 100% 466k/466k [00:11<00:00, 80.0kB/s]
Downloading: 100% 450/450 [00:00<00:00, 14.5kB/s]
Downloading: 100% 232k/232k [00:03<00:00, 76.7kB/s]

query = "I had pizza and pasta"
query_vec = model.encode([query])[0]
len(query_vec)
768

input_sentence_1 = "In recent years, a lot of hype has developed around the promise of neural networks and their ability to classify and identify input data, and more recently the ability of certain network architectures to generate original content. Companies large and small are using them for everything from image captioning and self-driving car navigation to identifying solar panels from satellite images and recognizing faces in security camera videos. And luckily for us, many NLP applications of neural nets exist as well. While deep neural networks have inspired a lot of hype and hyperbole, our robot overlords are probably further off than any clickbait cares to admit. Neural networks are, however, quite powerful tools, and you can easily use them in an NLP chatbot pipeline to classify input text, summarize documents, and even generate novel works. This chapter is intended as a primer for those with no experience in neural networks. We don’t cover anything specific to NLP in this chapter, but gaining a basic understanding of what is going on under the hood in a neural network is important for the upcoming chapters. If you’re familiar with the basics of a neural network, you can rest easy in skipping ahead to the next chapter, where you dive back into processing text with the various flavors of neural nets. Although the mathematics of the underlying algorithm, backpropagation, are outside this book’s scope, a high-level grasp of its basic functionality will help you understand language and the patterns hidden within. As the availability of processing power and memory has exploded over the course of the decade, an old technology has come into its own again. First proposed in the 1950s by Frank Rosenblatt, the perceptron1 offered a novel algorithm for finding patterns in data. The basic concept lies in a rough mimicry of the operation of a living neuron cell. As electrical signals flow into the cell through the dendrites (see figure 5.1) into the nucleus, an electric charge begins to build up. When the cell reaches a certain level of charge, it fires, sending an electrical signal out through the axon. However, the dendrites aren’t all created equal. The cell is more “sensitive” to signals through certain dendrites than others, so it takes less of a signal in those paths to fire the axon. The biology that controls these relationships is most certainly beyond the scope of this book, but the key concept to notice here is the way the cell weights incoming signals when deciding when to fire. The neuron will dynamically change those weights in the decision making process over the course of its life. You are going to mimic that process."
print("Char count in input_sentence_1:", len(input_sentence_1))
print("Word Count in input_sentence_1:", len(input_sentence_1.split(" ")))
Char count in input_sentence_1: 2658
Word Count in input_sentence_1: 442

input_sentence_1_vec = model.encode([input_sentence_1])[0]
type(input_sentence_1_vec)
numpy.ndarray
input_sentence_1_vec.shape
(768,)

Tags: Machine Learning, Natural Language Processing, Python, Technology
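A common next step with these embeddings is to compare them. The sketch below (not part of the original post) assumes model, query_vec, and input_sentence_1_vec from the cells above are still in scope and computes plain cosine similarity with numpy; values closer to 1.0 mean the two texts are more similar in the embedding space.

# Sketch: cosine similarity between the short query and the long passage encoded above.
import numpy as np

def cosine_similarity(a, b):
    # Cosine similarity between two 1-D numpy vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(query_vec, input_sentence_1_vec))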
Tuesday, May 31, 2022
RuntimeError for input token sequence longer than 512 tokens for BERT
import transformers as ppb import torch import numpy as np print(ppb.__version__) 4.19.2 input_sentence_1 = "In recent years, a lot of hype has developed around the promise of neural networks and their ability to classify and identify input data, and more recently the ability of certain network architectures to generate original content. Companies large and small are using them for everything from image captioning and self-driving car navigation to identifying solar panels from satellite images and recognizing faces in security camera videos. And luckily for us, many NLP applications of neural nets exist as well. While deep neural networks have inspired a lot of hype and hyperbole, our robot overlords are probably further off than any clickbait cares to admit. Neural networks are, however, quite powerful tools, and you can easily use them in an NLP chatbot pipeline to classify input text, summarize documents, and even generate novel works. This chapter is intended as a primer for those with no experience in neural networks. We don’t cover anything specific to NLP in this chapter, but gaining a basic understanding of what is going on under the hood in a neural network is important for the upcoming chapters. If you’re familiar with the basics of a neural network, you can rest easy in skipping ahead to the next chapter, where you dive back into processing text with the various flavors of neural nets. Although the mathematics of the underlying algorithm, backpropagation, are outside this book’s scope, a high-level grasp of its basic functionality will help you understand language and the patterns hidden within. As the availability of processing power and memory has exploded over the course of the decade, an old technology has come into its own again. First proposed in the 1950s by Frank Rosenblatt, the perceptron1 offered a novel algorithm for finding patterns in data. The basic concept lies in a rough mimicry of the operation of a living neuron cell. As electrical signals flow into the cell through the dendrites (see figure 5.1) into the nucleus, an electric charge begins to build up. When the cell reaches a certain level of charge, it fires, sending an electrical signal out through the axon. However, the dendrites aren’t all created equal. The cell is more “sensitive” to signals through certain dendrites than others, so it takes less of a signal in those paths to fire the axon." print(input_sentence_1) print("Char count", len(input_sentence_1)) print("Word Count:", len(input_sentence_1.split(" "))) In recent years, a lot of hype has developed around the promise of neural networks and their ability to classify and identify input data, and more recently the ability of certain network architectures to generate original content. Companies large and small are using them for everything from image captioning and self-driving car navigation to identifying solar panels from satellite images and recognizing faces in security camera videos. And luckily for us, many NLP applications of neural nets exist as well. While deep neural networks have inspired a lot of hype and hyperbole, our robot overlords are probably further off than any clickbait cares to admit. Neural networks are, however, quite powerful tools, and you can easily use them in an NLP chatbot pipeline to classify input text, summarize documents, and even generate novel works. This chapter is intended as a primer for those with no experience in neural networks. 
We don’t cover anything specific to NLP in this chapter, but gaining a basic understanding of what is going on under the hood in a neural network is important for the upcoming chapters. If you’re familiar with the basics of a neural network, you can rest easy in skipping ahead to the next chapter, where you dive back into processing text with the various flavors of neural nets. Although the mathematics of the underlying algorithm, backpropagation, are outside this book’s scope, a high-level grasp of its basic functionality will help you understand language and the patterns hidden within. As the availability of processing power and memory has exploded over the course of the decade, an old technology has come into its own again. First proposed in the 1950s by Frank Rosenblatt, the perceptron1 offered a novel algorithm for finding patterns in data. The basic concept lies in a rough mimicry of the operation of a living neuron cell. As electrical signals flow into the cell through the dendrites (see figure 5.1) into the nucleus, an electric charge begins to build up. When the cell reaches a certain level of charge, it fires, sending an electrical signal out through the axon. However, the dendrites aren’t all created equal. The cell is more “sensitive” to signals through certain dendrites than others, so it takes less of a signal in those paths to fire the axon. Char count 2309 Word Count: 382 input_sentence_2 = "The biology that controls these relationships is most certainly beyond the scope of this book, but the key concept to notice here is the way the cell weights incoming signals when deciding when to fire. The neuron will dynamically change those weights in the decision making process over the course of its life. You are going to mimic that process. Rosenblatt’s original project was to teach a machine to recognize images. The original perceptron was a conglomeration of photo-receptors and potentiometers, not a computer in the current sense. But implementation specifics aside, Rosenblatt’s concept was to take the features of an image and assign a weight, a measure of importance, to each one. The features of the input image were each a small subsection of the image. A grid of photo-receptors would be exposed to the image. Each receptor would see one small piece of the image. The brightness of the image that a particular photoreceptor could see would determine the strength of the signal that it would send to the associated “dendrite.” Each dendrite had an associated weight in the form of a potentiometer. Once enough signal came in, it would pass the signal into the main body of the “nucleus” of the “cell.” Once enough of those signals from all the potentiometers passed a certain threshold, the perceptron would fire down its axon, indicating a positive match on the image it was presented with. If it didn’t fire for a given image, that was a negative classification match. Think “hot dog, not hot dog” or “iris setosa, not iris setosa.” So far there has been a lot of hand waving about biology and electric current and photo-receptors. Let’s pause for a second and peel out the most important parts of this concept. Basically, you’d like to take an example from a dataset, show it to an algorithm, and have the algorithm say yes or no. That’s all you’re doing so far. The first piece you need is a way to determine the features of the sample. Choosing appropriate features turns out to be a surprisingly challenging part of machine learning. 
In “normal” machine learning problems, like predicting home prices, your features might be square footage, last sold price, and ZIP code. Or perhaps you’d like to predict the species of a certain flower using the Iris dataset.2 In that case your features would be petal length, petal width, sepal length, and sepal width. In Rosenblatt’s experiment, the features were the intensity values of each pixel (subsections of the image), one pixel per photo receptor."

print(input_sentence_2)
print("Char count", len(input_sentence_2))
print("Word Count:", len(input_sentence_2.split(" ")))

[output: input_sentence_2 is printed back in full, identical to the string assigned above]
Char count 2518
Word Count: 426

model_class, tokenizer_class, pretrained_weights = (ppb.BertModel, ppb.BertTokenizer, 'bert-base-uncased')
tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
model = model_class.from_pretrained(pretrained_weights)

Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.seq_relationship.weight', 'cls.predictions.decoder.weight', 'cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.bias', 'cls.seq_relationship.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.weight']
- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

def get_embedding(in_list):
    tokenized = [tokenizer.encode(x, add_special_tokens=True) for x in in_list]
    max_len = 0
    for i in tokenized:
        if len(i) > max_len:
            max_len = len(i)
    padded = np.array([i + [0]*(max_len-len(i)) for i in tokenized])
    attention_mask = np.where(padded != 0, 1, 0)
    input_ids = torch.LongTensor(padded)
    attention_mask = torch.tensor(attention_mask)
    with torch.no_grad():
        last_hidden_states = model(input_ids = input_ids, attention_mask = attention_mask)
    features = last_hidden_states[0][:,0,:].numpy()
    return features

string_embeddings = get_embedding([input_sentence_1, input_sentence_2])

Token indices sequence length is longer than the specified maximum sequence length for this model (560 > 512). Running this sequence through the model will result in indexing errors
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Input In [12], in <cell line: 1>()
----> 1 string_embeddings = get_embedding([input_sentence_1, input_sentence_2])

Input In [11], in get_embedding(in_list)
     14 attention_mask = torch.tensor(attention_mask)
     16 with torch.no_grad():
---> 17     last_hidden_states = model(input_ids = input_ids, attention_mask = attention_mask)
     19 features = last_hidden_states[0][:,0,:].numpy()
     20 return features

File E:\programfiles\Anaconda3\envs\transformers\lib\site-packages\torch\nn\modules\module.py:1102, in Module._call_impl(self, *input, **kwargs)
   1098 # If we don't have any hooks, we want to skip the rest of the logic in
   1099 # this function, and just call forward.
   1100 if not (self._backward_hooks or self._forward_hooks or self._forward_pre_hooks or _global_backward_hooks
   1101         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1102     return forward_call(*input, **kwargs)
   1103 # Do not call functions when jit is used
   1104 full_backward_hooks, non_full_backward_hooks = [], []

File E:\programfiles\Anaconda3\envs\transformers\lib\site-packages\transformers\models\bert\modeling_bert.py:983, in BertModel.forward(self, input_ids, attention_mask, token_type_ids, position_ids, head_mask, inputs_embeds, encoder_hidden_states, encoder_attention_mask, past_key_values, use_cache, output_attentions, output_hidden_states, return_dict)
    981 if hasattr(self.embeddings, "token_type_ids"):
    982     buffered_token_type_ids = self.embeddings.token_type_ids[:, :seq_length]
--> 983     buffered_token_type_ids_expanded = buffered_token_type_ids.expand(batch_size, seq_length)
    984     token_type_ids = buffered_token_type_ids_expanded
    985 else:

RuntimeError: The expanded size of the tensor (560) must match the existing size (512) at non-singleton dimension 1. Target sizes: [2, 560]. Tensor sizes: [1, 512]

Tags: Machine Learning, Natural Language Processing, Python, Technology
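The simplest fix for this RuntimeError is to cap every input at BERT's 512-token limit when encoding. The following is a minimal sketch, not part of the original notebook run, that assumes the same tokenizer, model, numpy, and torch objects as above; truncation and max_length are standard Hugging Face tokenizer arguments, and get_embedding_truncated is a hypothetical helper name.

def get_embedding_truncated(in_list, max_length=512):
    # Same logic as get_embedding above, but each text is truncated to
    # max_length tokens so the batch never exceeds BERT's position limit.
    tokenized = [tokenizer.encode(x, add_special_tokens=True, truncation=True, max_length=max_length) for x in in_list]
    max_len = max(len(i) for i in tokenized)
    padded = np.array([i + [0] * (max_len - len(i)) for i in tokenized])
    attention_mask = torch.tensor(np.where(padded != 0, 1, 0))
    input_ids = torch.LongTensor(padded)
    with torch.no_grad():
        last_hidden_states = model(input_ids=input_ids, attention_mask=attention_mask)
    return last_hidden_states[0][:, 0, :].numpy()

string_embeddings = get_embedding_truncated([input_sentence_1, input_sentence_2])
print(string_embeddings.shape)  # expected (2, 768) for bert-base-uncased

The trade-off is that everything past the 512th token is simply ignored, which is acceptable for these two passages but may not be for longer documents.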
Maximum input length test for BertTokenizer
import transformers as ppb
import torch
import numpy as np

print(ppb.__version__)
4.19.2

input_sentence_1 = "In recent years, a lot of hype has developed around the promise of neural networks and their ability to classify and identify input data, and more recently the ability of certain network architectures to generate original content. Companies large and small are using them for everything from image captioning and self-driving car navigation to identifying solar panels from satellite images and recognizing faces in security camera videos. And luckily for us, many NLP applications of neural nets exist as well. While deep neural networks have inspired a lot of hype and hyperbole, our robot overlords are probably further off than any clickbait cares to admit. Neural networks are, however, quite powerful tools, and you can easily use them in an NLP chatbot pipeline to classify input text, summarize documents, and even generate novel works. This chapter is intended as a primer for those with no experience in neural networks. We don’t cover anything specific to NLP in this chapter, but gaining a basic understanding of what is going on under the hood in a neural network is important for the upcoming chapters. If you’re familiar with the basics of a neural network, you can rest easy in skipping ahead to the next chapter, where you dive back into processing text with the various flavors of neural nets. Although the mathematics of the underlying algorithm, backpropagation, are outside this book’s scope, a high-level grasp of its basic functionality will help you understand language and the patterns hidden within. As the availability of processing power and memory has exploded over the course of the decade, an old technology has come into its own again. First proposed in the 1950s by Frank Rosenblatt, the perceptron1 offered a novel algorithm for finding patterns in data. The basic concept lies in a rough mimicry of the operation of a living neuron cell. As electrical signals flow into the cell through the dendrites (see figure 5.1) into the nucleus, an electric charge begins to build up. When the cell reaches a certain level of charge, it fires, sending an electrical signal out through the axon. However, the dendrites aren’t all created equal. The cell is more “sensitive” to signals through certain dendrites than others, so it takes less of a signal in those paths to fire the axon. The biology that controls these relationships is most certainly beyond the scope of this book, but the key concept to notice here is the way the cell weights incoming signals when deciding when to fire. The neuron will dynamically change those weights in the decision making process over the course of its life. You are going to mimic that process."

print(input_sentence_1)
print("Char count", len(input_sentence_1))
print("Word Count:", len(input_sentence_1.split(" ")))

[output: input_sentence_1 is printed back in full, identical to the string assigned above]
Char count 2658
Word Count: 442

model_class, tokenizer_class, pretrained_weights = (ppb.BertModel, ppb.BertTokenizer, 'bert-base-uncased')
tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
model = model_class.from_pretrained(pretrained_weights)

Some weights of the model checkpoint at bert-base-uncased were not used when initializing BertModel: ['cls.predictions.transform.LayerNorm.bias', 'cls.seq_relationship.bias', 'cls.seq_relationship.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.bias', 'cls.predictions.decoder.weight', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.transform.dense.weight']
- This IS expected if you are initializing BertModel from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing BertModel from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).

tokenized = tokenizer.encode(input_sentence_1, add_special_tokens=True)

Token indices sequence length is longer than the specified maximum sequence length for this model (543 > 512). Running this sequence through the model will result in indexing errors

print("First ten tokens:", tokenized[:10])
print("Number of tokens:", len(tokenized))

First ten tokens: [101, 1999, 3522, 2086, 1010, 1037, 2843, 1997, 1044, 18863]
Number of tokens: 543

Tags: Technology, Machine Learning, Natural Language Processing, Python
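If truncating to 512 tokens throws away too much of the text, the token sequence can instead be split into windows that each fit the model. This is a minimal sketch, not from the original post, assuming the tokenizer and input_sentence_1 defined above; chunk_encode is a hypothetical helper name.

def chunk_encode(text, window=510):
    # 510 body tokens plus [CLS] and [SEP] gives 512, the model's maximum.
    ids = tokenizer.encode(text, add_special_tokens=False)
    chunks = []
    for start in range(0, len(ids), window):
        piece = ids[start:start + window]
        chunks.append([tokenizer.cls_token_id] + piece + [tokenizer.sep_token_id])
    return chunks

print("Model limit:", tokenizer.model_max_length)  # 512 for bert-base-uncased
chunks = chunk_encode(input_sentence_1)
print("Chunks:", len(chunks), "with lengths", [len(c) for c in chunks])

Each chunk can then be embedded separately and the chunk embeddings averaged or otherwise pooled, depending on the downstream task.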
Alternate env.yml file for installing Python Package 'transformers' for BERT
Tags: Technology, Artificial Intelligence, Machine Learning, Natural Language Processing, Python

ENV.YML FILE:
name: transformers
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pandas
  - openpyxl
  - ipykernel
  - jupyter
  - tensorflow
  - pip:
    - transformers

(base) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter>conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.11.0
  latest version: 4.12.0

Please update conda by running

    $ conda update -n base conda

Downloading and Extracting Packages
[per-package download progress omitted; the log lists every package in the environment, including python-3.9.13, pytorch-1.10.2, tensorflow-2.6.0, tensorflow-base-2.6, transformers-4.19.2, tokenizers-0.12.1, huggingface_hub-0.7, numpy-1.22.4, pandas-1.4.2, scipy-1.8.1, mkl-2021.4.0, jupyter-1.0.0, notebook-6.4.11, and ipykernel-6.13.0, all at 100%]
Preparing transaction: done
Verifying transaction: done
Executing transaction: /
Enabling notebook extension jupyter-js-widgets/extension...
      - Validating: ok
done
#
# To activate this environment, use
#
#     $ conda activate transformers
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter>conda activate transformers

(transformers) C:\Users\Ashish Jain\OneDrive\Desktop\jupyter>python -m ipykernel install --user --name transformers
Installed kernelspec transformers in C:\Users\Ashish Jain\AppData\Roaming\jupyter\kernels\transformers
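A quick way to confirm the newly registered kernel has everything the BERT examples need is a short import check. This is a minimal sketch, not part of the original post; the package names match the env.yml and install log above, and the torch import can be dropped if PyTorch was not pulled in on your machine.

# Sanity check for the freshly created 'transformers' environment.
import transformers
import tensorflow as tf
import torch
import pandas as pd

print("transformers:", transformers.__version__)
print("tensorflow:", tf.__version__)
print("torch:", torch.__version__)
print("pandas:", pd.__version__)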
Monday, May 30, 2022
Installing Python Package 'transformers' for BERT
Tags: Technology, Artificial Intelligence, Machine Learning, Natural Language Processing, Python

TRIAL 1: Failure
Using Conda Prompt:

conda install -c huggingface transformers

Using YAML file:

name: transformers
channels:
  - conda-forge
dependencies:
  - pip
  - pip:
    - transformers

LOGS:

(base) C:\Users\ash\Desktop>conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.13.0

Please update conda by running

    $ conda update -n base -c defaults conda

Downloading and Extracting Packages
[per-package download progress omitted; the base packages include python-3.10.4, pip-22.1.1, setuptools-62.3.2, openssl-3.0.3, sqlite-3.38.5, and tk-8.6.12, all at 100%]
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: \
Ran pip subprocess with arguments: ['C:\\Users\\ash\\Anaconda3\\envs\\transformers\\python.exe', '-m', 'pip', 'install', '-U', '-r', 'C:\\Users\\ash\\Desktop\\condaenv.xzuashl6.requirements.txt']
Pip subprocess output:
[pip download progress omitted]
Installing collected packages: tokenizers, urllib3, typing-extensions, regex, pyyaml, pyparsing, numpy, idna, filelock, colorama, charset-normalizer, certifi, tqdm, requests, packaging, huggingface-hub, transformers
Successfully installed certifi-2022.5.18.1 charset-normalizer-2.0.12 colorama-0.4.4 filelock-3.7.0 huggingface-hub-0.7.0 idna-3.3 numpy-1.22.4 packaging-21.3 pyparsing-3.0.9 pyyaml-6.0 regex-2022.4.24 requests-2.27.1 tokenizers-0.12.1 tqdm-4.64.0 transformers-4.19.2 typing-extensions-4.2.0 urllib3-1.26.9
done
#
# To activate this environment, use
#
#     $ conda activate transformers
#
# To deactivate an active environment, use
#
#     $ conda deactivate

--------------------------------------------

(base) C:\Users\ash\Desktop>conda activate transformers

(transformers) C:\Users\ash\Desktop>pip install ipykernel jupyter

(transformers) C:\Users\ash\Desktop>python -m ipykernel install --user --name transformers
Installed kernelspec transformers in C:\Users\ash\AppData\Roaming\jupyter\kernels\transformers

--------------------------------------------

TESTING IN PYTHON:

>>> import transformers as ppb
None of PyTorch, TensorFlow >= 2.0, or Flax have been found. Models won't be available and only tokenizers, configuration and file/data utilities can be used.

--------------------------------------

(transformers) C:\Users\ash>conda install -c conda-forge tensorflow
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Solving environment: Found conflicts! Looking for incompatible packages. This can take several minutes. Press CTRL-C to abort.
failed

UnsatisfiableError: The following specifications were found to be incompatible with the existing python installation in your environment:

Specifications:
  - tensorflow -> python[version='3.5.*|3.6.*|>=3.5,<3.6.0a0|>=3.6,<3.7.0a0|>=3.7,<3.8.0a0|3.8.*|3.7.*|3.9.*']

Your python: python=3.10

If python is on the left-most side of the chain, that's the version you've asked for. When python appears to the right, that indicates that the thing on the left is somehow not available for the python version you are constrained to. Note that conda will not change your python version to a different minor version unless you explicitly specify that.

-------------------------------------

TRIAL 2: Success
$ conda env remove -n transformers --all

ENV.YML:

name: transformers
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pandas
  - pip:
    - transformers
    - tensorflow

ALTERNATIVE (NOT TRIED) ENV.YML FILE:

name: transformers
channels:
  - conda-forge
dependencies:
  - python=3.9
  - pip
  - pandas
  - openpyxl
  - ipykernel
  - jupyter
  - tensorflow
  - pip:
    - transformers

LOGS:

(base) C:\Users\ash\Desktop>conda env create -f env.yml
Collecting package metadata (repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.13.0

Please update conda by running

    $ conda update -n base -c defaults conda

Downloading and Extracting Packages
setuptools-62.3.2 | 1.4 MB | #### | 100%
python-3.9.13 | 17.9 MB | #### | 100%
python_abi-3.9 | 4 KB | #### | 100%
pandas-1.4.2 | 11.0 MB | #### | 100%
numpy-1.22.4 | 6.1 MB | #### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Installing pip dependencies: /
Ran pip subprocess with arguments: ['C:\\Users\\ash\\Anaconda3\\envs\\transformers\\python.exe', '-m', 'pip', 'install', '-U', '-r', 'C:\\Users\\ash\\Desktop\\condaenv.m0blf3oh.requirements.txt']
Pip subprocess output:
[pip download progress omitted; transformers-4.19.2, tensorflow-2.9.1, tokenizers-0.12.1, huggingface_hub-0.7.0, keras-2.9.0, tensorboard-2.9.0 and their dependencies were collected, with numpy, six, setuptools, and wheel already satisfied from the conda layer]
Building wheels for collected packages: termcolor
  Building wheel for termcolor (setup.py): started
  Building wheel for termcolor (setup.py): finished with status 'done'
  Created wheel for termcolor: filename=termcolor-1.1.0-py3-none-any.whl size=4832 sha256=34e6470d92e16cedf1b846cf239d01ce6c05ddff3b0ec5437ceff54ea7de2d15
  Stored in directory: c:\users\ash\appdata\local\pip\cache\wheels\b6\0d\90\0d1bbd99855f99cb2f6c2e5ff96f8023fad8ec367695f7d72d
Successfully built termcolor
Installing collected packages: tokenizers, termcolor, tensorboard-plugin-wit, pyasn1, libclang, keras, flatbuffers, zipp, wrapt, werkzeug, urllib3, typing-extensions, tensorflow-io-gcs-filesystem, tensorflow-estimator, tensorboard-data-server, rsa, regex, pyyaml, pyparsing, pyasn1-modules, protobuf, opt-einsum, oauthlib, keras-preprocessing, idna, h5py, grpcio, google-pasta, gast, filelock, colorama, charset-normalizer, certifi, cachetools, astunparse, absl-py, tqdm, requests, packaging, importlib-metadata, google-auth, requests-oauthlib, markdown, huggingface-hub, transformers, google-auth-oauthlib, tensorboard, tensorflow
Successfully installed absl-py-1.0.0 astunparse-1.6.3 cachetools-5.2.0 certifi-2022.5.18.1 charset-normalizer-2.0.12 colorama-0.4.4 filelock-3.7.0 flatbuffers-1.12 gast-0.4.0 google-auth-2.6.6 google-auth-oauthlib-0.4.6 google-pasta-0.2.0 grpcio-1.46.3 h5py-3.7.0 huggingface-hub-0.7.0 idna-3.3 importlib-metadata-4.11.4 keras-2.9.0 keras-preprocessing-1.1.2 libclang-14.0.1 markdown-3.3.7 oauthlib-3.2.0 opt-einsum-3.3.0 packaging-21.3 protobuf-3.19.4 pyasn1-0.4.8 pyasn1-modules-0.2.8 pyparsing-3.0.9 pyyaml-6.0 regex-2022.4.24 requests-2.27.1 requests-oauthlib-1.3.1 rsa-4.8 tensorboard-2.9.0 tensorboard-data-server-0.6.1 tensorboard-plugin-wit-1.8.1 tensorflow-2.9.1 tensorflow-estimator-2.9.0 tensorflow-io-gcs-filesystem-0.26.0 termcolor-1.1.0 tokenizers-0.12.1 tqdm-4.64.0 transformers-4.19.2 typing-extensions-4.2.0 urllib3-1.26.9 werkzeug-2.1.2 wrapt-1.14.1 zipp-3.8.0
done
#
# To activate this environment, use
#
#     $ conda activate transformers
#
# To deactivate an active environment, use
#
#     $ conda deactivate

(base) C:\Users\ash\Desktop>conda activate transformers

(transformers) C:\Users\ash\Desktop>conda install -c conda-forge jupyter ipykernel

(transformers) C:\Users\ash\Desktop>python -m ipykernel install --user --name transformers
Installed kernelspec transformers in C:\Users\ash\AppData\Roaming\jupyter\kernels\transformers

TESTING LOGS
import warnings
warnings.filterwarnings('ignore')

print(ppb.__version__) # 4.19.2

model_class, tokenizer_class, pretrained_weights = (ppb.BertModel, ppb.BertTokenizer, 'bert-base-uncased')
tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
model = model_class.from_pretrained(pretrained_weights)

OUTPUT:
Downloading: 100% 226k/226k [00:01<00:00, 253kB/s]
Downloading: 100% 28.0/28.0 [00:00<00:00, 921B/s]
Downloading: 100% 570/570 [00:00<00:00, 14.5kB/s]
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
Input In [9], in <cell line: 2>()
      1 tokenizer = tokenizer_class.from_pretrained(pretrained_weights)
----> 2 model = model_class.from_pretrained(pretrained_weights)

File ~\Anaconda3\envs\transformers\lib\site-packages\transformers\utils\import_utils.py:788, in DummyObject.__getattr__(cls, key)
    786 if key.startswith("_"):
    787     return super().__getattr__(cls, key)
--> 788 requires_backends(cls, cls._backends)

File ~\Anaconda3\envs\transformers\lib\site-packages\transformers\utils\import_utils.py:776, in requires_backends(obj, backends)
    774 failed = [msg.format(name) for available, msg in checks if not available()]
    775 if failed:
--> 776     raise ImportError("".join(failed))

ImportError: BertModel requires the PyTorch library but it was not found in your environment. Checkout the instructions on the installation page: https://pytorch.org/get-started/locally/ and follow the ones that match your environment.

FIX:

(transformers) C:\Users\ash>conda install -c pytorch pytorch
Collecting package metadata (current_repodata.json): done
Solving environment: done

==> WARNING: A newer version of conda exists. <==
  current version: 4.12.0
  latest version: 4.13.0

Please update conda by running

    $ conda update -n base -c defaults conda

## Package Plan ##

  environment location: C:\Users\ash\Anaconda3\envs\transformers

  added / updated specs:
    - pytorch

The following packages will be downloaded:

    package                |            build
    -----------------------|-------------------------
    cudatoolkit-11.3.1     | h59b6b97_2           545.3 MB
    libuv-1.40.0           | he774522_0             255 KB
    openssl-1.1.1o         | h2bbff1b_0             4.8 MB
    pytorch-1.11.0         | py3.9_cuda11.3_cudnn8_0   1.23 GB  pytorch
    pytorch-mutex-1.0      | cuda                     3 KB  pytorch
    ------------------------------------------------------------
                                          Total:     1.77 GB

The following NEW packages will be INSTALLED:

  blas               pkgs/main/win-64::blas-1.0-mkl
  cudatoolkit        pkgs/main/win-64::cudatoolkit-11.3.1-h59b6b97_2
  libuv              pkgs/main/win-64::libuv-1.40.0-he774522_0
  pytorch            pytorch/win-64::pytorch-1.11.0-py3.9_cuda11.3_cudnn8_0
  pytorch-mutex      pytorch/noarch::pytorch-mutex-1.0-cuda
  typing_extensions  pkgs/main/noarch::typing_extensions-4.1.1-pyh06a4308_0

The following packages will be SUPERSEDED by a higher-priority channel:

  openssl            conda-forge::openssl-1.1.1o-h8ffe710_0 --> pkgs/main::openssl-1.1.1o-h2bbff1b_0

Proceed ([y]/n)? y

Downloading and Extracting Packages
libuv-1.40.0 | 255 KB | #### | 100%
openssl-1.1.1o | 4.8 MB | #### | 100%
pytorch-mutex-1.0 | 3 KB | #### | 100%
cudatoolkit-11.3.1 | 545.3 MB | #### | 100%
pytorch-1.11.0 | 1.23 GB | #### | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done

(transformers) C:\Users\ash>
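With PyTorch in place, the cell that raised the ImportError should now run. A minimal re-check, not part of the original logs, assuming the same environment and model name:

# Re-run the failing step after installing PyTorch.
import torch
import transformers as ppb

tokenizer = ppb.BertTokenizer.from_pretrained('bert-base-uncased')
model = ppb.BertModel.from_pretrained('bert-base-uncased')
print("torch:", torch.__version__)        # 1.11.0 per the install log above
print("loaded:", model.__class__.__name__)  # BertModel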
Sunday, May 29, 2022
Add vectors - magnitude & direction to component
Ques 1 and Ans 1 through Ques 4 and Ans 4: [the four worked question-and-answer pairs in the original post are images and are not reproduced here]

Tags: Mathematical Foundations for Data Science
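Since the worked examples above were images in the original post, here is a minimal sketch, not from the post itself, of the method the title refers to: resolve each vector from magnitude-and-direction form into components, add componentwise, then convert back to magnitude and direction.

# Minimal sketch: add two vectors given as (magnitude, direction in degrees).
import math

def to_components(magnitude, angle_deg):
    theta = math.radians(angle_deg)
    return magnitude * math.cos(theta), magnitude * math.sin(theta)

def add_vectors(v1, v2):
    x1, y1 = to_components(*v1)
    x2, y2 = to_components(*v2)
    x, y = x1 + x2, y1 + y2
    magnitude = math.hypot(x, y)
    angle_deg = math.degrees(math.atan2(y, x))
    return (x, y), (magnitude, angle_deg)

# Example: a 3-unit vector at 0 degrees plus a 4-unit vector at 90 degrees.
components, polar = add_vectors((3, 0), (4, 90))
print(components)  # approximately (3.0, 4.0)
print(polar)       # approximately (5.0, 53.13 degrees)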