Tuesday, October 17, 2017



Where’s the Curiosity?

Curiosity - who, what, when, where and why
“To love the past may easily be an expression of the nostalgic romanticism of old men and old societies, a symptom of loss of faith and interest in the present or future.”
— E. H. Carr, What is History?

When I walk into a college classroom these days it is as quiet as libraries used to be. Every head is bowed and every thumb is scrolling away on a screen. There are few, if any, conversations between students. No one looks up until I take attendance. When the class is over the students depart as silently as they came. Even if they are drifting in the same direction, they rarely talk to one another. It’s almost impossible to catch the eye of a student walking toward me and, if I break the spell with a cheery “Hello!”, there is a startle reflex that reveals the depth of the self-isolation.

In one of his most famous essays, Michel de Montaigne says, “To my taste the most fruitful and most natural exercise of our minds is conversation. I find the practice of it the most delightful activity in our lives.” He likens a good conversation to vigorous sparring, his “strong and solid opponent will attack me on the flanks, stick his lance in me right and left; his ideas send mine soaring.” There is joy in the friction of ideas, a closeness made possible by an unspoken willingness to play up and higher than one’s own level of thought. “In conversation,” says Montaigne, “the most painful quality is perfect harmony.”

Montaigne’s essay, “The Art of Conversation,” is a primer for civilized and vigorous conversation. It should be read and reread, taught at the high school level, and carried about in one’s pocket. I would be delighted if it were read into the Congressional Record, but its good-natured guidelines would be obliterated or simply ignored. We don’t know how to talk with one another about things that matter.

“You cannot teach a man that which he thinks he already knows,” said Epictetus, a Stoic philosopher who taught humility as a prerequisite to learning. Epictetus (55-135 A.D.) was striking at the arrogance that prevents us from even considering unfamiliar ideas. Today his words would be shrugged off: we make no claim to know but we know where to look it up—and that’s the end of it. No need to look elsewhere; the goal of education has become to simply tag what comes up in response to a query. Decades of teaching to the test and demanding quantifiable results have taught our students to take the quickest route to the first answer that presents itself on Google.

Montaigne, who positively reveled in being contradicted by another because “he is instructing me,” wouldn’t have known what to do with humility in conversation. “My thought so often contradicts and condemns itself that it is all one to me if someone else does so, seeing that I give to his refutation only such authority as I please.” Montaigne didn’t need to give up anything in order to learn from another, especially not pride, because what mattered to him in conversation was the diligence in finding the truth, no matter where it came from.

If Montaigne were here today, I would take issue with him about conversation and communication. Especially as it concerns conversation in schools and universities, I would ask him to look farther back and deeper down to the seed of learning, which is curiosity. Actually, come to think of it, Epictetus’ call for humility first could be set aside if only the student had curiosity.

In learning through curiosity, there is no need to curb one’s arrogance first. If we are truly curious, then arrogance plays no part in thwarting us in the pursuit of truth. Curiosity is the engine of learning: without it we wouldn’t be clothed, sitting at our computers, able to tap into the world’s knowledge at the touch of a screen. But finding the facts is only the beginning for the curious learner. Curiosity isn’t satisfied with just the facts, but hungers for meaning, context, and application beyond the immediate problem. Curiosity is the air that the sciences breathe, but it’s the heart that beats within the humanities.

I suspect that what generates even the briefest conversation among strangers is curiosity about what another is experiencing and feeling and thinking. Surely, it’s curiosity that could break through our ear-budded walls and lure us into conversation about . . . anything. Within the classroom, it’s curiosity that would go beyond the facts to ask not only how and where, but when—and most importantly—why. I do not see that kind of curiosity very often.

Recently, as I left a class in which the legacy of Martin Luther King, Jr., and the life’s work of one of his mentors, Myles Horton, was the subject of discussion, and in which only one student commented, asked questions, and generally showed the fire of curiosity, I decided I was not angry or frustrated or wounded or even nostalgic. I was wistful for the leap and thrust and joyful friction of real conversation.

Barry Casey teaches philosophy and ethics at Trinity Washington University in Washington, D.C. He also teaches public speaking at Stevenson University in Maryland.

Monday, October 16, 2017

Andragogy vs. Pedagogy

 Selected Moments of the 20th Century
A work in progress edited by Daniel Schugurensky Department of Adult Education, Community Development and Counselling Psychology, The Ontario Institute for Studies in Education of the University of Toronto (OISE/UT)

 Malcolm Knowles publishes The Modern Practice of Adult Education: Andragogy vs. Pedagogy

Although the term andragogy has been in use since 1833, it was Malcolm S. Knowles who, in his 1970 book The Modern Practice of Adult Education: Andragogy vs. Pedagogy, first popularized the term in North America and organized the concepts into a comprehensive theory. Knowles devised a set of four assumptions that differentiated adults from children as learners. These assumptions necessitated an entirely new approach to teaching adults and were hailed as groundbreaking in the field of adult education at the time. The four assumptions are self-concept, experience, readiness to learn and orientation to learning (Knowles, 1970, p.39).

This essay will examine how different or new Knowles' theory of andragogy was compared with the ideas of progressive education theorists such as John Dewey, Maria Montessori and Alexander Neill. The progressive education theorists were influential in the early part of the twentieth century, long before Knowles, but their ideas centered on children rather than adults. Comparing and contrasting Knowles' four main andragogical assumptions with the ideas of Dewey, Montessori and Neill reveals the extent to which andragogy contributed new approaches to the field of adult education. An in-depth comparison also reveals the extent to which Knowles' theory is specific to adults. In the end, Knowles' ideas did not offer anything new to the field of adult education, nor did they contribute useful information about how adults learn differently than children.

Knowles is often closely associated with andragogy, but in fact the term originated much earlier. Alexander Kapp, a German grammar school teacher, first introduced it in Germany in 1833, but it referred broadly to adults as learners rather than to any specific style of learning (Draper, 1998). Pedagogy was the common term used for teaching at the time.
Pedagogy means "the science or art of teaching"; a direct translation of the Greek root "peda" means "boy" or "child" (Selman et al., 1998, p.19). Andragogy, as a term, was developed in reaction to pedagogy in order to move away from its close association with children. Andragogy means "the science or art of teaching adults" because the Greek root "andra" means adults (p.19). Knowles, however, was the first to develop the ideas of andragogy into a detailed strategy for teaching adults and to popularize the term in North America (Draper, 1998). Knowles felt that pedagogy was "based on a now obsolete premise - that is, the idea that the purpose of education is to transmit culture" (Draper, 1998, p.14).

Knowles believed that an andragogical approach to teaching adults was vitally important in order to take the adult's learning needs into account and to "teach adults how to learn" (Knowles, 1970, p.39). The transmissional modes of pedagogy were not considered sufficient to enable adults to deal with the rapid change going on in our society. The fact that the "timespan of major cultural change" now turns over more than once in a lifetime called for a new and more successful approach to adult learning (p.39). Knowles felt this development necessitated a new approach to learning based on a more accurate set of assumptions about adults as learners.

The four assumptions Knowles uses to distinguish adults from children as learners produced a split between andragogy and pedagogy. The self-concept principle reflects the self-directing character of the adult learner rather than the dependent nature of the child. The principle of experience simply acknowledges the need to draw on the adult's rich store of experience; in contrast, the pedagogical framework perceives the child as not possessing sufficient life experience to incorporate effectively into the learning environment. Readiness to learn indicates that adults differ from children in their developmental stage and as a result have special learning needs. This assumption implies that adult learning needs tend to focus on their social roles, whereas pedagogy claims that the learning needs of children are geared towards physiological and mental developmental stages. Orientation to learning assumes that adults put more value on being able to apply their learning practically, while pedagogy suggests that children naturally postpone immediate application in favor of future needs. Together, these four assumptions establish significant guidelines for creating adult learning environments (Knowles, 1970, p.39).
The concept that learning can be made more effective by taking advantage of the innate ability of learners to be self-directed did not originate with Knowles' theory of andragogy in 1970. Montessori, Dewey and Neill had all proclaimed the advantages of recognizing the self-directing capacity of learners, and they focused their attention on children. Their arguments seriously undermine one of the key assumptions of andragogy: that adults are more self-directed than children.

Montessori felt that learners were predominantly self-motivated and that they generally learned by themselves (Montessori, 1955). She also acknowledged that the teacher plays only a small part in the learner's overall environment (Montessori, 1955). Dewey felt it was important that the educator not impose a direction on the student but act as more of a guide and an assistant (Dewey, The Child and the Curriculum, 1902). Montessori argued that, because of old prejudices, adults fail to recognize the self-directing capacity of children (Montessori, 1955); the oppressive and controlling habits of traditional education prevent adults from observing the child as an individual. Neill also believed strongly in the self-directing capacity of the child. In his experimental Summerhill School in the 1920s, Neill went so far as to allow children to choose whether to attend classes at all (Neill, 1962).

The progressive education theorists thus believed that both children and adults are innately self-directing, and that it is our societal prejudices that prevent us from tapping into the self-directing nature of the child. Likewise, the perception that a learner's experience should be made a central part of the learning process did not derive from Knowles. The progressive education theorists were far ahead of Knowles in developing thoughts on recognizing the learner's experience, and their work supports the claim that the child's experience is as valuable as the adult's.
John Dewey, in particular, was of the opinion that for learning to be effective it needs to be based on the learner's experience (Dewey, 1902). Dewey maintained that learning is merely symbolic if it bears no relation to a person's experience. He criticized the tendency of "old education" to ignore the child's experience and to regard the child as immature in comparison to the adult (Dewey, 1902, p.12). Montessori's views support Dewey's: she argued that "we form a barrier not paying attention to the inner power of the child or what they can teach us" (Montessori, 1955, p.47).

A comparison of Knowles' third assumption with the ideas of the progressive education theorists makes it clear that Knowles did not come up with anything original here either. The progressive education theorists do not focus on the concept of readiness to learn in the same manner as Knowles, but an overview of their writings reveals the similarity of their thoughts on the subject. In Neill's Summerhill School, for instance, some classes were organized according to age, but others according to interests (Neill, 1962). It is logical to agree with Knowles that adults differ from children in their developmental stage, but this does not necessarily entail that adults require a different approach to teaching.

The concept that learners learn best when they are able to apply their learning also did not begin with Knowles. Montessori and Dewey advocated the need for practical application of learning long before Knowles, and again focused their thoughts on children as learners. Dewey felt that "the divorce between learning and its use is the most serious defect of our existing education. Without the consciousness of application, learning has no motive to the child" (Dewey, 1966a, p.73). Naturally, the same principle can be extended to all learners regardless of age. Neill, too, claimed that education should involve real doing (Neill, 1962).
Dewey strongly believed that the educational system needed to better represent life. He also criticized the preoccupation with subject-centered learning in "old education," feeling that the constant state of change in our society rendered much of it obsolete (Dewey, 1966b). Curiously, his ideas about preparing people to deal more effectively with a continuously changing world are similar to Knowles' notion of "helping people to grow in their ability to learn" (Knowles, 1970, p.34). Dewey thought that the prevalent belief that adults are more suited to practical application than children reflects the assumptions of the "old education" system more than the actual needs of learners. According to Dewey, the child learner would be perfectly content to apply their learning to specific problems (Dewey, 1966b).

We can conclude that age is no longer an adequate means of distinguishing andragogical from pedagogical assumptions. Many theorists, including Knowles in his later writings, argue that new definitions are needed to erase some of the confusion in the debate. The most common position, now held by Knowles among others, is to portray andragogy as self-directed learning and pedagogy as teacher-directed learning (Rachal, 1983). This new and more accurate distinction brings us right back to the progressive education theorists, who first distinguished between progressive and traditional education: progressive education represented self-directed learning, and traditional education represented teacher-directed learning. We cannot ignore the impact of Knowles' theory on the field of education, but we can benefit from setting aside the confusion it created and focusing on the ideas put forth by the progressive education theorists. UNESCO echoes a similar sentiment and now discourages use of the term andragogy (Draper, 1998).
Comparing Knowles' andragogy with the ideas of the progressive education theorists also makes it clear that the main assumptions of andragogy are not specific to adults and that andragogy as a theory explaining adult learning did not contribute anything new to the field of adult education. 


Dewey, John. (1902). The Child and the Curriculum. Chicago: University of Chicago Press.
Dewey, John. (1966a). "The Dewey School." Dewey's Educational Writings. Ed. by F.W. Garforth. London: Heinemann.
Dewey, John. (1966b). "My Pedagogic Creed". Dewey's Educational Writings. Ed. by F.W. Garforth. London: Heinemann.
Draper, James. (1998). "The Metamorphoses of Andragogy". Canadian Journal of Studies in Adult Education. Vol. 12 (1), May.
Knowles, Malcolm S. (1970). The Modern Practice of Adult Education: Andragogy Versus Pedagogy. New York: Association Press.
Knowles, Malcolm S. (1980). The Modern Practice of Adult Education. Revised and Updated. Englewood Cliffs: Prentice Hall Regents.
Montessori, Maria. (1955). Childhood Education. Translated by Joosten A.M., Chicago: Henry Regnery Co.
Neill, Alexander. (1962). Summerhill. Middlesex: Penguin Books.
Rachal, John. (1983). "The Andragogy - Pedagogy Debate: Another Voice in the Fray". Lifelong Learning: The Adult Years. Vol.6 (9), May.
Prepared by Jay Friedman (OISE/UT)

Sunday, October 15, 2017

Reflective practice

Webinar: Creating a Culture of Reflective Practice
Join Pete Hall and Alisa Simeral, authors of Creating a Culture of Reflective Practice: Capacity-Building for Schoolwide Success, as they draw on lessons learned from educators around the United States and present a definitive guide to developing a schoolwide culture that both values reflection and uses it to ensure that teachers—and their students—reach their full potential.
Creating a Culture of Reflective Practice
Tuesday, October 24, 2017, 3:00 p.m. Eastern time

Presented by
Pete Hall and Alisa Simeral

Register Now

Saturday, September 30, 2017


Making the Most of Your Writing Feedback

Giving high-quality feedback on student writing can be a challenge, but these strategies help maximize its impact on your students.

Providing feedback on student writing is one of the most important, most challenging aspects of a teacher’s job. It’s important because feedback is critical to student learning; it’s challenging because of time constraints and the number of students at varying levels in our classes.

Making sure your feedback is specific, ongoing, action-oriented, and reasonable—the SOAR method, a strategy I developed—helps maximize its impact on your students.


Specific

Feedback is often lost on students because it’s too vague. Comments like “great job,” “good writing,” or even “needs better organization” fall flat with students because they’re not tied to specific words, phrases, sentences, or paragraphs in the writing.

Feedback falls into two categories: the what and the how. The what of writing deals with content. Specific feedback here includes guiding students if they need more evidence, stronger claims, or further analysis. If they’re writing fiction, you might suggest adding more dialogue for character development or further detail to establish setting.

The how is the writing itself. Specific feedback here can include comments concerning the organization of the information, rhetorical strategies, style, voice, and conventions.

Teachers who teach writing across the curriculum and feel uncomfortable assessing the how can focus their feedback on the what since that is the part of the writing they’ll feel most comfortable assessing.


Ongoing

Fortunately, teachers are slowly breaking away from grading only final products and are offering feedback throughout the different stages of writing. The biggest hindrance to ongoing feedback is time, but narrowing the focus of the feedback can help you meet this challenge.

For example, a science teacher may choose to focus on one particular section of a lab report each time; an English language arts teacher may choose to focus on one particular stage of writing, shifting from one stage to another throughout the course of the year.

Students can take the lead by asking for feedback on a Google doc throughout the writing process. Kaizena allows teachers to leave voice notes on a student paper, making it easy to check in and comment on work during different stages in the process.

Stations with self-guiding questions for reflection can be a great way to allow students to move through the writing process at their own pace, and the teacher can rotate through the stations, addressing small groups of students instead of the whole class.


Action-Oriented

Many teachers fall into the trap of editing student writing by focusing on marking grammar mistakes instead of offering feedback that moves students forward in their writing. Helping students take specific steps is key to building a growth mindset in writing—students must see that the action taken can benefit their future writing and not just correct a mistake in the current paper.

Conferring is key when offering action-oriented feedback. I recently sat in the hall conferring with students about college essays. This allowed me time to say things like, “Notice in this paragraph how you begin seven sentences with ‘I’ followed by an action verb. How can you vary some of these sentences so they don’t all sound the same? The content is good, but let’s work on sentence variety.” The student would then offer a suggestion on how to start a sentence differently, and we would further discuss it. The student not only improves this piece but will be more likely to carry these ideas to future writing because the feedback results in an action step.

Teachers are not the only ones who can provide action-oriented feedback—peer editing with specific action-oriented writing suggestions can also move your writers forward.


Reasonable

When a student receives a paper with markings all over it, they can become overwhelmed and discouraged, which often prevents them from taking proper steps toward revision or toward growth as a writer in general. This can be remedied in several ways.

Select a focus for feedback with each assignment. Sometimes I’ll tell students, “I’ll be offering feedback on transitions in this paper,” or “The focus of feedback for this writing is on the amount of evidence.” I try to give feedback on both the what and the how of writing. This is especially strategic if I’ve given a mini-lesson on an area that I want students to focus on in their writing.

Another successful strategy is giving a Glow and a Grow comment: highlighting one specific area that a student did particularly well on and one that needs improvement. Glow and Grow comments both celebrate and challenge student writing. Students refer to past Glow and Grow comments and goals before future writing assignments so they can be reminded of where they are strong, in order to continue doing those things well, and aware of areas for growth, in order to push themselves in those areas.

Feedback is more than a grade and should be one of the driving factors in helping students set learning goals and take charge of their own writing and learning. Help students SOAR to success in learning with quality feedback.

Hospital of the Future

The hospital of the future will find you


Digital disruption is the new normal. We use video conferencing in global business meetings, summon car rides through our phones using GPS, and snap or tweet our lives to friends and followers.

For its part, healthcare is breaking down traditional hospital walls, and it’s not just the developed world leading this disruption. Indeed, the healthcare model for billions of people in the developing world has always been different. Lacking the massive and complicated hospital infrastructure of other regions, medical care in many parts of the world travels to the patient in the form of a visit from a local doctor or a stop at a rural clinic.

This “last mile of care” – where the hospital finds the patient, not the other way around – is made possible as medical innovation across the globe becomes increasingly mobile, digital, personal and accessible.

Healthcare solutions are becoming more digitally connected, affordable and convenient for both the patient and caregiver. A Journal of Hospital Librarianship study found that 85 percent of health care providers were already using smartphones and/or tablets in their daily work. One-third of health information exchange data is already in the cloud, according to a white paper from Cisco.

Digital health, in other words, is here. Data from remote monitoring devices, such as smart scales and blood pressure cuffs, are being transmitted to doctors around the world to improve patient outcomes. In remote areas across Latin America, cloud technology allows doctors to share ultrasound images with their patients and distant colleagues with the simple click of a button. Similarly, pocket-sized ultrasound technology is helping midwives in Africa determine if expectant mothers can deliver babies safely or need to go to the nearest hospital.

Big data, analytics and artificial intelligence enable health care to be more personalized and precise – a fact with which patients appear increasingly comfortable. Virtual assistants on our phones or kitchen counters are dispensing medical advice from WebMD, and a recent global PwC survey across 12 countries showed that nearly 40 percent of people trusted AI and robotics to administer a heart rhythm test and then make clinical recommendations. That hypothetical is already becoming a reality. A new algorithm server is helping medical professionals read patients’ ECGs remotely and AI is helping doctors diagnose lung cancer in China.

In emergency rooms and operating rooms across the world, machines are generating millions of data points, but only a small fraction are harvested and saved in hospitals’ electronic medical record systems. Gathering, analyzing and acting on this deluge of data is the next step. For instance, clinicians can now use cloud-based, algorithm-powered apps to pull hundreds of data points directly from anesthesia machines with every patient breath. These apps unlock actionable insights that can help clinicians with clinical, operational and economic improvements.

Mobile digital health is revolutionizing not just how and what care people get, but where they can receive it. Already, 70 percent of U.S. employers offer telehealth services, and a World Health Organization survey found that 87 percent of countries worldwide had at least one massive mobile health program underway.

While most acute care will continue to take place inside brick-and-mortar medical facilities, future generations will likely receive care virtually, and participate in their own care to greater degrees. For instance, subtle stick-on monitors that look like digital Band-Aids are being developed right now to help doctors remotely monitor key vital signs, from heart rate and blood pressure to sweat and oxygen levels.

Disruption is indeed the new normal for healthcare. By pairing new thinking with new technology in new clinical areas, we can make sure that future healthcare solutions are at once more personal, more digital and more globally connected. In a future where hospitals are everywhere and nowhere, and data is ubiquitous, care must and will come to the patient.

Bad news

Health professionals should use simple language and convey warmth

CBN Comunicação e Liderança – Leny Kyrillos

Doctors and nurses should use simple language with patients.

Especially when delivering bad news, it is important to make eye contact, turn your body toward the listener, show patience, and use a welcoming tone of voice.

Thursday, September 28, 2017


How People Learn to Become Resilient


Perception is key to resilience: Do you conceptualize an event as traumatic, or as a chance to learn and grow?

Norman Garmezy, a developmental psychologist and clinician at the University of Minnesota, met thousands of children in his four decades of research. But one boy in particular stuck with him. He was nine years old, with an alcoholic mother and an absent father. Each day, he would arrive at school with the exact same sandwich: two slices of bread with nothing in between. At home, there was no other food available, and no one to make any. Even so, Garmezy would later recall, the boy wanted to make sure that “no one would feel pity for him and no one would know the ineptitude of his mother.” Each day, without fail, he would walk in with a smile on his face and a “bread sandwich” tucked into his bag.

The boy with the bread sandwich was part of a special group of children. He belonged to a cohort of kids—the first of many—whom Garmezy would go on to identify as succeeding, even excelling, despite incredibly difficult circumstances. These were the children who exhibited a trait Garmezy would later identify as “resilience.” (He is widely credited with being the first to study the concept in an experimental setting.) Over many years, Garmezy would visit schools across the country, focussing on those in economically depressed areas, and follow a standard protocol. He would set up meetings with the principal, along with a school social worker or nurse, and pose the same question: Were there any children whose backgrounds had initially raised red flags—kids who seemed likely to become problem kids—who had instead become, surprisingly, a source of pride? “What I was saying was, ‘Can you identify stressed children who are making it here in your school?’ ” Garmezy said, in a 1999 interview. “There would be a long pause after my inquiry before the answer came. If I had said, ‘Do you have kids in this school who seem to be troubled?,’ there wouldn’t have been a moment’s delay. But to be asked about children who were adaptive and good citizens in the school and making it even though they had come out of very disturbed backgrounds—that was a new sort of inquiry. That’s the way we began.”

Resilience presents a challenge for psychologists. Whether you can be said to have it or not largely depends not on any particular psychological test but on the way your life unfolds. If you are lucky enough to never experience any sort of adversity, we won’t know how resilient you are. It’s only when you’re faced with obstacles, stress, and other environmental threats that resilience, or the lack of it, emerges: Do you succumb or do you surmount?

Environmental threats can come in various guises. Some are the result of low socioeconomic status and challenging home conditions. (Those are the threats studied in Garmezy’s work.) Often, such threats—parents with psychological or other problems; exposure to violence or poor treatment; being a child of problematic divorce—are chronic. Other threats are acute: experiencing or witnessing a traumatic violent encounter, for example, or being in an accident. What matters is the intensity and the duration of the stressor. In the case of acute stressors, the intensity is usually high. The stress resulting from chronic adversity, Garmezy wrote, might be lower—but it “exerts repeated and cumulative impact on resources and adaptation and persists for many months and typically considerably longer.”

Prior to Garmezy’s work on resilience, most research on trauma and negative life events had a reverse focus. Instead of looking at areas of strength, it looked at areas of vulnerability, investigating the experiences that make people susceptible to poor life outcomes (or that lead kids to be “troubled,” as Garmezy put it). Garmezy’s work opened the door to the study of protective factors: the elements of an individual’s background or personality that could enable success despite the challenges they faced. Garmezy retired from research before reaching any definitive conclusions—his career was cut short by early-onset Alzheimer’s—but his students and followers were able to identify elements that fell into two groups: individual, psychological factors and external, environmental factors, or disposition on the one hand and luck on the other.

In 1989 a developmental psychologist named Emmy Werner published the results of a thirty-two-year longitudinal project. She had followed a group of six hundred and ninety-eight children, in Kauai, Hawaii, from before birth through their third decade of life. Along the way, she’d monitored them for any exposure to stress: maternal stress in utero, poverty, problems in the family, and so on. Two-thirds of the children came from backgrounds that were, essentially, stable, successful, and happy; the other third qualified as “at risk.” Like Garmezy, she soon discovered that not all of the at-risk children reacted to stress in the same way. Two-thirds of them “developed serious learning or behavior problems by the age of ten, or had delinquency records, mental health problems, or teen-age pregnancies by the age of eighteen.” But the remaining third developed into “competent, confident, and caring young adults.” They had attained academic, domestic, and social success—and they were always ready to capitalize on new opportunities that arose.

What was it that set the resilient children apart? Because the individuals in her sample had been followed and tested consistently for three decades, Werner had a trove of data at her disposal. She found that several elements predicted resilience. Some elements had to do with luck: a resilient child might have a strong bond with a supportive caregiver, parent, teacher, or other mentor-like figure. But another, quite large set of elements was psychological, and had to do with how the children responded to the environment. From a young age, resilient children tended to “meet the world on their own terms.” They were autonomous and independent, would seek out new experiences, and had a “positive social orientation.” “Though not especially gifted, these children used whatever skills they had effectively,” Werner wrote. Perhaps most importantly, the resilient children had what psychologists call an “internal locus of control”: they believed that they, and not their circumstances, affected their achievements. The resilient children saw themselves as the orchestrators of their own fates. In fact, on a scale that measured locus of control, they scored more than two standard deviations away from the standardization group.

Werner also discovered that resilience could change over time. Some resilient children were especially unlucky: they experienced multiple strong stressors at vulnerable points and their resilience evaporated. Resilience, she explained, is like a constant calculation: Which side of the equation weighs more, the resilience or the stressors? The stressors can become so intense that resilience is overwhelmed. Most people, in short, have a breaking point. On the flip side, some people who weren’t resilient when they were little somehow learned the skills of resilience. They were able to overcome adversity later in life and went on to flourish as much as those who’d been resilient the whole way through. This, of course, raises the question of how resilience might be learned.

George Bonanno is a clinical psychologist at Columbia University’s Teachers College; he heads the Loss, Trauma, and Emotion Lab and has been studying resilience for nearly twenty-five years. Garmezy, Werner, and others have shown that some people are far better than others at dealing with adversity; Bonanno has been trying to figure out where that variation might come from. Bonanno’s theory of resilience starts with an observation: all of us possess the same fundamental stress-response system, which has evolved over millions of years and which we share with other animals. The vast majority of people are pretty good at using that system to deal with stress. When it comes to resilience, the question is: Why do some people use the system so much more frequently or effectively than others?

One of the central elements of resilience, Bonanno has found, is perception: Do you conceptualize an event as traumatic, or as an opportunity to learn and grow? “Events are not traumatic until we experience them as traumatic,” Bonanno told me, in December. “To call something a ‘traumatic event’ belies that fact.” He has coined a different term: PTE, or potentially traumatic event, which he argues is more accurate. The theory is straightforward. Every frightening event, no matter how negative it might seem from the sidelines, has the potential to be traumatic or not to the person experiencing it. (Bonanno focusses on acute negative events, where we may be seriously harmed; others who study resilience, including Garmezy and Werner, look more broadly.) Take something as terrible as the surprising death of a close friend: you might be sad, but if you can find a way to construe that event as filled with meaning—perhaps it leads to greater awareness of a certain disease, say, or to closer ties with the community—then it may not be seen as a trauma. (Indeed, Werner found that resilient individuals were far more likely to report having sources of spiritual and religious support than those who weren’t.) The experience isn’t inherent in the event; it resides in the event’s psychological construal.

It’s for this reason, Bonanno told me, that “stressful” or “traumatic” events in and of themselves don’t have much predictive power when it comes to life outcomes. “The prospective epidemiological data shows that exposure to potentially traumatic events does not predict later functioning,” he said. “It’s only predictive if there’s a negative response.” In other words, living through adversity, be it endemic to your environment or an acute negative event, doesn’t guarantee that you’ll suffer going forward. What matters is whether that adversity becomes traumatizing.

The good news is that positive construal can be taught. “We can make ourselves more or less vulnerable by how we think about things,” Bonanno said. In research at Columbia, the neuroscientist Kevin Ochsner has shown that teaching people to think of stimuli in different ways—to reframe them in positive terms when the initial response is negative, or in a less emotional way when the initial response is emotionally “hot”—changes how they experience and react to the stimulus. You can train people to better regulate their emotions, and the training seems to have lasting effects.

Similar work has been done with explanatory styles—the techniques we use to explain events. I’ve written before about the research of Martin Seligman, the University of Pennsylvania psychologist who pioneered much of the field of positive psychology: Seligman found that training people to change their explanatory styles from internal to external (“Bad events aren’t my fault”), from global to specific (“This is one narrow thing rather than a massive indication that something is wrong with my life”), and from permanent to impermanent (“I can change the situation, rather than assuming it’s fixed”) made them more psychologically successful and less prone to depression. The same goes for locus of control: not only is a more internal locus tied to perceiving less stress and performing better but changing your locus from external to internal leads to positive changes in both psychological well-being and objective work performance. The cognitive skills that underpin resilience, then, seem like they can indeed be learned over time, creating resilience where there was none.

Unfortunately, the opposite may also be true. “We can become less resilient, or less likely to be resilient,” Bonanno says. “We can create or exaggerate stressors very easily in our own minds. That’s the danger of the human condition.” Human beings are capable of worry and rumination: we can take a minor thing, blow it up in our heads, run through it over and over, and drive ourselves crazy until we feel like that minor thing is the biggest thing that ever happened. In a sense, it’s a self-fulfilling prophecy. Frame adversity as a challenge, and you become more flexible and able to deal with it, move on, learn from it, and grow. Focus on it, frame it as a threat, and a potentially traumatic event becomes an enduring problem; you become more inflexible, and more likely to be negatively affected.

In December the New York Times Magazine published an essay called “The Profound Emptiness of ‘Resilience.’ ” It pointed out that the word is now used everywhere, often in ways that drain it of meaning and link it to vague concepts like “character.” But resilience doesn’t have to be an empty or vague concept. In fact, decades of research have revealed a lot about how it works. This research shows that resilience is, ultimately, a set of skills that can be taught. In recent years, we’ve taken to using the term sloppily—but our sloppy usage doesn’t mean that it hasn’t been usefully and precisely defined. It’s time we invested the energy to understand what “resilience” really means.