You Don't Need a Weatherman
(To Know Which Way the Wind Blows)
Lately when I wake up in the morning, within minutes I’m thinking about the latest political news. Actually, thoughts of politics have taken an ever-increasing amount of my time and energy over the past ten years. Politics have gone from being, well, politics, to being something far more personal. As it turns out, they’ve always been personal; I just haven’t always paid attention to that fact.
While I was growing up in the 1960s, the U.S. military fed a steady supply of young men into the meat grinder of the Vietnam War. At age seven, I was only vaguely aware of the war, or even of what war meant. In early 1963, when I was in the second grade, the father of a boy in my class was among the first American soldiers to die in Vietnam. When I saw that boy and his twin brother at recess every day, I felt sad for them, especially because my own father had died a few years earlier. Beyond that, however, I thought little about the war or its devastation, at least not until I grew a little older.
According to the theories of Jean Piaget, a Swiss psychologist (1896-1980), children between the ages of 7 and 11 go through the “concrete operational” stage of thinking (https://www.verywellmind.com/piagets-stages-of-cognitive-development-2795457). Piaget’s description of this stage of a child’s life rings true for me. As I coped with covert abuse in my newly blended family, I battened down the hatches of my mind, paying little attention to the world beyond my immediate realm. Concrete thinking involves a primary focus on the “how” of a situation rather than the “why.” In other words, how can I deal with what’s right in front of me, rather than seeking answers to the more abstract question of “why?” While both approaches are useful in different situations, children in the concrete stage of development haven’t yet learned how to transition easily between the two. As I struggled with my problems on my own (believing my concrete, rudimentary coping skills would have to suffice), I was relieved to leave the broader philosophical questions to my older siblings. My brother, six years my senior, became a conscientious objector to the Vietnam War; my sister, three years older than me, became a hippie in the truest sense of the word.
When John Kennedy was killed in 1963, I was in the third grade. At eight, I didn’t fully understand the tragedy or the political fallout of his death. I only knew school closed early that day and I went home, where I found my mother weeping as she stared at our black-and-white TV. At that age, kids have (usually) developed a degree of empathy; they can recognize that other people’s thoughts and feelings don’t always mirror their own; and they know enough to deal with the immediate facts in front of them. Apart from that, not much independent, abstract thinking tends to go on. In the case of my mother crying on the sofa that November day, I tried to comfort her with my puppy-dog presence and a setting aside of my own needs rather than with any eloquent words of sympathy. I was good at that—setting aside my own needs.
In April 1968, Martin Luther King, Jr. was murdered in Memphis; if his death garnered more than a few tears from the citizens of our small Southern California town, made up mainly of Whites and Latinos, I wasn’t aware of it. Of course, as I became more educated in later years, MLK’s death did become more meaningful and significant to me. I offer that as a reminder that education—as long as it’s based on accurate facts—can serve to enhance our empathy and understanding.
It’s embarrassing, but true, that the “civil rights movement” meant little to me when I was a kid in a predominantly conservative Southern California town. There were those in our area who spoke in derogatory terms about the Latinos, while simultaneously entrusting them with all the pesky domestic jobs they didn’t want to do for themselves. In my house, the adults were split: my mother was a liberal who spoke kindly of other races and cultures; her husband was not. Years later, in college, I minored in Spanish, in part as an acknowledgement of the large number of people surrounding me who spoke Spanish as their first language.
Two months after Martin Luther King was murdered, Bobby Kennedy was also shot and killed. It happened in Los Angeles, a mere 100 miles north of my home. For the first time, I grieved a political event. My grief was nebulous, not tied to any strong political ideology of my own; I only knew Bobby seemed like “a good guy,” and I’d hoped he’d get elected and be successful at ending the Vietnam War, as he proposed to do. Spurred by the sadness in my 12-year-old heart, I saved the entire issue of the Los Angeles Times from the day of RFK’s death—June 6, 1968—as if the sincerity of my grief could compensate for my powerlessness in the face of the dismal, bloody late 1960s.
In Piaget’s theory of development, children arrive at the “formal operational stage” of thinking at approximately age 12. In this stage, they begin to use abstract thinking skills more often, rather than seeing the world in strictly black-and-white, egocentric terms. Concrete thinking gives way to deductive reasoning; gradually, moral, ethical, social and political issues play a larger part in teenagers’ thoughts. Still, the transition to thinking more abstractly, seeing the bigger picture of a situation—i.e. thinking “outside the box”—doesn’t always adhere to a strict theoretical timetable; nor is it guaranteed to happen at all. (More on that later.)
In the summer of 1968, shortly before my thirteenth birthday, a group of women gathered in Atlantic City to protest the Miss America Pageant. The protestors threw a variety of garments intended to symbolize men’s oppression of women—e.g. high heels, bras, girdles—into a bin they dubbed the “freedom trash can.” Although no bras were burned that day—or ever, for that matter—someone in the media coined the term “bra burners” to describe the women who protested that day. From then on, the demeaning catchphrase was repeated so often that people began to believe it was a fact.
The implications of this protest—both as an ominous rejection of the values of my mother, grandmother and every female ancestor who’d preceded them, and as the dawn of a new era I’d soon enter as an adult—eluded me. For one thing, like many girls poised on the edge of adolescence, for as long as I could remember, I’d been looking forward to wearing a bra. I saw it as a privilege, a rite of passage out of childhood, a step toward womanhood. Now, from the vantage point of a woman closer to the end of my life than to dewy-eyed adolescence, I can appreciate the other point of view. Decades after women in the U.S. fought for the right to vote, the brave women who risked mockery and ridicule to revive the women’s rights movement in the 1960s viewed the garments they discarded as symbols of male control over women’s sexuality. (Keep it under wraps, girls, except when we’d rather you didn’t.)
In 1972, when I was a junior in high school, the historical significance of the passage of Title IX went right over my head. Title IX prohibited discrimination on the basis of sex in athletics and education in any program that received funding from the federal government; it required program administrators to respond to accusations of inequity. Because I attended a private, all-girls Catholic school until my senior year, the fight for equity in sports and education didn’t seem to apply to me.
Early the following year, in January 1973, the Supreme Court ruled in Roe v. Wade that the right to privacy under the Fourteenth Amendment protected a woman’s right to decide whether to have a clinical abortion. While I definitely knew about this event, its historical significance meant little to me at the time.
As a freshman at San Diego State University in 1973, I had no clue that SDSU had the first Women’s Studies program in the country. I also had no idea that the program had been established only three years before I arrived, or that the perseverance of a 22-year-old student, Carol Rowell Council, was responsible for the program’s existence. In her memoir, The Girl at the Fence, Council recounts the opposition she and the program’s co-founder, Joyce Nower, a young professor, faced. For example, an older professor—a woman—believed that while “pockets of ignorance” remained about women’s rights, a Women’s Studies program was unnecessary.
I knew none of this at the time; but I did enjoy the classes I took in the department, especially those that intersected with my traditional English degree, in which the books I was assigned to read were written almost exclusively by men. For my Women’s Studies courses, I read novels by writers I might never have known about otherwise—among them, Alice Walker, Toni Morrison, Maya Angelou, Erica Jong, Margaret Atwood, Doris Lessing, and Joyce Carol Oates. I read nonfiction by Gloria Steinem, Germaine Greer, Betty Friedan, Susan Brownmiller and Simone de Beauvoir, among others. My professors encouraged me to consider the traditional and historical roles of women, and how those societal expectations of women impacted my mother, her mother, my sisters and me. That was the first time I heard the phrase “the personal is political.” And it began to dawn on me that the unhappy upbringing I’d endured, subjected to the misogyny and toxic sexism of my mother’s second husband, was bigger than the confines of my former home. Once I became aware that other girls and women had undergone—were undergoing—the same or similar indignities I’d suffered, it gave me hope. That may seem paradoxical, but it’s true. Yes, I’d been mistreated, but I wasn’t alone.
According to the NWHA (https://nationalwomenshistoryalliance.org), the women’s rights movement in the United States began in July 1848:
On a sweltering summer day in upstate New York, a young housewife and mother, Elizabeth Cady Stanton, was invited to tea with four women friends…A convention to discuss the social, civil, and religious condition and rights of woman [took] place at the Wesleyan Chapel in Seneca Falls on July 19 and 20, 1848. In the history of western civilization, no similar public meeting had ever been called.
Frederick Douglass, a former slave and staunch advocate for the abolition of slavery, was the only African American to attend the historic meeting. He saw the parallels between the enslavement of Blacks and the plight of women in what was intended to be a democratic society.
The women’s list of grievances was long:
Married women were legally dead in the eyes of the law; women were not allowed to vote; women had to submit to laws when they had no voice in their formation; married women had no property rights; husbands had legal power over and responsibility for their wives to the extent that they could imprison or beat them [or rape them] with impunity; divorce and child custody laws favored men, giving no rights to women; women had to pay property taxes although they had no representation in the levying of these taxes; most occupations were closed to women and when women did work they were paid only a fraction of what men earned; women were not allowed to enter professions such as medicine or law; women had no means to gain an education since no college or university would accept women students; with only a few exceptions, women were not allowed to participate in the affairs of the church; women were robbed of their self-confidence and self-respect, and were made totally dependent on men.
We know from history that the road to addressing all of these grievances has been long and arduous. Even the fight to secure women’s right to vote took 72 years beyond the date of that meeting in Seneca Falls to succeed. More than 100 years after that victory, Trump issued an Executive Order with the intent of creating obstacles to the right to vote. The Executive Order was blocked in litigation; however, its core requirements live on in the SAVE Act, which, if passed, would require (among other things aimed at various groups of people) women to produce a paper trail of their marital name change(s) as a condition of registering to vote; an estimated 69 million married women lack citizenship documents matching their current legal name and would be unable to obtain—easily or at all—the necessary documentation. As of December 2025, the bill is stalled in the Senate. In addition to these impending threats to voting rights, women’s constitutional right to early-term clinical abortion was rescinded by the Supreme Court in 2022.
If we’ve been paying attention, we can see that politics have always been personal. They impact all of our lives, although not in the same way across the board. In the past decade, the personal impact of politics has become increasingly obvious: an ethically and morally bankrupt* president spews vitriol towards immigrants, the disabled, military heroes, murder victims, reporters, reporters who are also women, women who threaten his masculinity, and a wide variety of women and men in other categories too numerous to list. (*But wait, you might ask, can someone be bankrupted of something they never had to begin with?)
Looking back now, letting the actual state of women’s rights—and human rights in general—in the era of my childhood percolate in my awareness, I find it mind-boggling. Not because things are so radically different or better now. In fact, I was inspired to write this post by the current escalation of attacks on human rights in general: women’s right to bodily integrity; immigrants’ rights to dignity; and families’ rights to the basics of affordable food, shelter and health care. As an older woman in a country whose leaders are disturbingly, even outrageously misogynistic, it’s jarring to remember those days of the 1960s and 1970s, because I really didn’t know where I stood. I naively thought my being a girl, and then a woman, was a politically neutral fact of life. It reminds me of the adage about fish: Who discovered water? We don’t know, but it probably wasn’t the fish; how could they notice what they’re swimming in when they’ve never been aware of anything else?
With all due respect to Piaget, as I contemplate the current state of political affairs in the United States, it seems clear to me that a great number of people remain concrete thinkers far beyond the age of 12. Concrete thinking limits a person’s ability to look beyond the surface of political rhetoric, to see the bigger picture, and to play out in the mind the long-term implications of our votes, or of our failure to exercise our right to vote. Thankfully, if we exercise what scientists call neuroplasticity (https://www.physio-pedia.com/Neuroplasticity), we can all improve our ability to see beyond our own noses. I doubt a majority of people in the United States truly want the current escalation of human rights violations—whether they be towards women, children, men, immigrants, people of color, the LGBTQ community, families, the disabled, the ill, the elderly, or anyone else—to continue. To borrow a metaphor (a prime example of abstract thought) from an old Bob Dylan song, “you don’t need a weatherman to know which way the wind blows.”

