Generation Me - Revised and Updated

Why Today's Young Americans Are More Confident, Assertive, Entitled--and More Miserable Than Ever Before

About The Book

In this provocative and newly revised book, headline-making psychologist Dr. Jean Twenge explores why the young people she calls “Generation Me” are tolerant, confident, open-minded, and ambitious but also disengaged, narcissistic, distrustful, and anxious.

Born in the ’80s and ’90s and called “The Entitlement Generation” or Millennials, they are reshaping schools, colleges, and businesses all over the country. The children of the Baby Boomers are not only feeling the effects of the recession and the changing job market—they are effecting change the world over. Now, in this new edition of Generation Me, Dr. Twenge incorporates the latest research, data, and statistics, as well as new stories and cultural references, to show how “Gen Me-ers” have shifted the American character, redefining what it means to be an individual in today’s society.

Dr. Twenge uses data from 11 million respondents to reveal shocking truths about this generation, including dramatic differences in sexual behavior and religious practice, and controversial predictions about what the future holds for them and society as a whole. Her often humorous, eyebrow-raising stories about real people vividly bring to life the hopes, disappointments, and challenges of Generation Me. Engaging, controversial, prescriptive, and funny, Generation Me gives Boomers and GenX’ers new and fascinating insights into their offspring, and helps those in their teens, twenties, and thirties find their road to happiness.

Excerpt

Generation Me - Revised and Updated

1. You Don’t Need Their Approval: The Decline of Social Rules
Getting dressed in the morning is a fundamentally different experience today than it was fifty years ago. For all of Generation Me’s lifetime, clothes have been a medium of self-expression, an individual choice in a range of alternatives and comfort. Contrast this to past decades, when men wore ties most of the time and women did not leave the house without crisp white gloves and a tight girdle. Pictures of crowds in the early 1960s show quaint sights such as men wearing three-piece suits at baseball games and ladies lined up in identical-length skirts. To GenMe, these images look like those of people on an alien planet—who wears a suit to a baseball game?

Even our shoes are different. Today’s casual shoes are called tennis shoes because people once wore them only to play tennis or basketball. Not even kids wore these types of shoes on the street—their shoes were made of stiff leather, just like adults’.

Now that’s all but forgotten. Except in the most formal of workplaces, few men wear suits to work, and virtually no one wears them to baseball games. Women have (thankfully) abandoned wearing tight girdles and white gloves everywhere they go (and many young women don’t even know what a girdle is, though some are devoted to Spanx, the GenMe version). The trend toward more informal dress has accelerated in the past ten years, with many companies opting for “business casual” and others going for just plain casual. The trend reached all the way to the top in July 2005, when about half the members of the Northwestern University women’s lacrosse team wore flip-flops during their White House visit, resulting in a picture of the president of the United States standing next to several young women wearing shoes once reserved for walking on sand or showering in scuzzy gymnasiums. Although most people still want to look good, we are a much more informal and accepting society than we once were. This is a perfect illustration of generational trends in attitudes, as the entire point of dressing up is to make a good impression on others and elicit their approval. You don’t dress up to be relaxed, natural, and happy.


Holiday card, Minnesota, 1955. Not only are the clothes formal, but so is the posing and demeanor. The perfect family was proper and composed.


Holiday card, Massachusetts, mid-2000s. Formal clothing is no longer necessary to make a good impression. It is now more important to dress for yourself or for your comfort; if you really wanted to do things “your way” and just for yourself, you’d wear jeans to work. Many of us already do.

The strict rules of previous decades went far beyond appearance. Beneath the wool suits and tailored hats, yesterday’s men and women were bound by another type of conformity. Male or female, you were considered strange if you did not marry by age 25 and even stranger if you married outside your race or religion. It was expected that you would have children—it was not considered a choice. Your race and sex dictated your fate and behavior. When war came, you went to fight if you were male and able. Overall, duty and responsibility were held more important than individual needs and wants. You did certain things, you said certain things, and you didn’t talk about certain things. End of story.

Today, few of these rules apply. We are driven instead by our individual needs and desires. We are told to follow our dreams, to pursue happiness above all else. It’s okay to be different, and you should do what’s right for you. The phrase my needs was four times as common in American books in the 2000s as in the 1960s. Young people today are only half as likely as those in the late 1980s to believe that children should learn obedience above all else. Baby boys in the 2010s were only one-third as likely as those in the 1950s to receive one of the ten most popular names. These changes are not clearly good or clearly bad, but they do indicate a strong shift toward individualism.

The choices of the individual are now held so paramount that the most common advice given to teenagers is “Just be yourself.” (Not that long ago, it was more likely to be “Be polite.”) This started with Generation X: Filmmaker Kevin Smith says, “My generation believes we can do almost anything. My characters are free: no social mores keep them in check.” Or take Melissa, 20, who says, “I couldn’t care less how I am viewed by society. I live my life according to the morals, views, and standards that I create.”

This is the social trend—so strong it’s a revolution—that ties all of the generational changes together in a neat, tight bundle: do what makes you happy, and don’t worry about what other people think. It is enormously different from the cultural ethos of previous decades, and it is a philosophy that GenMe takes entirely for granted. “As long as I believe in myself, I really do not care what others think,” says Rachel, 21.

GENERATIONS AT THE CINEMA

The ethos of self-belief appears frequently in popular movies; my favorite examples involve what I call “the apparent time traveler.” The main character in these films is supposed to be a real person in the 1950s, but he or she actually represents the enlightened voice of the 21st century, which makes him (or her) the hero of the film. These movies were ubiquitous in the 2000s, when much of GenMe was forming its view of the world. In 2003’s Mona Lisa Smile, Julia Roberts plays a professor at Wellesley College in 1953. Soon after arriving, she rallies her students against the restrictions of early marriage and training for motherhood. When she critiques sexist advertising during a class, the modern audience knows exactly what she is doing, but few people in the 1950s would have seen such a critique before—or even thought to make one. Roberts’s character has clearly taken the time-traveler shuttle to the future and absconded with a copy of the 1987 feminist antiadvertising film Still Killing Us Softly.

The Majestic, released in 2001, is an even worse movie. Jim Carrey’s character, a Hollywood screenwriter, gets blacklisted and takes refuge in a small town. After he is asked to testify, he convinces the entire town that McCarthyism is bad and that free speech is our most treasured right. The whole town unites behind the accused writer, and the main female character says, “It doesn’t really matter if you are a Communist or not—this is America and you can be one if you want to. It’s nobody’s business.” Uh, not really. Had this actually been the 1950s, an accused Communist would have been everybody’s business. This viewpoint was common even in the 1970s, when 48% of Americans believed Communists should not be allowed to give a speech, teach at a college, or have a book in a local library.

Movies that admit to time travel are somewhat more enjoyable. In Pleasantville, two modern teenagers help a 1950s town find passion and the freedom of ideas. Every character who discovers an individualistic freedom such as sex or intellectual questioning instantly turns from black and white into color. The film sinks into predictability once discrimination against the “colored” people begins. (Get it?)

Other movies travel across cultures rather than time, but they promote the same message. In 2002’s Bend It Like Beckham, an Indian girl living in London wants to play soccer. Her parents, already taken aback that their older daughter did not have an arranged marriage, want Jess to learn to cook and be a proper young lady. The plot comes to a head when Jess must shuttle back and forth between a game and her sister’s wedding. By the end of the movie, Jess wants to join a professional women’s soccer team and move to America. Her parents, finally convinced that it’s right for Jess to follow her dreams, reluctantly agree. The overall message of all of these movies—whether they travel in time or cultures—is to rebel against restrictive social mores. Don’t follow the rules; do whatever makes you happy.

And sometimes you don’t even need to travel. The biggest box-office draw in late 2004 and early 2005 was Meet the Fockers, the sequel to the highly successful comedy Meet the Parents. The movie revolves around the culture clash between the conservative Byrnes family and the hippie Focker family. The Fockers provide most of the comedy in the film, with their sex-therapy business, their leather sandals, and their display of their son’s ninth-place ribbons (because, they say, “It’s not about winning—it’s about what’s in your heart”). But by the end of the movie, the Fockers are not the ones who have been convinced to change—it’s the straitlaced Byrnes family who learns from them. Mr. Byrnes, played to crusty perfection by Robert De Niro, learns to loosen up and show emotion toward his daughter. He also decides that it might be good for him and his wife to enjoy more physical affection in their marriage, and he puts some of Mrs. Focker’s sex tips to good use. Hippies may be laughable, but they teach us how to live. No need to walk around all uptight like that—which you must be if you’re not a hippie. I’m exaggerating a bit, but the movie does make it clear which life philosophy is correct, and it’s definitely Let It All Hang Out.

These movies dramatize two interlocking changes: the fall of social rules and the rise of the individual. As the individualistic viewpoint became prominent, concern with the opinions of others plummeted. This chapter discusses the decline in the need for social approval, and the following two chapters document the ascendance of the individual self. Over the last few decades, the entire nation has experienced the transformation parodied in an episode of The Simpsons, when Springfield’s usual Do What We Say Festival (started, they say, in 1946 by German settlers) is replaced with the new Do What You Feel Festival.

DO YOUR OWN THING

Imagine you are seated at a table with six other people. Four lines are drawn on a chalkboard at the front of the room: a medium-length target line, plus line A (medium), line B (short), and line C (long). Your task is to say which of the three lines is the same length as the target. You’re all ready with the obvious answer of A, but the six others go first and each says line C. What do you do?

When Solomon Asch first performed this experiment in 1951, 74% of people gave the group’s incorrect answer on at least one trial, and 28% did on the majority of trials. People felt the need to conform to the group and not to stand out. The study became one of the most famous in social psychology, taught in every class as an example of the social nature of human beings. Yet some have pointed out that this was the essence of getting along in 1950s society, when no one wanted to be thought of as different. But when researchers tried to replicate the study in 1980, they got completely different results: few people conformed to the group anymore. Apparently, it was no longer fashionable to go along with the group even when they were wrong. The authors of the study concluded that the Asch study was “a child of its time.” A similar thing happened when a psychologist tried to replicate the Milgram obedience experiment, an early-1960s study finding that people would shock someone else at dangerous levels when told to do so by an authority figure. In 2009, nearly twice as many men refused to obey the experimenter’s orders as in the original study.

Throughout the 1970s, self-help books and therapists actively encouraged people to flout social rules, telling readers they should stop caring about what others think. A central chapter in the 1976 megabestseller Your Erroneous Zones, by Wayne Dyer, is called “You Don’t Need Their Approval.” The author argues that people can do anything they put their minds to, and that others’ opinions only get in the way. (It’s probably no coincidence that both the cover and back of the book feature oversize pictures of the author, complete with a 1970s powder-blue V-neck shirt and the resulting display of male chest hair.) Dyer rants on and on about how courteous acts such as giving a wedding gift or attending a funeral are “musterbation,” his double-entendre term for unnecessary social rules. Dyer argues that seeking approval from parents, teachers, and bosses undermines self-reliance and truth. “Needing approval is tantamount to saying ‘Your view of me is more important than my own opinion of myself,’” he writes. Another self-help book carries on the tradition with the title What You Think of Me Is None of My Business. Unlike the Baby Boomers, who learned these new standards as adults, GenMe takes these attitudes for granted and always has.

“Just be yourself” is the central ethos of modern parenting. In 1924, a group of sociologists did an extensive study of the citizens of a place they called Middletown (later revealed as Muncie, Indiana). When mothers were asked which traits they wanted their children to have, they named strict obedience, loyalty to church, and good manners. In 1988, when the first wave of GenMe were young children, few mothers named these traits; instead, they chose independence and tolerance. Modern mothers might be gratified to learn that these values sank in. In Growing Up Digital, an 11-year-old girl says, “I think the individual determines what is cool, and it is his or her opinion. What is cool to one person might not be to another. The days of conformity are over.” Danielle, 29, agrees: “I refuse to do something because it’s what everyone else is doing, or because it’s the socially acceptable thing to do at the time.” When I asked my undergraduate students to name the characteristics that best described their generation, the two most popular answers were “independent” and “open-minded.”

GenMe has been taught these values since birth—beginning with the unique names bestowed upon them. Like just about everyone else, I’d noticed that baby names seemed to be getting stranger every year. When my husband and I were naming our first child in 2006, I discovered the Social Security Administration’s database of 325 million Americans’ names going back to the 1880s. So I had to see if there was a generational change. Sure enough, the parents of GenMe’ers, and GenMe’ers themselves, were more likely than those in previous eras to give their children unique names (so they could stand out) instead of common names (so they could fit in). In 1950, 1 out of 3 boys received one of the top 10 names. By 2012, less than 1 out of 10 did. Girls receiving a common name dropped from 1 out of 4 to less than 1 out of 10. (We also controlled the analyses for immigration and looked within states with low Latino populations, such as North Dakota and Mississippi, to make sure that ethnic changes didn’t account for the effects, and they did not.) By 2012, new parents—the majority of whom were GenMe—took things a step further to proclaim their child’s greatness. The boys’ names that increased the most in popularity between 2011 and 2012 included Major, King, and Messiah. Somewhat high expectations to put on a newborn.

As Jaden, 25, puts it, “For my grandparents, questioning their religion, their country’s system of government, or what they ate was not acceptable. The fear of standing out or being judged by others for their beliefs was strong. My generation is much more independent. I pride myself on being a free and independent thinker. My wish is to break down the walls that humans have socially constructed.” A book on generations in the workplace notes that today’s young people were instructed to “Never just do what an adult asks. Always ask, ‘Why?’” Some people say this should be the label for the generation—not Generation Y, but Generation Why?

At times, this attitude can lead to the more questionable idea that there are no rules, so you might as well make up your own. In interviews of 18-to-23-year-olds conducted in 2008 for his books Souls in Transition and Lost in Transition, Christian Smith found that most young Americans espouse “moral individualism,” believing that morality is a personal choice. “I have no other way of knowing what to do morally but how I internally feel. That’s where my decisions come from. From me, from inside of me,” said one. So should people follow rules about what society says is right or wrong? the researchers asked. “I think it’s your personal belief system,” said another young person. “I don’t think it’s anything like social norms or like that. I think it’s just . . . dependent on each person and their own beliefs and what they think is right or wrong.”

Thus, it follows that everyone has his or her own individual moral views, and it’s not right to question someone else’s view. “I guess what makes something right is how I feel about it, but different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and what’s wrong,” said one young man.

This moral individualism can easily become, as Smith puts it, a “live and let die” philosophy. When asked if people have any moral responsibility or duty to help others, one young person replied, “No, not really.” Would it be a problem if someone didn’t want to help others? asked the interviewer. “No. . . . They can help themselves. . . . Do they really need anyone else?” he replied. “So if someone asks for help, we don’t have an obligation to them?” prodded the interviewer. “Yeah, it’s up to each individual, of course,” the young adult asserted.

Smith concludes that most emerging adults seem unaware of any source of moral reasoning outside of themselves. “Instead . . . the world consists of so many individuals, and each individual decides for themselves what is and isn’t moral and immoral,” Smith writes. “Morality is ultimately a matter of personal opinion. Everyone should tolerate everyone else, take care of their own business, and hopefully get along.” This is the razor’s edge of modern individualism: tolerance is great, but perhaps not when each individual is free to decide for himself which rules to follow, and helping others is rarely one of those rules.

What about all of the GenMe’ers who are serving in the military, and who served in Afghanistan and Iraq when we were all sitting safe at home? Military service can certainly be an example of self-sacrifice, duty, and collectivism. However, the data suggest that GenMe service members are the exception, not the rule. According to the Pew Center, only 2% of GenMe has served in the military, compared to 6% of GenX and 13% of Boomers. Polls of 16-to-24-year-olds conducted by the Department of Defense show that fewer now say they are likely to join the military: 18% expressed interest in 2010, down from 26% in 1986. This is partially because many more young people automatically rule out military service. In a nationally representative sample of high school students, 2 out of 3 (67%) said they “definitely won’t” join the military in 2012, up from 57% in 1976. This does not diminish the contributions of the GenMe’ers who do serve, but it contrasts them with the majority of their generation.

One upside to the individualistic attitude is lessened prejudice and discrimination. Amanda, 22, says that one of the main lessons in her Girl Scout troop was “being different is good.” It’s a mantra GenMe has heard over and over. They absorbed the lesson of tolerance with their baby food—not just for race and religion, but for sexual orientation. It also extends to beliefs, feelings, and all kinds of other intangibles. Just about the only difference that wasn’t good? Someone who was prejudiced.

That’s exactly what appears in our recent analysis of data from the nationally representative General Social Survey. Boomers set in motion strong trends toward tolerance of groups such as Communists, gays and lesbians, and those who oppose religion. Generation Me continued those trends throughout the 2000s and 2010s, but diverged from Boomers in one major way: they were less tolerant than Boomers toward someone who claimed that blacks are genetically inferior. GenMe is thus the most tolerant generation in American history—the only people they will not tolerate are those who are intolerant themselves.

WHO CARES WHAT YOU THINK?

Not caring what others think may also explain the apparent decline in manners and politeness. GenMe’ers do not believe there is one right way of doing things, and most were never taught the rules of etiquette. When that means wearing white shoes after Labor Day and using whatever fork you want, no problem. But most etiquette was developed to provide something often lacking in modern life: respect for other people’s comfort. “Society has gotten increasingly callous and me-centered, and we’re fed up with [the results],” says Corinne Gregory, founder of a class called the PoliteChild. A high school teacher told me that she noticed her students don’t “clean up nice”—they find it difficult not to swear and to speak more formally when necessary. They talk to older people and authority figures the same way they talk to their friends. A business book relates the story of a company founder who visited one of his shops and asked a young employee how she was doing. “Well, a little hungover this morning, but okay,” she replied.

A recent article related numerous stories of young job applicants’ lack of perspective, from answering their cell phones during the interview to bringing their parents. Jaime Fall, vice president of the HR Policy Association, says GenMe’s mind-set is “You’re perfect just the way you are—do whatever you’re comfortable doing”—an attitude that can backfire in interviews. “Life has gotten more casual,” observes Mara Swan, executive vice president at Manpower. “They don’t realize [the interview] is a sales event.”

It goes beyond manners—people today are less likely to follow all kinds of social rules. Business professor John Trinkaus finds that fewer people now slow down in a school zone, and fewer observe the item limit in a supermarket express lane. More people cut across parking lots to bypass stoplights. In 1979, 29% of people failed to stop at a particular stop sign in a New York suburb, but by 1996 a stunning 97% of drivers did not stop at all. In Trinkaus’s most ironic finding, the percentage of people who paid the suggested fee for lighting a candle at a Catholic church dropped from 92% to 25% between the late 1990s and 2006. In other words, 75% of people cheated the church out of money in the most recent observation.

Cheating is also rampant among students. A 2008 study found that 95% of high school students said they had cheated. That included 64% who had cheated on a test by copying from someone else or using crib notes. The rest merely told classmates what would be on a test, but, according to researcher Donald McCabe, most students don’t even count that as cheating. Another survey found that 34% of high school students admitted to cheating on an exam in 1969, a figure that rose to 61% in 1992 (GenX) and to an incredible 74% in 2002 (the first wave of GenMe). Fortunately, fewer in the second wave of GenMe, 51%, reported cheating on an exam in 2012. Of course, that’s still a majority and may underestimate the actual number. High levels of cheating continue into college: a 2002 survey found that 80% of students at Texas A&M University admitted to cheating, and a 2007 poll of students at 12 different colleges found that 67% admitted to cheating.

Although competition for grades may have fueled the increase, attitudes have shifted along with the behavior. In a 2012 study of 25,000 high school students, 57% agreed that “in the real world, successful people do what they have to do to win, even if others consider it cheating.” In other words, the majority believed that the ends justified the means. McCabe has found this attitude especially prevalent in business schools, characterized by a “get-it-done, damn-the-torpedoes, succeed-at-all-costs mentality.” “According to my research,” McCabe wrote in Harvard Business Review, “the mind-set of most MBAs—bottom line—is to get the highest GPA possible, regardless of the means. After all, the students with the highest GPAs get the best shot at the six-figure jobs.”

This breakdown in consideration and loyalty, and the increase in cheating, reaches all the way to the top. Business scandals, such as those at WorldCom and Enron, demonstrated that many people have little problem with breaking rules and telling lies in an attempt to make more money. The mortgage meltdown of the late 2000s was a quite spectacular example of this as well, with banks continuing to get rich as ordinary Americans lost jobs and had their homes foreclosed. In psychology and medicine, several researchers were recently shown to have published dozens of papers based on fraudulent data. Even honest businesses disregard other time-tested social rules, such as loyalty to employees. Companies are now more likely to raid pension funds and engage in mass layoffs to prop up a sinking stock price. Others ship jobs overseas if it will save money. “Downsizing” and “outsourcing” are the modern corporate equivalents of rudeness—and a lot more devastating. Because GenMe’ers grew up with this kind of ruthlessness, it should not surprise us that they think little of some occasional homework copying. It also suggests that the corporations of the future are going to need much stricter oversight to make sure that cheating and scams are kept to a minimum. Cheating on tests easily translates to cheating on the balance sheet. Expect to see more laws like Sarbanes-Oxley that ask corporations to prove that they are not cheating their stockholders. Even with these laws, more stock reports, research, and articles will have to be taken with a grain of salt—in an increasingly competitive world, the temptation to cheat will be ever stronger for GenMe.

CALL ME BETH

Boomers laid claim to the phrase question authority during the 1960s.

But GenMe doesn’t just question authority—they disregard it entirely.

“Older generations trusted God, the church, government, and elders,” says Kevin, 22. “I have questioned things and people that earlier generations never would have thought to.” This is the eventual outcome of increased informality and the loosening of social rules, and many people would rightly argue that questioning things is good. Sometimes “traditions” are outmoded and need challenging.

But sometimes GenMe takes the questioning of authority a little too far. Education professor Maureen Stout tells the story of a young man in her class who did not turn in his research paper. “After a lot of excuses and arguments he finally came out with it,” Stout writes. “He believed he was entitled to do just as he pleased and refused to recognize my authority, as the instructor, to determine what the assignments in the class should be. It was as simple as that.” Former journalist Peter Sacks related his frustration with the community college students he taught in his second career, observing that they seemed uncomfortable with “the idea that my knowledge and skills were important or even relevant.” Student after student balked when he corrected their essays, several complaining that his comments were “just your opinion.”

I recognized this phrase immediately, as I’d heard it over and over from my own students. I heard this complaint even when I corrected obvious errors such as run-on sentences and incorrect punctuation, things that were clearly not a matter of opinion. Even multiple-choice tests weren’t free from this kind of challenge. In one class, I decided it might be a good idea to review the correct answers to exam questions—it would be a way to correct misconceptions and help the students learn, I thought. Almost immediately, several students began to argue with me about the questions, claiming that the answers they had chosen were right. Since there wasn’t a grading mistake, I was forced to explain again why the answers were correct, but they continued to argue. It was the worst class I’d ever had. After it was over, an older student—who had not been one of the arguers—came up to me and said with disbelief, “Twenty years ago when I got my first degree, we never questioned teachers like that.”

Apparently I was not alone. In a recent survey of college faculty, 61% reported that students had “verbally disrespected you or challenged your authority during class.” Sixty-five percent said a student had “continuously rolled his/her eyes, frowned, or otherwise showed disdain while you were teaching.” Many students have found their cell phones ringing during class—an honest mistake—but the new twist is to answer and conduct a conversation. Sixty-one percent of professors said they’d experienced this. They were the lucky ones: 24% said they had received “hostile or threatening communications (e-mails, letters, phone messages) from a student,” and 29% said “a student yelled or screamed” at them.

New teaching philosophies sometimes explicitly acknowledge faculty’s lack of authority. When Sacks, the community college professor, complained to a colleague about the lack of respect he experienced, she advised him to adopt the more informal approach that she used. In her first class, she always announced, “I have some expertise and you have some expertise. My job is to facilitate this process. And please call me Beth.”

The message: We are all equals here. I might have more education and years of work experience, but that doesn’t mean I know any more than you. This attitude explains much of the crumbling of authority and the new acceptance of questioning those in charge. It can benefit the free exchange of ideas and engaged student learning, but it clearly has downsides as well. This new democracy in education and the workplace has been energized by the new informality in dress and names. While the boss was once “Mr. Smith” or “Mrs. Jones,” bosses are now “Mike” or “Linda.” Mr. and Mrs. sound too stiff and formal—and old-fashioned. When we’re all on a first-name basis, the specter of authority takes yet another step back into the shadows of a previous era. That can bring us closer, but it can also set the stage for more disrespect and conflict.

The curriculum reflects this lack of a central authority as well. It is no longer enough to teach only the “classics,” now dismissed as the work of DWMs (Dead White Males). Few academics still agree that there is a “canon” of Western literature that all students should learn. Instead, students must take classes teaching a variety of perspectives, in which the works of women and minorities are also covered. Whether you agree or disagree with this “multicultural” approach to education, it’s clear that we no longer answer to one definite authority. There are many opinions, and each is considered valuable. Though this has many advantages, it does mean that people will be much less likely to conform to societal rules—after all, which rules would they follow? Which culture or society is “right”? GenMe is taught that none of them is, or all of them are.

Unless it’s the Internet. Like most people old enough to remember a pre-Internet world, I marvel that we ever got along without it. (How did we find movie showtimes in the early 1990s? Oh, yeah, that weird recording where a teenager with acting aspirations would read off the movies and times.) As fantastic as the Internet is for research, it also democratizes the sources of information. Suddenly, you don’t have to write a textbook or have a column in a major newspaper for thousands of people to read your words—just put up a Web page or a blog, and eventually someone, and maybe even lots of people, will stumble across it. In this environment, there is no authority: information is free, diffuse, and comes from everyone. (Whether it is correct is another matter.) In many Internet situations, you can abandon social roles entirely. Want to be a different age or sex? Go ahead. As a famous New Yorker cartoon showing two dogs in conversation puts it, “On the Internet, nobody knows you’re a dog.”

Parental authority also isn’t what it used to be. “Parents are no longer eager to be ‘parents.’ They want to love and guide their children as a trusted friend,” says family studies professor Robert Billingham. Chicago-area parent Richard Shields says that his 17-year-old son is his best friend. He prefers that the two of them have fun together rather than imposing strict rules or discipline. “It’s better for them to see our values and decide to gain them for themselves,” he says.

This also means that children play a much larger role in family decisions. The kids who chose their own outfits as preschoolers have grown into teenagers who help their parents choose which car to buy or even where to live. A Chicago Sun-Times article that interviewed a large group of teens and their families found one in which a teenage daughter helped her father decide on a new job, and another in which the two teenage kids made all of the home-decorating and electronics-purchasing decisions. Forty percent of teens see their opinions as “very important” in making family decisions. In an earlier era of greater parental authority, that percentage would have been close to zero. One family’s two daughters convinced their parents to buy a second car. “I always stress to my girls to be opinionated,” said Christine Zapata, the girls’ mother. “I guess that sort of backfires on me sometimes.”

I wonder what will happen when this generation has children of its own. Will GenMe parents continue the move toward lesser parental authority, or will they insist on retaining the authority they have grown accustomed to? If GenMe teaches our own children to be individualistic as well, we may have a full-scale battle of wills once our kids become teenagers themselves.

BEING DIFFERENT IS GOOD, EVEN WHEN YOU’RE GETTING MARRIED

As one of society’s most long-lived traditions, marriage and weddings illustrate the move away from social rules better than anything else. In 1957, 80% of people said that those who didn’t marry were “sick, neurotic, or immoral.” Now, when and whether you marry is considered a personal choice. Many do not: in 2012, 41% of babies in the United States were born to unmarried women—compared to 5% in 1960. Among women under 30 who gave birth in 2009, the majority were unmarried. The social rule that you should be married before you have a baby has all but fallen by the wayside. Many of these couples live together, but 39% of cohabiting couples break up within the first five years of a child’s life (compared to 13% of those who are married). Art has imitated life: single mothers are portrayed more often on TV, and reactions have shifted from outrage in the early 1990s (Murphy Brown) to barely a peep in the 2000s (Friends). By 2013, the sitcom The New Normal portrayed a gay couple having a child using a surrogate—just as several gay celebrities (such as Elton John and Neil Patrick Harris) have done. Overall, this generation is much more likely to accept that there are many ways to make a family. When nearly half of babies are born to single parents, who has time to criticize them all?

Whom you marry is also much more up to the individual. My parents, a Catholic and a Lutheran (though both white and alike in every other way), had a “mixed marriage” when they wed in 1967. People in my mother’s Minnesota hometown whispered about it behind cupped hands for weeks. Now this religious difference would be considered too minor to even be discussed.

Interracial marriage has become much more common, more than doubling since 1980 and accounting for more than 1 in 7 US marriages in 2010. Yet until the Supreme Court struck down antimiscegenation laws in 1967, whites and blacks could not legally marry each other in sixteen states. The last antimiscegenation law was not officially repealed until November 2000, in Alabama. Now these unions are everywhere, and between almost all ethnicities and races. My next-door neighbors for three years were a Mexican American man and his half-Jewish, half-Italian wife, and I’ve lost count of the number of Asian-white marriages among people I know. Almost half of Asian women will marry a white man. In 2012, 86% of Americans—including 93% of GenMe—agreed “I think it’s all right for blacks and whites to date each other,” up from 48% in 1987. Sixty percent of twentysomethings said they had dated someone from a different racial or ethnic background. In 2009, only 36% of Boomers said that more people of different races marrying one another has been a change for the better, compared to 60% of GenMe. Asked if they would be comfortable with someone in their family marrying someone of a different race, 55% of Boomers said yes, compared to 85% of GenMe.

Many young people I’ve talked to mention interracial dating as the biggest difference between them and their parents: many of their peers date across racial lines, but their parents don’t agree with this. Several young women from Texas and North Carolina told me that if they dated a black man, their fathers would meet the poor guy at the door with a shotgun. Yet most of GenMe finds this perplexing: Who cares what race someone is? In one survey, only 10% of white young people said that marrying someone from their own ethnic group was important; however, 45% said it was important to their parents. Of young Asian Americans, 32% said same-ethnic-group marriage was important to them, but 68% said it was important to their parents. As YouTube star Kevin Wu, 19, said, “My parents like to constantly remind me that when I grow up I have to marry an Asian wife. Which is okay, I like Asian women, but I don’t like narrowing my options. Girls are like a bag of M&Ms—they’re all different colors on the outside, but on the inside, they’re all the same, and they all taste good.” He continues, “No one opens a bag of M&Ms and goes, hey, I’m only eating the yellow ones. You know why? Because that’s racist. . . . I think the only way we can stop racism is to have more interracial babies.” Interracial marriage is likely to become even more common in the future as more and more young people meet and date people from different backgrounds.

When we marry our other-race, other-religion, and possibly same-sex partners, we don’t follow all of the wedding rules of previous generations. In the mid-1960s, Brides magazine insisted that “the only correct colors” for wedding invitations “are white, ivory, or cream, with absolutely no decorations such as borders, flower sprays, and so on.” In other words, your invitation had to look just like everyone else’s. Now people use wedding invitations in every possible theme and color—and wording. When my parents lived near Dallas, they received a wedding invitation with a picture of a cowboy and cowgirl inviting guests to “c’mon over for a big weddin’ to-do.” The reply-card choices were “Yes, we’ll be there with our boots on” and “Shucks, we can’t make it.”

People are bending tradition in other ways. Some brides with male friends have a man of honor, and some grooms have best women. Another trend encourages brides to let each bridesmaid choose the style of her gown—it’s no longer required that they all wear the same dress, a rule now seen as overly conformist. Many couples write their own vows, wanting a personalized ceremony that speaks to their individual love. The new trend in wedding photography is “journalistic” style: the photographers capture moments as they happen, putting less emphasis on formal posing. Weddings aren’t about rules anymore, but about individual expression. Wedding gown designer Reem Acra says a bride should choose the look that encapsulates her personality. She says, “I always ask my brides, ‘Who are you and what do you want to tell everybody?’”


Weddings, once governed by strict conventions for dress and behavior, now have few rules. It’s your wedding, so you can wear shorts or a bikini if you want to.

THE CHURCH AND COMMUNITY OF THE INDIVIDUAL

What does the move away from social rules mean for religion? In Millennials Rising, Neil Howe and William Strauss predicted that those born in the 1980s and 1990s would be more committed to religion than previous generations, part of a Millennial-led return to duty, communalism, and rule-following. They cited the growing popularity of high school prayer circles and quoted a youth minister who observed that this group liked “old-fashioned” religion. Others have argued that GenMe has moved away from religious institutions, but still maintains a private religiosity and spirituality.

I wanted to find out whether this was true, so my coauthors and I examined six nationally representative surveys of Americans collected over time. Our analysis included data on 8th- and 10th-graders—groups too young to be included in previous studies of trends in religion—and a sample of entering college students going back to 1966 (the American Freshman Survey), which, with 10 million respondents, is the largest ongoing US survey with questions on religion. We also analyzed a survey of 12th-graders and two surveys of adults.

These massive datasets, with respondents aged 13 to 98, conveyed a clear conclusion: Americans’ religious commitment has declined precipitously, especially since 2000. Most Americans still affiliate with a religion, but the number who do not is growing so quickly that they may soon be the majority. The number of entering college students who named “none” as their religious affiliation tripled between 1983 and 2012 (from 8% to 24%) and doubled among 12th-graders (from 10% in the late 1970s to 20% in the 2010s). The number who say they “never” attend religious services doubled between the 1970s and the 2010s. More young teens are also growing up without religion in their lives: 38% more 8th-graders in the 2010s (compared to the early 1990s) claim no religious affiliation and never attend religious services. Assuming that earlier generations were just as religious as those in the 1970s, Generation Me is the least religious generation in American history.

Two mechanisms seem to be at work. First, more teens are being raised by nonreligious parents. For example, four times as many college students in the 2010s (versus the early 1970s) said their mother did not affiliate with a religion. Second, young people are leaving religion as they grow into young adulthood, and this tendency grew stronger over the generations.

In Soul Searching, his extensive survey of teens, Christian Smith found that intellectual skepticism was the main reason teens moved away from religion: “It didn’t make any sense anymore,” said one. “Too many questions that can’t be answered,” said another. Others could not say exactly why they became less religious or said they simply lost interest: “It never seemed that interesting to me” or “It got kind of boring.”

The flight from religion only accelerates during and after college. In 2012, 30% of Americans 18 to 29 claimed no religious affiliation—three times as many as in 1972. Only 14% of 18-to-29-year-olds attend religious services every week. Among high school seniors, most of whom still live with their parents, the figure is only 30%, down from 40% in 1976. GenMe’ers’ religious participation is still low even after they have their own children—nearly four times as many Americans in their 30s claimed no religious affiliation in 2012 as in 1972.

Young people have also lost faith in religious institutions. In the late 1970s, 62% of Boomer 12th-graders thought that churches and religious organizations were “doing a good job.” That slid to 53% in the 2010s. One out of three Boomer students in the late 1970s had already donated money to a religious organization; that number was nearly cut in half by 2012 (to 17%). GenMe’ers increasingly see religion negatively, so it makes sense that they are less willing to give their money to support it. In some cases, it’s because GenMe’s fundamental belief in equality (which we’ll cover in chapter 7) and free sexuality (chapter 6) is at odds with the teachings of many religions. “Starting in middle school we got the lessons about why premarital sex was not okay, why active homosexuality was not okay, and growing up in American culture, kids automatically pushed back on those things,” said Melissa Adelman, 30, in an interview with National Public Radio. “A large part of the reason I moved away from Catholicism was because without accepting a lot of these core beliefs, I just didn’t think that I could still be part of that community.” Even the pope has shown he understands this new reality; in 2013, Pope Francis said of issues such as gay marriage and birth control, “It is not necessary to talk about these issues all the time. We have to find a new balance.”

Some have argued that more Americans have moved away from religious institutions but are still privately religious or spiritual. That does not appear to be the case. The most stunning statistic comes from the nationally representative General Social Survey. Among those 18 to 29 years old in 1994 (GenX), only 2% never prayed. By 2012, 26% of 18-to-29-year-olds (GenMe) never prayed. Even when the question was asked as “prayer/meditation,” practice declined: 37% of college students never prayed or meditated in 2005, up from 33% just nine years earlier in 1996. The number of high school students who said that religion is “not important” in their lives increased 56% (from 14% in 1976 to 22% in 2012). Belief in God has also taken a hit. In 1994, 56% of 18-to-29-year-olds said they were sure that God exists; by 2012 that had shrunk to 44%. Between GenX and GenMe, belief in God went from winning the election to losing it. So it’s not just that GenMe has moved away from religious institutions; they are also moving away from private religious belief and practice.

The idea that religion is being replaced by spirituality also doesn’t hold up. The percentage of college students who described themselves as above average in spirituality declined from 59% in 1997 to 36% in 2012. When Christian Smith asked teens about spirituality in 2008, most did not even know what the term meant. (“What do you mean, ‘spiritual seeking’?” many asked.) Smith concluded that few American teens are spiritual but not religious.

The decline in religious commitment and belief is one of the few generational trends that differs significantly by race and social class. At least among high school students, the decline has hardly touched black GenMe—only slightly fewer attend church and profess a religious affiliation now compared to the 1970s. So it’s mostly the white kids who are singing “Losing My Religion.” But—perhaps surprisingly—it’s not the rich white kids. In the high school sample, religious participation declined the most among working-class youth whose fathers did not attend college.

Even among the majority of young people who affiliate with a religion, their beliefs are often rooted in what Christian Smith labels “therapeutic individualism.” Within this system, “spirituality is re-narrated . . . as personal integration, subjective feeling, and self-improvement toward individual health and personal well-being—and no longer has anything to do with, for example, religious faith and self-discipline toward holiness or obedience.” Even teens who identified as religious were, as Smith put it, “incredibly inarticulate about their faith, their religious beliefs and practices, and its meaning or place in their lives.” Many could not say why they affiliated with a religion; if they did, they said that they prayed for things they wanted. For many GenMe teens, he observes, “God is treated as something like a cosmic therapist or counselor, a ready and competent helper who responds in times of trouble but who does not particularly ask for devotion or obedience.” As a 14-year-old Catholic boy from Ohio put it, “Faith is very important, I pray to God to help me with sports and school and stuff and he hasn’t let me down yet, so I think it helps you.”

In Emerging Adulthood, Jeffrey Arnett describes the belief systems of young people as “highly individualized,” which he calls “make-your-own religions.” Many don’t adhere to a specific belief system because, as one said, “I believe that whatever you feel, it’s personal. . . . Everybody has their own idea of God and what God is. . . . You have your own personal beliefs of how you feel about it and what’s acceptable for you and what’s right for you personally.” When Smith asked one young woman how she decided which interpretation of Scripture was correct, she simply said, “My own.” A young man said he evaluated different religious claims with “pretty much just my authority.” These beliefs seem to be growing: in a 2013 poll, 3 out of 4 American Catholics said they were more likely to “follow my own conscience” on difficult moral questions rather than follow “the teachings of the Pope.”

Many young people abandon organized religion due to its restrictive rules. “Saturday nights I go out and hang out, and I don’t have to necessarily worry about getting up to go to church in the morning. It’s just a lot easier, I think, to leave certain things out,” one teen said in Souls in Transition. Interviewed in Emerging Adulthood, Dana said she attended Jewish services growing up, but stopped going when she got older because “there was this pressure from the people at the synagogue to be, like, kosher, and I just didn’t like having anyone telling me what my lifestyle should be.” Beth was raised Catholic but by adulthood came to believe that humans all have natural, animalistic urges; she stopped believing because feeling guilty “made me unhappy.” Charles grew up Episcopalian but stopped attending services because “I realized I was not being encouraged to think for myself. . . . It is, literally, ‘This is black. This is white. Do this. Don’t do that.’ And I can’t hang with that.”

Many churches with growing memberships are fundamentalist Christian denominations that do require stricter adherence. However, these churches promote a personalized form of religion. Many fundamentalist Christian faiths ask adherents to believe that “Jesus Christ is your personal savior” and that “He has a plan for your life.” Many speak about having “a personal relationship with God.” Rick Warren, author of the popular Christian book The Purpose Driven Life, writes, “Accept yourself. Don’t chase after other people’s approval. . . . God accepts us unconditionally, and in His view we are all precious and priceless.” These denominations teach that personal faith, not the good works you perform or the way you treat others (which traditionally defined a proper spiritual outlook and its rewards), guarantees acceptance into heaven. Even if you are a murderer, you will be saved if you accept Jesus as your personal savior. Most adherents strive to live good lives, but personal beliefs are considered more important.

Churches are not the only groups hurting for members. As Robert Putnam documents in Bowling Alone, memberships in community groups have declined by more than one-fourth since the 1970s. Groups such as the Elks, the Jaycees, and the PTA have all seen memberships fall. Putnam labels the trend “civic disengagement” and concludes that it is linked to generational shifts. The title of his book comes from the observation that people used to bowl in organized leagues but now bowl alone or in informal groups. Young people would rather do their own thing than join a group. Across the board, youth are now less likely to approve of or be interested in large institutions such as government, mass media, and religious organizations. In 1976, 36% of high school seniors said they had already written or would probably write to a public official, but by 2012 that had sunk to 19%. Twice as many said they probably wouldn’t (41%, up from 20%). And this is in a time when it’s easy to fill out an online form on a government website—yet fewer young people are interested in contacting public officials than in the days when that meant looking up an address at the library, typing a letter, addressing it, putting a stamp on it, and mailing it.

GenMe’ers are also less likely to trust their neighbors, and less likely to believe that the world is a welcoming place. In 1976, 46% of high school students said that “most people can be trusted” (versus “you can’t be too careful in dealing with people”). By 2012, just 16% of teens said they trusted others. In 2012, 47% of high school students said that most people are “just looking out for themselves” rather than “try[ing] to be helpful,” and 49% said that most people “would try to take advantage of you if they got a chance.” These were all-time or near-all-time lows in the 36-year history of the survey. GenMe trusts no one, suggesting a culture drifting ever further toward disconnection and away from close communities. Trusting no one and relying on yourself is a self-fulfilling prophecy in an individualistic world where the prevailing sentiment is “Do unto others before they do it to you.”

THE WORLDWIDE CONFESSIONAL

Maria, 20, says her mother’s motto is “Other people don’t have to know about the bad things that happen in the family.” Few in GenMe share that belief. Many think that confession is good for the soul, and this no longer means whispering to a priest in a dark booth. It means telling everyone about your experiences and feelings, no matter how distasteful.

When I asked my students to relate true stories for an extra credit assignment, I assured them they could tell their own story in the third person if they didn’t want me to know it was actually about them. Not one took me up on the offer; instead, I got myriad first-person stories, with names attached, about teenage sex, drug abuse, psychological disorders, ugly divorces, and family disagreements. One student wrote about losing her virginity at age 14 to a man who had only eight toes. So many students wrote candid essays about sex that I finally took it off the list of possible topics because I had more than enough stories. None of the students cared if I knew details of their personal lives that other generations would have kept as carefully guarded secrets.

This applies in spoken conversation as well. Jenny, 22, is an undergraduate at a small college in the South. When we met at a psychology conference, I asked about her career plans. Within two minutes, she was telling me about her broken engagement and how her former fiancé had been depressed. This was all done without pretense or embarrassment. In a mid-2000s survey of men, 62% of those 18 to 24 said they were comfortable discussing their personal problems with others, compared to only 37% of those age 65 and older. Many older people are amazed that young people will readily share their salary numbers with others, the disclosure of which once carried a strong taboo.

GenMe is also much more open about emotions. “In my generation, as opposed to my parents’ or my grandparents’, we are told to express our feelings and anger and sadness about our surroundings and not to hold them in,” says Ashley, 24. She’s not sure this is a good thing, however. “We are an emotionally spoiled generation. It can lead to more dramatic emotions when you are always discussing, sharing, analyzing them as our generation is led to feel they should do.”

But that’s not the message young people receive from most of the culture. Even sharing feelings that might muddle a situation is encouraged. In an episode of the teen show Dawson’s Creek, one character does not want to confess her romantic feelings to her former boyfriend, who is now dating someone else. “If it broke my heart, I have no right to say so,” she says. Her roommate can’t believe what she’s hearing. Clearly meant to be the show’s Voice of Reason, she announces, “You have the right to say anything you want when it comes to how you feel.”

TMI COMING UP!

Health issues are also the subject of much more honest and open discussion. Not that long ago, it was not acceptable to talk about health problems, particularly women’s health problems. I once asked my grandfather why he and my grandmother had had only one child. “Too expensive,” he said, though I knew he had made a good living. When I told my mother about this, she said my grandmother hadn’t been able to have any more children. I asked why. “All she ever said was that she had ‘female problems,’” my mother said. It was a term I’d heard before—for a certain generation, female problems was the closest anyone would ever come to uttering words such as breast cancer, hysterectomy, endometriosis, uterus, infertility, or even menstrual period.

These days, few people have qualms about using any of these terms, especially when talking with family or close friends—or even with total strangers. Women on Internet message boards discuss everything, and I mean everything: not just morning sickness, but miscarriages, PMS, the precise appearance of cervical fluid, the color of menstrual blood (brown or red today?), DTD (doing the deed), and BD (baby dancing) with their husbands. How often, and in what position, is also openly discussed, including any problems that might have arisen—or, sometimes, have not arisen (wink). Common phrases on these boards include TMI coming up! or Sorry if TMI! TMI, for those of you who are not GenMe, means “Too Much Information” (also called an overshare). I’m convinced the phrase was coined because there is so little that is now TMI, but we need a way to warn people before things become gross. After warning about the TMI, everyone goes ahead and posts the details anyway.

These boards are extremely helpful as they provide an enormous amount of information and support to women going through difficult life experiences. They’re wonderful things—but an earlier generation of women would never have dreamed of discussing these topics in a public forum, and maybe not even with their closest friends. We live in a much more open age. Now we have not only tampon commercials, but ads for condoms, “personal lubricants” such as K-Y jelly, and erection drugs (my favorite: the one where the guy throws the football through the hole in the tire swing. So subtle).

Young celebrities seem to love the overshare just as much as GenMe’s less famous members. “When I’m alone, I do masturbate a lot,” notes James Franco. “We have sex like Kenyan marathon runners,” boasts Olivia Wilde.

Oversharing is also the name of the game on Facebook and Twitter. Although people eventually learned that posting everything on Facebook was not the best idea (pictures of you drunk at a party = no job offer), social media still provides a much more public forum for our lives than previous generations had. A survey by babycenter.com found that half of new mothers sent text messages or updated their Facebook profiles while they were in labor—once among the most private of moments. Cell phones, with their ability to take pictures and instantly send them, provide another way to overshare: snap a nude picture of yourself to entice your crush. A recent survey found that 28% of 15- and 16-year-olds had sent a nude picture of themselves by e-mail or text—and 57% had been asked to. It’s so common it even has a clever name: sexting.

GENERATION DIRECT

GenMe’s openness extends to all kinds of communications at work and at home. Some older business managers complain that young people today are too blunt. These managers say that young employees ask for instant feedback that’s straightforward and unapologetic, and give it in return. Some managers are surprised at young people’s willingness to critique the performance of older people—it’s a combination of the eroding respect for authority and the compulsive honesty of the younger generation. In a 2009 survey, GenMe’ers were much more likely than other generations to tell their manager they were looking for another job.

Young people see their directness as an asset. In one episode of the teen soap The O.C., 16-year-old Seth makes a sarcastic comment, after which his father tells him, “Watch your mouth—I was trying to be polite. You might want to give it a try.” Seth replies, “No, thanks, I’d rather be honest.” So, to some GenMe’ers, if you’re not true to yourself and you conform to someone else’s rules, you might be seen as dishonest or a victim of peer pressure—and avoiding that is more important than being polite. For GenMe, “not being yourself” equates to being somehow less than whole and false. Kim, 21, says her mother worries too much about other people’s opinions; her mother says Kim should be ashamed when she doesn’t take care with her appearance. Kim disagrees: “She should be ashamed of herself for being fake.”

One student of mine took this principle a little too far. Aaron, 22, was the kind of student a teacher dreads—well intentioned and even sweet, but unable to keep his unorthodox opinions to himself. By the end of the term, the other students were openly hostile toward him because he interrupted the class so many times. He didn’t see things this way, however. “You might view me as a ‘rebel without a cause,’” he wrote. “But I do have a cause. It is being true to me. When I am true to myself I feel confident and content. When I am untrue to myself I feel uneasy and fake. I have to be honest with myself as well as others.” In other words, it’s more important to be true to yourself than to be liked.

Overall, GenMe appreciates directness. “The older generations are so cautious and political in the way they phrase everything that half the time I don’t know what they mean,” said one young employee. The prevalence of texting might have something to do with this: when you’re typing quickly, being blunt is easier. You also aren’t there to see the immediate reaction on the other person’s face. Smartphones are also one of the main offenders when bluntness gives way to rudeness—as you know if you’ve ever tried to have a conversation with someone who keeps looking down to text on his phone.

Even this pales in comparison to what’s done anonymously online. There, comments sections are filled with statements that often cross the line from blunt to incredibly rude. Sitting in front of their computers, commenters seem to forget that they are communicating with other people, and about other people, treating others with a complete lack of respect. For example, the new term fat-shaming describes what happens when a celebrity is photographed showing even a little too much belly or thigh—the Internet promptly lights up with overly direct statements about how she might want to lose a few pounds.

Some don’t even have to be anonymous to be cruel. Julia, 20, says, “I hate Facebook and other social networks. They have shaped my generation by making it easy to attack people and get away with it.” Jimmy Kimmel now has celebrities read “mean tweets” users have posted about them, a sign of just how widespread the phenomenon has become. Others hide behind e-mail. When psychologist Bella DePaulo was publicizing her book Singled Out, someone e-mailed her, “I love your ideas, but with a mug like that I beg of you not to reproduce. Please remain single and consider a tubal ligation just to be safe.” I don’t know if this deplorable hater was GenMe—but I do think it’s unlikely she would have said such a thing to DePaulo in person. Technology has in some ways made us meaner—or at least given us an anonymous venue for being so.

#$@%&*%$!

These days, saying anything you want often includes words you might not want to say in front of your grandmother. Whether you’re for or against this trend, swearing is clearly not the shocker it used to be. The relaxed rules against swearing reflect the same social trend as all of the other examples here—we swear because we don’t care as much about what other people think.

Sixties radicals threw around words like motherfucker because they knew it would shock the older generation. They were declaring their independence and showing that they didn’t care if people disapproved of them. Some shock value still exists, but many young people swear now just because that’s the way they talk. It proves the adage that fuck is the most versatile word in the English language, since it can be a noun, a verb, or an adjective. (Or even an adverb, as in Mr. Big’s famous line in Sex and the City: “Absofuckinglutely.”) The Google Books database bears this out: the word fuck was eight times more frequent in American books in 2008 than in 1960, shit three times as frequent, and ass four times as frequent.

The number of four-letter words now heard regularly in movies and on television—or, actually, five- and three-letter words—has caused much public hand-wringing. Network TV began allowing bitch in the 1980s, and the 1990s brought the best gift late-night comedians ever got: the ability to say ass on TV. David Letterman liked this so much he started a segment called “big-ass ham” just so he could say ass over and over. Characters on HBO and in R-rated movies utter four-letter words as if they were being paid for each usage. In December 2013, the movie The Wolf of Wall Street set a new record for f-bombs in a major motion picture: 544. People against this trend toward vulgar language often use an argument that should now sound familiar: American culture has become crude, rude, and socially unacceptable. Whatever happened to politeness and manners? Nobody cares what anyone thinks anymore. (I say @$#% them. Just kidding.)

CONFORMITY AND THE NEED FOR SOCIAL APPROVAL

Do you like to gossip sometimes? Have you ever pretended to be sick to get out of doing something? Have you ever insisted on having your own way? Before you vote, do you carefully check the qualifications of each candidate? Are you always polite? Are you always willing to admit it when you’ve made a mistake?

If you answered no to the first three questions and yes to the next three, you have a high need for social approval. You want other people to see you as a good person, and you place high value on conventional behavior. What other people think matters a lot to you.

You are also probably not a member of Generation Me. These questions are from a measure called the Marlowe-Crowne Social Desirability Scale. The scale measures a person’s need for social approval, and people who score high on it, according to the scale authors, display “polite, acceptable behavior” and follow “conventional, even stereotyped, cultural norms.” My student Charles Im and I analyzed 241 studies that gave this questionnaire to college students and children, 40,745 individuals in all.

Not surprisingly, scores on the need for social approval have slid downward since the 1950s. The average college student in 2001 scored lower than 62% of college students in 1958. Put another way, the 2001 student scored at the 38th percentile compared to his or her 1958 peers. These percentiles work just like those on standardized tests—imagine your child taking a test and scoring at the 50th percentile one time and the 38th percentile another time. You would consider her average the first time, but be fairly concerned about her slipping performance the next.

Similar results appeared on two other measures of social approval—the L and K scales of the Minnesota Multiphasic Personality Inventory. In a cross-temporal meta-analysis of 117 samples including 63,706 college students, GenMe scored lower on both scales, suggesting they were less concerned with the impression they were making and less defensive about how they would be seen. The average college student in 2007 scored lower than 79% of 1940s college students on the K scale (the 21st percentile) and lower than 62% of them on the L scale (the 38th percentile).

I also wondered if children would show the same results—was it only college students who changed, or were kids also seeking social approval less? Sure enough, the results were similar. Children ages 9 to 12 showed rapidly decreasing needs for social approval. For example, the average 1999 GenMe fifth- or sixth-grader scored at the 24th percentile, or lower than 76% of kids in the 1960s. This is an even larger change than for the college students—you would be pretty upset if your child came home with a standardized-test score in the 24th percentile. These results suggested that the decline in social approval was pervasive: even children as young as nine showed the generational trend, with kids from GenMe scoring lower than kids from earlier generations.
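To make the percentile arithmetic in the preceding paragraphs concrete: a later cohort’s average can be placed within an earlier cohort’s distribution by expressing the drop in standard-deviation units and reading off the corresponding point on a normal curve. The short Python sketch below is only an illustration of that conversion under an assumed normal distribution; the function name and the example shifts of roughly 0.31 and 0.81 standard deviations are illustrative values chosen to reproduce the 38th- and 21st-percentile figures quoted here, not effect sizes reported in the book.

    from statistics import NormalDist

    def percentile_within_earlier_group(shift_in_sds: float) -> float:
        """Percentile of the later cohort's average score within the earlier
        cohort's distribution, given the shift in standard-deviation units
        (negative = the later average is lower). Assumes normal scores."""
        return NormalDist().cdf(shift_in_sds) * 100

    # A drop of about 0.31 standard deviations lands near the 38th percentile
    # (below roughly 62% of the earlier cohort); a drop of about 0.81 lands
    # near the 21st percentile (below roughly 79%).
    print(round(percentile_within_earlier_group(-0.31)))  # ~38
    print(round(percentile_within_earlier_group(-0.81)))  # ~21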

The Baby Boomers began this trend. The data show that the need for social approval reached an all-time low between the late 1970s and the early 1980s. This is not that surprising—the Boomers practically invented youth rebellion in the 1960s. By the 1970s, the rebellion was mainstream, and the defiance of authority an accepted social value. Take the line Yippie radical Jerry Rubin used in the late 1970s—if someone called him on the phone when he was, umm, otherwise occupied, he would say honestly, “Can’t talk to you now—I’m masturbating.”

The 1980s returned society to a somewhat more conventional existence. Slowly, men cut their hair (except for Ponch and Jon on CHiPs), pant legs went from flagrantly bell-bottom to normal (at least until bell-bottoms’ resurgence around 1996), and pot smoking declined. It was not quite as necessary to rebel to fit in—which was always a rather ironic notion. GenMe turned this trend around to an extent, no longer thinking of social approval as something to be completely disdained. But the need for social approval did not even come close to the levels of the 1950s and 1960s—those days were gone forever.

A new movement dawned during the 1980s, however, a trend that GenMe would take to new heights, leaving Boomers in the dust. Generation Me believes, with a conviction that approaches boredom because it is so undisputed, that the individual comes first. It’s the trend that gives the generation its name, and I explore it in the next two chapters.

About The Author

Jean M. Twenge

Jean M. Twenge, PhD, a professor of psychology at San Diego State University, is the author of more than a hundred scientific publications and several books based on her research, including Generations, iGen, and Generation Me. Her research has been covered in Time, The Atlantic, Newsweek, The New York Times, USA TODAY, and The Washington Post. She has also been featured on Today, Good Morning America, Fox and Friends, CBS This Morning, and NPR. She lives in San Diego with her husband and three daughters.

Product Details

  • Publisher: Atria Books (September 30, 2014)
  • Length: 400 pages
  • ISBN13: 9781476755564

Raves and Reviews

"Jean Twenge is not only dedicated as a researcher and social scientist, she is clearly passionate about it. In this forward-thinking, clear-eyed book, she immediately stands out as a social critic of substance, in a world of dogmatic and chattering media pundits who are only guessing when they are 'covering' major social trends and generational changes."

– Paula Kamen, author of Feminist Fatale and Her Way: Young Women Remake the Sexual Revolution

"In this startling, witty, and refreshing book, a pioneering researcher explains how the very personality of the average American is different....Based on careful, groundbreaking research, but filled with touching and amusing stories, this book explains exactly how the American character is changing and evolving, sometimes for the better, sometimes not."

– Roy F. Baumeister, author of The Cultural Animal: Human Nature, Meaning, and Social Life and Eppes Eminent Professor of Psychology, Florida State University
