White really is boring

I’ll be the first to admit I haven’t been in the young adult section of a bookstore since I was about 13 and obsessed with vampire love stories. At 13, it never occurred to me that the books I loved so much, with female protagonists emblazoned on the covers, were actually whitewashed by publishing companies. Where I once saw mystery and romance, now I see rows of identical covers all featuring a white, beautiful female protagonist, even when the heroine herself is a woman of color.

Most recently, the massive publishing house Bloomsbury misrepresented protagonists of color on two different young adult books: “Liar” by Justine Larbalestier, and “Magic Under Glass” by Jaclyn Dolamore. Thankfully, Bloomsbury later re-released the novels with more accurate covers.

But it isn’t just Bloomsbury. The Book Smugglers blog lists titles from 1987 to the present that illustrate this ongoing issue.

How could publishers get something as fundamental to a character as race or ethnicity so wrong? Moreover, how has a system so racist and colorblind gone unnoticed by bibliophiles like myself for so long?

Publishers have convinced themselves and their readers that the only thing that sells is whiteness, because there just aren’t many books featuring people of color to choose from in the first place. The young adult market seems to prefer white protagonists because they are the only ones offered, not because of an aversion to or lack of demand for protagonists who are people of color. Quite the Catch-22.

What’s distressing is that the trend of whitewashing so apparent in young adult books begins at the earliest ages of reading and development.

According to a study by the Cooperative Children’s Book Center at the University of Wisconsin, of the roughly 3,200 children’s books published in 2013, only 93 were about black people.

Exposing children only to white protagonists plants the idea that a protagonist must be white in order to be successful, and that a black protagonist is the exception rather than the norm. That’s just about as white-centric as you can get.

Christopher Myers, an author of books for children and young adults, calls this paucity of black protagonists the apartheid of children’s literature.

“Characters of color are limited to the townships of occasional historical books that concern themselves with the legacies of civil rights and slavery but are never given a pass card to traverse the lands of adventure, curiosity, imagination or personal growth,” Myers wrote in a recent article for the New York Times.

Publishers come up with all sorts of reasons for featuring a majority of white protagonists. The most ridiculous reason I’ve come across is that white readers just won’t be able to relate to characters of color — which is complete crap. Young adult readers aren’t looking to identify with a character based only on their skin color. They’re more likely to identify with the personal struggles of the protagonist.

In an interview with Vanity Fair, actress Anika Noni Rose, the voice of Princess Tiana in Disney’s “The Princess and the Frog,” expressed a similar sentiment about what happens when black authors approach their editors with stories that have black protagonists.

“And why can’t you expand yourself so you relate to the humanity of a character,” she asked, “as opposed to the color of what they are?”

And what accounts for the popularity of young adult novels about supernatural creatures and other fantasy creations? Certainly not the whiteness of their heroes; the idea that white werewolves sell better than black ones is truly ridiculous.

To escape blame themselves, publishing companies pin the underrepresentation of people of color on a lack of demand or empathy from readers. But it is the publishers who perpetuate the system by limiting the market to mostly white-centric novels, not readers demanding only white characters.

While it’s unclear how deeply the trend of whitewashing extends into more adult novels, the current situation reinforces a system of power and oppression through books marketed to children and young adults who are just developing their sense of self and awareness of the world.

A lack of racially diverse protagonists systematically reinforces the notion that a hero can’t be any race other than white. Publishing companies evade blame to save face, when what they should be doing is fixing the problem by adding more diversity to the characters in young adult literature.


Normcore: pretentious style for the unstylish

If you’re looking to be “in” right now, look no further than normcore, aka style for the unstylish. Or, better yet, an incredibly pretentious nod from the fashionable to the unfashionable.

The term normcore was coined by the New York-based group K-Hole as part of a trend-forecasting report that doubled as a conceptual art piece called “Youth Mode: A Report on Freedom.” The report framed normcore as a commentary on what the group perceived to be a broader societal attitude adjustment: K-Hole concluded that the next big thing is, paradoxically, to not be the next big thing.

Normcore glorifies mom jeans, plain fleece jackets, dorky New Balance shoes and monochromatic color schemes. It’s mall clothes — things without brand names or big-name designers behind them. Apparently ’90s fashion, which should have died for good at the turn of the millennium, is now the staple for young adults trying really hard to prove that they don’t need to look cool to be cool.  

Normcore has always been an unnamed part of our culture, but before the glorification of being average, seemingly unaware normcore dressers were usually chastised for being totally fashion-inept and out of the loop. Now it’s pretending to be out of the loop that keeps you in it.

The ideology behind normcore really isn’t a bad one: Dress like everyone else and let your personality do the talking, rather than being a walking billboard for brands. The New York Times even wrote that it was a joke that became a movement. Yet, I find it hard to believe that normcorers aren’t just trying to make a pretentious statement that by rocking “normal” clothes they’ve somehow managed to become unquestionably more cool and self-aware.

If you have to go out and buy a pair of ill-fitting acid wash jeans and a mockneck, you’re not really making a statement about your individuality. You’re just buying into what’s presently being marketed as cool.

What’s worse is trying to hide a fashion statement behind unfashionable clothes. If you want to make a statement, then make it, but don’t try to trick us into thinking you’re expressing your individuality when you’re just being trendy.

Despite the cries of individuality, simplicity and personality from those who adhere to normcore, it’s just a fad, one more highly regulated notion of what’s cool and what’s not. The issue with normcore isn’t the style itself; it’s the show you have to put on to pretend to be something you’re not. You shouldn’t have to feel pressured into purchasing what’s trendy in order to make a statement, or into pretending not to be trendy to make that same statement.

There’s no individuality when everyone else is doing it, too.       

For the real normcorers out there, those of you who wore nondescript outfits before an annoying tagline existed to define you: I applaud your utter lack of style and your resistance to the notion that we have to wear what is marketed to us in order to be cool. You’re the guys who wear fanny packs because they’re useful, not to make an underhanded statement that you’re hip even though you’re pretending not to be.

To the guys going out and picking up a new pair of white sneakers similar to the ones my grandfather rocks on a daily basis, please stop. We all know you’re faking it and soon enough, hopefully, such shoes will be retired to the farthest depths of your closet in favor of something a little more stylish.

Individuality doesn’t have to be derived from your clothes or a skewed perception about the latest trend, because it’s the attitude behind what you wear that truly makes you stand out.  

Off the grid is definitely off my list

Living off the grid almost sounds appealing to me: less noise, less chaos, being one with nature and reducing my carbon footprint.

I say “almost” because I know my romanticized notion of the off-the-grid movement — popularized by films and literature such as “Into the Wild” — is seriously misconceived.

It’s not even the idea of no indoor plumbing that has me balking at the thought of living in my own cabin in the woods, Thoreau-style; I’ve been camping and I figure I could cope. It’s not the lifestyle I find so objectionable, but the mentality that accompanies it. You have to be totally dedicated to the idea of living off the grid for this lifestyle to be successful.

Yet this dedication to leaving the modern world, disconnecting from people and completely isolating yourself only works in theory. One of the central tenets of living off the grid is calling attention to our impact on the world through our continued reliance on the fossil fuels and pollutants that power our homes, cars and industries. The message: Leave the system behind, begin again and save the planet from human-caused destruction.

But why leave society altogether as a means to promote change when there are so many other active, collaborative alternatives that don’t require you to join the next generation of homesteaders?

You’re making very little impact on anyone other than yourself, and perhaps immediate family members who question your sanity. Yes, having your own solar panels and drinking out of lakes may reduce your personal impact on the world, but it also reduces the impact you could be having if you were still part of society.

Living off the grid is a fad perpetuated by people who are disillusioned with our system and style of living. According to Nick Rosen, editor of the site off-grid.net, the off-grid movement started because people had an ideological motivation to take their environmental impact into their own hands and reduce their mindless consumerism.

Rosen also asserts that after the financial crisis of 2008, more people turned to off-gridding because it was a viable means to take care of themselves when the government appeared incapable. He estimates there are roughly 2 million people living off the grid in the U.S.

Rosen has a good point. You can take care of yourself when you live off the grid, but that’s about it. It’s selfish to believe that living off the grid will make more of an impact than advocating policy change. Instead, you could work to create environmentally friendly start-ups, or even engineer new ways to make clean energy affordable to those who can’t drop out of society to pursue a personal ideology that confuses selfishness with concern for the environment.

There are plenty of people making a difference on behalf of the environment who haven’t felt the urge to drop off the face of the earth. Can you imagine if someone like Elon Musk, CEO and mastermind behind Tesla Motors, decided to live off the grid instead of working toward a social revolution? Musk is making a huge impact on the way we think about cars and the automotive industry, something he could only have accomplished as part of society.

It might be tempting to turn to the idea of off-grid living, because it does offer an interesting alternative to the rat race we all feel caught up in sometimes. However, my interest in such an alternative lifestyle dwindled when I realized that my goal in life was not to make a self-satisfied claim about my carbon footprint, but to make a tangible impact in some way or another. That’s not something that can be accomplished by living an off-the-grid lifestyle.

Withdrawing from society to chase a selfish ideology isn’t the way to change the environmental disaster we’ve gotten ourselves into. We have a collective problem that needs a collective voice to remedy it, not a fragmented, self-righteous counter-culture.

We’re Addicted to Self-Perpetuating Loneliness

A friend of mine recently posted a link to an interesting video on Facebook about the effects of Facebook and other forms of social media on the brain. The video was originally posted by Cam Lincoln on mobiledia.com, and in it he describes a phenomenon we’ve all come to subconsciously realize: “I share, therefore I am.”

As we become more and more intimate with our technology and social media platforms, it’s important we realize the things that seem to connect us are also making us lonely.

On the internet, we get to create our best selves. We spend hours crafting and editing tweets, status updates and even text messages. We can choose which Instagram filter best suits a photo and the Facebook profile picture that makes us look the best. The cost of this excessive preoccupation with creating our best selves is our connection to reality.

In a study done by the International Center for Media & the Public Agenda, college students reported feeling addicted to social media such as Facebook and even facing anxiety and depression when asked to refrain from using them. When it comes time to face reality, we’re more inclined to immerse ourselves in a world of virtual friends and perfect images, leaving us with less time to experience real life.

But we don’t get to edit real-time conversations, and we can’t go back and touch ourselves up in everyday interactions with other people. There’s an unhealthy obsession with creating and selling a false image, even if we just think it’ll make us appear more interesting. Our addiction is to the instant gratification, the validation we think we’re getting when we get “likes” on a picture.

Margie Warrell, a blogger for The Huffington Post, calls this being seduced by social media.

“They seduce us with the implicit promise that, if we get enough friends or followers or likes, we will feel truly significant in the world,” Warrell writes. And many are falling prey to this promise.

According to thenextweb.com, Facebook has 1.19 billion monthly users and 728 million daily users. More and more people are connecting with each other, but what value do these connections actually hold?

Lincoln’s video suggests that a human cannot physically and emotionally connect with more than 150 people on an intimate level, but last time I checked, my friend count was two or three times that. Intimacy is a natural part of being human, but Facebook only offers the illusion of such connections.

Our hunger for more friends and more connections draws us to sites like Facebook, but reality suggests that more virtual friends, rather than real life relationships, really aren’t the key to relieving us of our loneliness. When we add friends on Facebook, especially friends we don’t even know well in real life, we’re not seeing their profile as an honest picture of who they are.

A Facebook profile offers a limited, edited version of a persona that the person has created. And yet, we revert to Facebook time and time again to learn more about people, to get glimpses into their “personal” lives that often misrepresent who they actually are. We get hung up on details about a person’s life without realizing that we’re never privy to the whole story.

It’s the conviction that what we’re getting on Facebook is real that drives our sense of loneliness. Studies have shown that people who use Facebook more tend to be more depressed afterwards because they feel inadequate compared to their friends. On the same note, people may feel lonelier even with so many connections because they realize there’s nothing truly worthwhile or real about the friendships they only maintain through Facebook.

Internet profiles and personas may make us feel interesting and get us more likes, but they don’t establish emotional connections or lasting intimate relationships between people. We’re making ourselves lonelier by continuing to believe that real relationships can come from a simple “add friend” button and that we can really know someone just by what they post online.

Online Classes Give us Opportunity and Hopefully Fewer Boring Lectures

I dare you to try and count the number of times you have been on the Internet within the past 30 minutes. Try remembering how many times you’ve glanced down at the shiny iPhone superglued to your hand, or used that tablet to “take notes” during class.

I pose these challenges not because I resent technology — I am just as guilty of constant device use as any other college student. It’s our educators who need to recognize this trend. Many professors ban the use of laptops or tablets in class and speak longingly of the days before “eBooks” and “online homework.” But they’re missing a key piece of the puzzle: online is opportunity.

Pearson released a survey this month titled “Grade Change.” The survey examines whether we are embracing the digitalization of our higher education system by looking at the attitudes of professors and educators, rather than those of students.

Despite the wealth of resources available online, the survey points to a crowd of educators stuck in the past. Thirty-four percent of those polled said that online learning “was not critical to their long term strategy.” This is disheartening, considering the survey also reported that almost 7.1 million students in the U.S. are taking at least one online course.

The numbers show that students are embracing online education, but educators haven’t yet learned that there are more resources and greater flexibility available online than in a typical classroom.

For instance, look at the success of sites such as Khan Academy, which provides simple yet effective tutorials on everything from elementary algebra to differential equations. Students are not trapped in a 50-minute class period. They can view materials as many times as they need and engage in online discussions with their peers to solve complex problems.

If Khan doesn’t work, there are thousands of other sites and tutorials available for free online.


Moreover, online learning can be self-tailored; if you want to take a class at 3 a.m. because it’s when you’re most awake, you can. If you don’t understand a math problem, you can watch an instructional video more than once. Learning is not the same for every student, and online education provides opportunities for each student’s distinct learning style.

And yet, traditional college classes are still defined by the lecture format — a completely antiquated method of teaching that better corresponds with the era of manually catalogued library books and late nights spent poring over dusty textbooks.

Professors who still rely on this “face-to-face” method of teaching are skeptical about online learning because it creates a “disconnect” between student and teacher. But this “disconnect” is already prevalent in 400-plus person lectures.

Ashley Lykins, a pre-pharmacy freshman, said she has similar feelings about teacher-student engagement in large lecture classes.

“I think it really depends on the professor,” she said, “but for the most part they just talk at you.”

It is time for higher education in the U.S. to move forward and embrace the resources available to us through digital classrooms and online learning. If professors are worried they will lose the intimacy of a “real” classroom setting, they have yet to discover webcasting, Skype or FaceTime.

If they are worried about the efficacy of their teaching, it’s because they have yet to realize that the Internet is a massive ocean of untapped resources and educational opportunities waiting to be explored.

Online learning does not diminish the efficacy of good educators; rather, good educators are the ones who take advantage of the opportunities presented by teaching in an online setting and using digital integration in a physical one.

Our generation of students is the most plugged in to ever attend college, but this doesn’t have to pose a threat to the professor who is willing to adapt. It’s an opportunity to explore the possibilities of putting more classes online, of embracing the digital era and of reconditioning the old teaching system to meet the growing needs and demands of its students.

Lifeline Laws: the Good, the Bad, and the Preventable

It’s no secret that many UA students like to have a good time, whether it be at a kickback, house party, or even a frat bash. It’s also not a secret that copious amounts of alcohol are usually involved in such festivities. Most of the time things go great, but on a not-so-great night of binge drinking and poor decisions, things can often turn out really, really bad. What’s worse is the frightening realization that you could get into trouble simply by getting help for a friend in need.

Binge drinking is a pervasive trend among college students, and the Centers for Disease Control and Prevention reports that 11 percent of all alcohol consumed in the United States is consumed by adolescents between the ages of 12 and 20. For this group, 90 percent of that drinking takes the form of binge drinking, meaning five or more drinks in about two hours for men, or four or more in about two hours for women.

Binge drinking can result in alcohol poisoning or, if left untreated, death. According to the National Council on Alcoholism and Drug Dependence in New Jersey, the risk starts accumulating at a blood alcohol level of 0.30 and only gets worse as you drink more. Unfortunately, judging just how intoxicated you are is no easy task, especially when you factor in variables such as time, hydration and weight.

While underage drinking is not going to vanish, we can push for legislation that supports and protects students who need to report serious accidents or overdoses. Arizona currently has no such “Lifeline” legislation, but it has been successfully implemented in Indiana and Colorado. There, underage drinkers are protected from criminal prosecution for illegal possession or consumption if they call 911 and ask for medical assistance, provide their names and remain on the scene to cooperate with medical and law enforcement personnel.

When students are faced with the choice between letting a friend “sleep it off” or getting the notorious minor in possession slip we’ve all come to fear, it’s no wonder that many students would hesitate to act, further endangering the life of someone who is severely intoxicated or even blacked out.

ASU conducted a random survey of 6,000 undergraduates and 1,500 graduate students, asking what would compel them to seek medical attention for someone passed out or incoherent due to alcohol. Of those surveyed, 35.5 percent said they feared getting their friend into trouble, and 47.6 percent said they wouldn’t even know what to do. In a life-or-death situation, hesitation is the worst reaction to have.

While a Lifeline Law is not a surefire way of avoiding an MIP or other repercussions of underage drinking, it is the most effective way of protecting underage drinkers in situations of medical necessity. Cornell University reported that once its Medical Amnesty Protocol was properly in place, more people called for assistance and there was less of the fear of getting into trouble that forestalls aid.


Michael Rabbani, a freshman studying environmental sciences and business administration, readily agreed that safety must be the biggest priority for students in trouble.

He said that “by giving minors leniency in a situation like this, it would take away any hesitation they have to call authorities for help.” When you’re not scared to call for help, it’s that much easier to make a responsible and crucial decision.

Right now, Arizona law puts more lives at risk than it protects. It is clearly illegal to drink if you are not 21, but this fact is continually ignored by the pleasure-seeking party-goer. Rather than punish responsible actions, we need to protect our students with laws that will encourage them to seek help and realize that irresponsible binge drinking isn’t the way to have a good time in college.

A Lifeline Law is one of the greatest gifts we could give our students, and as a solution to an already prevalent problem, it encourages responsibility and promotes health and safety even in a crisis situation.

Sometimes Big Data isn’t Equivalent to Big Brother

We’ve all been taught that privacy is one of the finer things in life. We know leaving the bathroom door open is not socially acceptable and that keeping our ATM PINs from strangers probably decreases our risk of getting robbed. What we often don’t realize is that we have far less privacy than we imagine, especially on the Internet, thanks to something called “big data.”

Big data is the next big computational trend, providing more data to companies and advertising agencies than is humanly possible to imagine. Perhaps the easiest way to think about big data is as the trail of information about ourselves that we litter across the Internet. Think of your last Google search for “how to grow cacti in Antarctica” or the Facebook page you liked about cats in weird places. Less than 24 hours later, cats and cacti are showing up on your Facebook news feed, and Google is suggesting the best places to make such purchases, all thanks to big data.

Even tech giants like Microsoft Corp., Intel Corp. and Oracle Corp. have a difficult time describing what big data actually is, which makes it all the more tiresome for consumers like us to try and figure it out for ourselves.

But while we can’t comprehend everything about it, not all big data has to be bad data.

When you consider the amount of data a single site like Amazon or Google has access to, it’s easy to understand how people might be concerned about a “stalker economy.” In fact, according to an article by Jerry Michalski of Forbes, Facebook is valued at $100 billion because it is a veritable treasure trove of your personal information, voluntarily put online by you.

Google openly admits to collecting data from devices, like login information, location information, unique application numbers, cookies and other anonymous identifiers. Amazon has access to 152 million consumer accounts, complete with spending and viewing habits.

But while big data may seem like the next Big Brother catastrophe, it’s little more than the evolution of our consumer society.


Imagine the ways that safe, controlled harnessing of big data could benefit our everyday lives.

Kord Davis, author of “Ethics of Big Data,” argues that big data presents us with a chance to analyze and assess the human condition like never before. Can we predict economic trends earlier and with more accuracy? Can we expand participatory medicine or predict epidemics before they wreak havoc? Can we look at the data and figure out ways to improve our education? These are questions big data has the power to answer.

And, of course, big data enhances our shopping experiences. The ads you see are targeted directly to your purchasing habits, streamlining your consumption and ridding your browser of those aggressive pop-ups.

Even Netflix relies on big data aggregation. Without it, there would be no “Popular” scroll bar or suggestions tailored to your viewing history. By looking at the statistics of what and how much you watch, Netflix has created a site personally tailored to your viewing preferences. You certainly can’t say that about your now-defunct neighborhood Blockbuster.

Yes, big data is an aggregate of everything data-mining companies want to know about you, from your spending habits to your love interests, but the advertising companies that use this data never really know you.

Your habits are simply analyzed by an algorithm that treats you in the same manner as the thousands of other people who have roughly the same online habits as you.

The collection of data you willingly post on Facebook, Google, Twitter or any other site is so vast that you are merely a speck among millions, if not trillions, of other data points.

While we are accountable for protecting our sensitive data, there’s no use in worrying about data collection from companies seeking to sell you new products. Big data holds the key to consumer trends and preferences, and it pushes us to analyze and assess the human condition in new and exciting ways.

We should embrace what it has created for us so far: a streamlined, consumer-centric economy based on us and our preferences.

Olympic Women are more than Sex Symbols

With the XXII Olympic Winter Games in full swing, it’s a time for excitement and the hope that people will come together peacefully and show their passion for competitive sports. It is a time to celebrate astounding athletic talent regardless of race, nationality, religion or gender.

This year, women’s ski jumping is included in the games for the first time. In the past, many sports have been reserved for men because of ridiculous worries about the uterus falling out or being damaged. It seems like we’ve finally made progress toward gender equality in sports. Unfortunately, that’s jumping ahead of where we really are.

Female athletes should not be subject to societal norms that pressure them to choose between being feminine and being athletic, nor should they be punished if they do or don’t fit stereotypes.

For female Olympians, talent alone is not sufficient for viewers and sponsors. In order to get the endorsements they need to continue training, they still need sex to sell. But what about ability? They’re some of the greatest athletes of our time, but they’re not seen as such. Instead, they’re “sexy and athletic” women who just happen to have a gold medal or two.

As the world watches competitions like women’s figure skating, all eyes will be not on the talent the athletes display but on the amount of sex appeal they ooze in skin-tight, hyper-sexualized uniforms designed to play up all their best assets. Rather than enjoying the moment at hand, these women must worry not only about their performance but also about their appearance.

Mikaela Shiffrin, an alpine ski racer, said in an interview with “The Today Show” that she spends at least 30 minutes putting on make-up and doing her hair, because for the few moments after she takes off her helmet, all cameras are on her.

Shiffrin’s appearance in those photos or interviews can mean the difference between landing a sponsorship deal or missing it. If she were a man, there would be no question about sponsorships or her physical appearance.


Female athletes blur the line between athleticism and femininity, but their talents should not be overshadowed by their looks or by socially demanded justifications for how women compete or what they look like doing it. They’re subjected to a highly dichotomous and sexist system of being beautiful or “butch,” with athletic ability coming second. This traps women in a cycle in which they are never fully recognized for their athletic talents.

Lolo Jones, an Olympic hurdler turned bobsledder, faced a scathing attack from a New York Times journalist who felt she gained too much air time for her looks, rather than her performance.

However, if a woman is not sexual enough, she’s labeled “butch,” or her gender is even questioned and tested. Middle-distance runner Caster Semenya’s gender was publicly scrutinized during the 2012 Summer Olympics because of a gender test she took in 2009, when she did not fit the “sexy woman athlete” standard. Rather than celebrating her massive victories and multiple new records, people began circulating old rumors again.

The sexism faced by female Olympians is an unfortunate mark on a competition that is supposed to celebrate talent and personal triumph. Such norms perpetuate femininity as one of the best qualities a woman can have, and not being feminine is an obvious deviation from what society considers normal.

Title IX, passed by Congress in 1972, finally initiated the slow and painful change of such binary norms. While women are now allowed to compete, they’re still measured against the achievements of their male counterparts, and female-dominated sports never receive the same hype or publicity that male-dominated sports are entitled to. It’s not that female athletes are unworthy of the same reverence; our sexist norms just perpetuate such discrimination.

The Olympics are supposed to be a celebration of athletic talent and ability, not a double-edged sword to hurt, shame or exploit their female participants. There needs to be less focus on what women look like and more on what they do and who they are: inspirational role models and true competitors in the games.

Change.org Really isn’t all it’s Cracked up to be

In less than six years, the online petition site Change.org has become a sleek, efficient way of creating a petition and garnering online signatures to support your cause — but it’s also created a new form of lazy social activism that’s not nearly as impressive as it seems.

The company was founded by Ben Rattray, one of Time Magazine’s 100 Most Influential People of 2012. According to CNN Money, it employs 175 people across 18 countries and has touched the lives of millions more.

What Rattray has harnessed, though, is not a growing sense of activism in the international community but the power of an anonymous collective that can support just about any cause from behind a computer screen. The website boasts more than “62,534,238 people taking action” — becoming social activists in all of five minutes.

The amount of attention the website receives is undoubtedly impressive: According to Forbes, roughly 15,000 petitions are created monthly by average people who hope their five-minute creation will lead to the next big grassroots movement. Only a few, like the petition by Trayvon Martin’s mother to bring second-degree murder charges against George Zimmerman, have received massive support. The question is, though, are all of these people really passionate about change, or are they simply bored and looking to stir the pot?

While the idea of the Change.org platform is to provide an easily accessible and free place to voice your opinion and promote change, I’m just not sold. The site plays to a generation infatuated with quick returns and easy gratification that diminishes the need to actually leave your house. An online signature means nothing if all you do is sign and forget about the cause you’ve supposedly committed yourself to.

It’s all too easy to get wrapped up in the convenience of Change.org and stay safely seated in your chair, believing you’re an aspiring social activist, all for signing a few petitions.

Perhaps Martin’s petition helped bring her son’s killer to court, but in the end, it played no part in the verdict that found Zimmerman not guilty. The petition is simple and easy to add your name to, but it’s also incredibly limited, and it perpetuates a standard of lazy activism that can be logged in and out of at the click of a button.


I’m also not sold on the site’s practice of making money from advertisers who are diametrically opposed to the fundamental, liberal values it was founded on. In his CNN interview with Adam Lashinsky, Rattray describes the site as “a social good business … a business that is dedicated not to maximizing profits, but maximizing the impact we have on the world.” Rattray claims that his company isn’t just concerned about shareholders but also stakeholders, namely the impact the site has on the community, the environment and the lives of its employees.

The fact that it’s making a profit isn’t the problem; obviously, the company needs a way to pay its employees and keep things running. The problem is that some of its profit comes from corporations and conservative campaigns that don’t fit the progressive image the site touts, according to The Huffington Post. Its willingness to appeal to conservative and Republican customers, while still promising that it’s a socially liberal platform dedicated to positive public change, creates an identity crisis that pits social liberalism against the promise of good profit.

True activism comes from actively living what you believe. The point of activism is being active for a cause you are passionate about. There is power in the anonymous collective, but there’s also something infinitely more rewarding about actually living your values, rather than offhandedly signing an online petition and logging out of your social activism at the end of the day.

“Just reading the headline and signing a petition is not social activism,” said Stephanie Choi, a freshman studying English, “but people can become activists through this site by finding the petition that catches their heart and working to promote that cause on their own.”

Change.org was an impressive idea, and perhaps it’s a good way to initiate the first steps of a larger social movement. But in the end, it’s not a simple online signature but an active and passionate stance that will inspire the most change.

Give me a Break, aka Stop Being so Sensitive

Solipsism is a preoccupation with one’s own feelings, and nothing exemplifies it better than the overuse of trigger warnings on the Internet.

If you’ve surfed the web long enough, or even paid attention to Tumblr, you know that a trigger warning is often used to advise an audience of ensuing graphic content. This could be anything from eating disorders to rape, but the message is always supposed to be the same: We care about your feelings and sensitivities, so we’re here to warn you about all of the bad stuff.

Unfortunately, this doesn’t actually work the way it’s supposed to. According to newrepublic.com, trigger warnings can be provided for just about anything at the request of just about anyone. At the University of California, Santa Barbara, students recently passed a resolution to include trigger warnings on syllabi, letting students know ahead of time what kind of content they’re going to be exposed to.

Yes, it’s sensitive, but it’s also incredibly short-sighted.

Trigger warnings are problematic from the outset because they blur the line between genuine sensitivity and downright censorship. At what point do we stop having open discussions about the implications of complex topics depicted in the media and start limiting ourselves solely to our comfort zones?

Reading about rape may be especially difficult for a rape victim, but when we most need to talk about and deal with rape and raise social awareness, we’re shutting down the conversations before they’re even allowed to take place. Trigger warnings should not be an excuse to just skip over sensitive topics, especially at the collegiate level.

In another instance, students at Wellesley College protested a sculpture of a man in his underwear because it triggered thoughts of sexual assault. They demanded it be moved inside even after the artist explained that it was merely a representation of sleepwalking. A piece of art and personal expression was hidden from the rest of the student body because of triggers that had nothing to do with the intent of the piece itself. This incident isn’t just reactionary sensitivity; it’s a stifling of artistic creativity and expression, driven by the rampant solipsism of a select group of students.


While the original intent of trigger warnings was to prevent survivors with post-traumatic stress disorder from suffering panic attacks or uncomfortable flashbacks, their widespread use is far less virtuous. There is no way to know how every person is going to react to every possible scenario, and every reaction will be different. Without the ability to anticipate those reactions, we’re not doing ourselves any favors by limiting the things we read or experience in the real world. Life does not come with trigger warnings, and neither should anything else we’re exposed to.

We are becoming so preoccupied with our feelings that we forget how to learn from inflammatory or sensitive material. Novels like “The Color Purple” or “The Kite Runner,” both of which include explicit themes such as rape and domestic violence, are also incredible stories of strength and personal development.

Trigger warnings aren’t protecting us. If we were to flag every seemingly explicit scene or insinuation in every classic novel, we’d be left with a short list of culturally irrelevant books that teach us nothing about society or the realities of life. Trigger warnings stunt our growth as students and as individuals who must deal with our share of trauma. We should be pursuing open and informed discussion of these topics, not ignoring or brushing aside the issues. There is no way to address every single concern for every person.

The cure to solipsism is discourse, something we can’t have if we’re too caught up in posting trigger warnings along the way.