Thursday, December 24, 2009

Single-Minded

Double X recently posted an article about a study that compared psychological well-being among singletons and those already married, and found that, contrary to stereotype, most singles are just as happy and resilient as their married peers. The study, which profiled heterosexuals ages 40 to 74, left out those who were divorced or widowed, groups that normally skew the results for "singles". The article has some problems, however:
“When single people feel control over their lives and can rely on themselves, they can have especially high levels of happiness,” explains Jamila Bookwala, lead author and associate professor of psychology at Lafayette. She adds that the married people in her study who reported being highly self-sufficient weren’t happy about it, whereas single people on average felt relatively good about carrying their own weight.
Interesting how self-sufficiency is viewed in these two categories. I suspect that it is a point of pride for many singletons to be as self-sufficient as possible, partly because they have to, or want to, rely on themselves for many things; that's how the cookie crumbles, and it's often just easier to do things yourself. But when you're married, there is someone there to rely on, and you often fall naturally into a pattern of needing that person to do things and expecting that person to provide something, so when that person falls short, disappointment arises. Those who are married and highly self-sufficient may be so because they've found they can't rely on their spouse, and that causes unhappiness.

But of course, single is never an easy word to define:
It’s also not clear from the November study which single respondents had satisfying love lives but simply didn’t believe in marriage and which people preferred flying solo.
Neither of these designations is clear. What if you are single, have a satisfying love life, but do believe in marriage, and are just not ready for it? That seems like a hell of a lot of people to me. And while “prefer flying solo” is just a phrase, it’s too simplistic. Are these people who don’t want a relationship? Is this incompatible with having a satisfying love life?

The Double X piece links to a 2006 cover story from Psychology Today on the growing acceptance of, and reduced stigma toward, singles, and one psychologist actually links the current marriage craze (matrimania) to the rise of singles. With a greater percentage of households no longer headed by married couples, and with people marrying later, she posits that some people are insecure about the state of the union (and she doesn't even mention the increased prominence of homosexual marriage).

Does that go back to the idea that being single is seen as a threat to those in relationships? The idea seems laughable, but somehow it always comes roaring back. There are also still so many people (mostly lumped into the category of "relatives") who find it strange when you don't bring a love interest to the Christmas party every year. But I do wonder where this marriage glamour comes from. It's become a topic of conversation among my friends, as we see so many acquaintances pair off and announce their engagement. For many, it is a confused surprise—why settle down so early? What's the rush? I don't know if that's where the mocking originates, the idea to bum rush a David's Bridal and try on a bunch of dresses for giggles. Why not? It's an excuse to play dress up without having to pony up the cash or worry about the real things marriage signifies. But is it? I play along, because apparently once you hit your mid-20s, marriage is supposed to float into your head, and now we're being forced to think about it. Dating for a number of years? Be prepared for the questions, the assumptions, the expectations.

Of course, when thinking about “singles”, that iconic show of single women, Sex and the City, comes up. The show itself did a lot to change perceptions, but it also married off three of the four women. I’m reminded of a season six episode, “A Woman’s Right to Shoes”, which explores how society does or does not celebrate or accept a person’s personal choices:
Carrie: You know what? I am Santa. I did a little mental addition and over the years I have bought Kyra an engagement gift, a wedding gift, then there was the trip to Maine for the wedding- three baby gifts...in total I have spent over $2300.00 celebrating her life choices and she is shaming me for spending a lousy $485.00 bucks on myself? Yes, I did the math.

Charlotte: Yes, but those were gifts. And if you got married or had a baby, she would spend the same on you.

Carrie: And if I don't ever get married or have a baby, what? I get bubkiss? Think about it. If you are single, after graduation, there isn't one occasion where people celebrate you.

Charlotte: Oh! We have birthdays!

Carrie: Oh, no no no no- we all have birthdays, that's a wash. I am thinking about the single gal. Hallmark doesn't make a "congratulations you didn't marry the wrong guy" card. And where's the flatware for going on vacation alone?
Exactly. Plenty of people experience major milestones that don't fall under these traditional rubrics, but they can't throw parties at every step of the way and expect gifts. A marriage announcement can bring engagement, shower, and wedding gifts, and that's not even counting all the ancillary expenses! Many people also agree that we have an obligation to make ourselves happy, and that includes a lot of "selfish" decisions, ones that can be judged harshly by outsiders:
Even as singlehood is becoming the de facto norm, people who choose to go through life solo are deliberately kept in a state of confusion about their own motives by a culture that clings to the marriage standard. Typically, says DePaulo, singles are told that they are selfish for pursuing their own life goals. If you're single and you have a great job to which you devote energy, you're typically told your job won't love you back. Of course, singles are always suspect as tragic losers in the game of love. But most of all they are told through commercials, images and endless articles that they will never be truly happy and deeply fulfilled unless they are married.

"The battlefield is now psychological," says DePaulo. Single women today have work opportunities, economic independence and reproductive freedom. "The things that can be legislated are all done," she notes. "The last great way to keep women in their place is to remind them that they are incomplete. Even if you think you're happy, the messages go, you don't know real happiness." There's a hunger out there for a new view of singles.
Notice, of course, that the article goes from all singles to just female singles, again focusing on the women. Because it's women who want to be married, right? They're the ones we have to worry about. As friends of mine commented a few months ago, it's assumed that men will marry, but for women, you never know…the men might be a little off, but the women will be downright strange!

But for many people, being single is both a choice and not a choice. It's a choice in that a person can decide whether or not to pursue something, to set up an online profile, to ask out every person seen at a bar. But it's also not a choice in that you don't always get what you want, the person you want may be unavailable for a variety of reasons, and sometimes, there just isn't a suitable person available.

The Psychology Today article has some noteworthy stuff, although I don't agree with it completely. Nor do I completely agree with another singles "movement": Quirkyalone. The premise is basically that it's better to be without a relationship than to settle, a sentiment many people agree with in theory. It's meant to counter the stigma against singlehood perpetuated by all those people who hop from one relationship to another. But many quirkyalones, just like many people in relationships, really do believe that they don't "need" someone. Quirkyalone is a mindset, as Sasha Cagen repeatedly declares. I understand where she's coming from; I just do not like the label. Singlehood as a movement seems a bit silly to me, though I understand the points of privilege single bloggers raise, like tax rates, hotel rates, and whatnot.

A lot of the advice Psychology Today points to is rather obvious, at least to those of us who know the world. It might not always be feasible or easy to follow, but it makes sense. It's what people do; it's the natural evolution. It has always seemed sad to me that when people couple up, their social circles often narrow instead of expand. This isn't always the case, but especially with marriage, circles get smaller, because the available time must now be appropriately divided, and a smaller portion goes to friends. It's part of the soulmate culture, another dangerous idea: one person can change your life, but they can't fulfill you always and forever:
The soulmate culture insists that one person can satisfy all your emotional needs, says DePaulo. "But that's like putting all of your money in one stock and hoping it's not Enron." Marriage today forces many people to put their friendships on the back burner. Singles, on the other hand, are free to develop deeper relationships with their friends without fear that they are betraying closeness. The flip side is that singles have to be more proactive about building their social lives; it takes an effort.

"Single people are more likely to have a good relationship investment strategy. They tend to have a diversified portfolio of relationships—friends, siblings, colleagues—and to value a number of them," says DePaulo. "They have not invested their entire emotional capital in one person." Having a broad social network is physiologically as well as emotionally protective, although society perceives singles as psychologically vulnerable precisely because they lack the built-in support system of a spouse.
As I said, lots of these things just happen naturally, and they should, whether a person is single (whatever that means) or not. As more people stay unmarried, and as the psychology of happiness continues to grow as a field, there will be more studies…probably proving that what single people hate most is being forced to answer questions about coupling up.

Are There Any Teaching Jobs Left?

I know plenty of people who have received or are in the process of obtaining teaching certificates, and while I have been told forever that teachers are virtually guaranteed a job, it seems that is no longer the case:
Since last fall, school systems, state education agencies, technical schools and colleges have shed about 125,000 jobs, according to the U.S. Bureau of Labor Statistics.

At the same time, many teachers who had planned to retire or switch jobs are staying on because of the recession, and many people who have been laid off in other fields are trying to carve out second careers as teachers or applying to work as substitutes to make ends meet.

[...]

Just a few years ago, before the recession hit, several reports had projected a big shortage of teachers across a wide range of subjects over the next several years as baby boomers retired from the classroom and the strong economy lured college graduates into fields other than education.

But the nationwide demand for teachers in 60 out of 61 subjects has declined from a year earlier, according to an annual report issued this week by the American Association for Employment in Education. Only one subject — math — was listed as having an extreme shortage of teachers. In recent years, more than a dozen subjects had extreme shortages.

Plenty of these would-be teachers cannot find jobs, and I really wonder how easy it is to find a position, whether you go the alternative route, get a master's degree, or take one of the many other paths into the field. Special education is practically the only area still hiring, as University of Kansas Dean of Education Rick Ginsberg explains in the article (disclosure: he's my father's friend), but not every teacher is cut out to work with special education students.

Will this reverse in a few years, if the recession dies down and people retire? Is it only true in some areas? Rural North Dakota, for all I know, still needs teachers. But that doesn't do much good if you live on the East Coast...state requirements vary tremendously.

I suspect that there are job opportunities for those with teaching degrees, even advanced ones, at educational institutions or tutoring centers. Directors, instructors, etc...they may not be straight teaching jobs, but they are in the educational field.

WaPo Fail

There was a little incident in Washington recently. An impromptu snowball fight prompted a police officer to draw his gun. This turned into a big deal, and an even bigger one when the story was inaccurately covered by the Washington Post; their account was contradicted by other outlets and, notably, a YouTube video of the event.

The whole story is a fascinating example of the power of social networks, ingenuity, and journalism.

The Washington Post did write the "real" story a few days later, but by then it had been widely criticized for its erroneous coverage and for not having the balls to fess up to its wrongdoing and correct the record properly. The piece is pretty good, but it got lost in the shuffle among other snowstorm-related stories (especially in the print edition) and the cacophony of criticism, most notably from the Post's main competitor, the Washington City Paper:
Yet the reason why the Post screwed this up is that they all have linkophobia. If you link to an outlet---such as, God forbid, the Washington City Paper---you've lost. You got scooped and all your colleagues are going to look down on you. Linking is a huge sign of weakness---you just can't do it. Far better to, like, call a top police official, buy his version of events, and just place it in a post, regardless of the contradicting evidence that's already posted elsewhere.

Take a close look at that 10:20 update on the maybe-gun-pulling cop: "The plainclothes D.C. police detective may have unholstered his pistol during the confrontation with participants in the huge snowball fight, based on video and photos posted on the Internet."

Bold and italics are mine. They're mine because this is the most cowardly, selfish, arrogant news conduct out there today. What the fuck is "video and photos posted on the Internet"? How does that help readers? It's as if I can go to www.internet.com, and there, on the first screen, will be the video and photos of the snowball fight and the maybe-gun-wielding cop. "Posted on the Internet" would be acceptable if this were 1997.

The reporters used this hazy phrasing because they were too chicken-shit to do something that we all have learned to do over the past, say, decade or more. And that's to link to competitors and acknowledge their contributions to stories.
The tone is harsh, but it's a blog, much as Gawker serves to rip apart the New York Times. The truth is, Erik Wemple is right. How can you ignore the rest of the world? I assumed it was now common practice to link to other outlets and acknowledge the competition when necessary in covering stories. The idea, as the Times has written, is that you want to be as accurate as possible, and if that means admitting you got scooped, then so be it. You want to have all the facts, and the reporting should be as strong and fleshed out as possible. By not acknowledging other outlets, you make yourself look stupid at best and lose credibility at worst, as seen here.

Tuesday, December 22, 2009

Are Computing Jobs the Future?

This article is quite deceptive.

Sure, it focuses on the young--teenagers, to be exact. I love the idea of new jobs in computing, just like I love the idea of new jobs related to the environment, or in government. But how can I get involved? I love computers and technology, though I am far from savvy and quite behind the curve in a lot of areas, and I'd love to learn. My high school didn't have even a fraction of the opportunities listed here, and my problem is I never know what to do. How can I get one of these hybrid careers:

Hybrid careers like Dr. Halamka’s that combine computing with other fields will increasingly be the new American jobs of the future, labor experts say. In other words, the nation’s economy is going to need more cool nerds. But not enough young people are embracing computing — often because they are leery of being branded nerds.

Educators and technologists say two things need to change: the image of computing work, and computer science education in high schools.

Is computing really considered too nerdy? I’ve argued before that nerd isn’t the stigma it used to be—not with Hollywood glamorizing the term, between Spider-Man and Sheldon. Sure, most of us are more socially adept than these two characters, but neither of them embodies computing. What does need to change is the notion that computer science is too hard, too male, and dreadfully dull. Tales of mundane coding that will drive most normal people off the rails before they hit 35 are the norm, and that doesn’t bode well for recruitment.

I am all for hybrid careers. I want one. I want to be one of those hyphen people, described as writer/activist/etc/etc/etc, juggling many things and moving fluidly between interests and skills, belonging in several different environments. There are plenty of people who fit this mold, so why can’t I?

This article uses computing as a jumping-off point for those interested in a varied career path, citing examples of people who have untraditional backgrounds but hold “computing” jobs:

One goal, Ms. Cuny and others say, is to explain the steady march and broad reach of computing across the sciences, industries, culture and society. Yes, they say, the computing tools young people see and use every day — e-mail, text-messaging and Facebook — are part of the story. But so are the advances in field after field that are made possible by computing, like gene-sequencing that unlocks the mysteries of life and simulations that model climate change.

It’s seeing these simulations that really makes computing cool, the data sequenced and mapped out. Museums are great for this; I recommend the MIT Museum in Boston, especially the section from the Media Lab (I was practically drooling). You need a computer science background to get in, something that bummed me out quite a bit when I found out in high school. Data collection is so cool, especially when the results are sequenced and compared, trying to explain and extrapolate from the responses. It’s really unbelievable how they make even the most mundane data fascinating.

But it’s hard to get there:

Today, introductory courses in computer science are too often focused merely on teaching students to use software like word processing and spreadsheet programs, said Janice C. Cuny, a program director at the National Science Foundation.

Introductory programs? I’m not sure if she’s talking about Computer Science 101 in college or the local community college’s Introduction to Computers, which is meant for grandma. I believe a lot of people just need time and practice with programs in order to use them well; sometimes that necessitates a course, other times a job or project. But a lot of people don’t get that chance, or they don’t know they have that chance and it passes them by.

I’ve always liked that the Obama administration created a Chief Technology Officer position in addition to a Chief Information Officer, showing its commitment to technology, an area the previous president was not into. Nowadays, technology is so incredibly important that not having the government play a part would be an egregious mistake. Having support in this area—including advocating for electronic health records—helps ensure that our citizens can prepare for the future, and enhances our standing in the world.

Thursday, November 19, 2009

Double X is Done!

My new favorite site is folding back into Slate. Tears:
After some deliberation, we have decided to fold DoubleX back into Slate. The site will now become its own section, with our XX Factor blog, articles, and special projects already in the works. Our aim is to create a more intimate version of the community we have built, with many of the same voices and passions.

For many of you, this won't much change your experience of reading us. We will have many of the same bloggers and writers, and Hanna and Emily will continue to run the project. The decision is being made for business reasons rather than as an editorial judgment. In fact, it's the editorial quality of the site, and the way in which it so perfectly embodies the Slate DNA, that makes this a natural next step. This is a new phase, not an ending—since we came out of Slate, where we started XX Factor, it's a return to our roots.

While I had been lax in keeping up with the original XX Factor blog, I've made an effort to check Double X regularly since it launched, and I really enjoy the core group of writers. I love their focus on friendship--from the advice column to "Your Comeback" to their takes on everything from the health care bill to their own personal projects. Occasionally I wondered if they would ever run out of content, but on reflection, that's silly--there is always news to decipher, situations to parse. And men's sites recycle far more than Double X ever will.

Double X wasn't a "feminist" site, as the editors of Bitch lamented this week on their podcast, but it was a women's space--and not harsh, or overly cutesy, or any other affected attitude. They were real, but not in the "in your face" way that real usually means. They were friends, they were fun, they were smart, they were critical, and they were awesome.

I want to be Hanna Rosin or Emily Bazelon when I grow up.

P.S. Their Gabfest is also by far the best Slate podcast out there.

Wednesday, November 11, 2009

David Brooks on Government

If I were a politician trying to win back independents, I’d say something like this: When I was a kid, I had a jigsaw puzzle of the U.S. Each state was a piece, and on it there was a drawing showing what people made there. California might have movies; Washington State, apples; New York, fashion or publishing. That puzzle represented an economy that was diverse and deeply rooted.

We’ve lost that. First Wall Street got disproportionately big, then Washington. It’s time to return to fundamentals. No short-term fixes. Government should do what it’s supposed to do: schools, roads, basic research. It should not be picking C.E.O.’s or setting pay or fizzing up the economy with more debt. It should give people the tools to compete, not rig the competition. Lines of restraint have dissolved, and they need to be restored.

Independents support the party that seems most likely to establish a frame of stability and order, within which they can lead their lives. They can’t always articulate what they want, but they withdraw from any party that threatens turmoil and risk. As always, they’re looking for a safe pair of hands.

"What Independents Want"

His other recent columns have been thought-provoking, too, garnering acclaim for his take on how we frame narratives and widespread criticism for his musings on romance and modern-day technology. Don't miss his conversation with Gail Collins on last week's election, either.

Monday, November 9, 2009

“Calories should take work to access”

Ezra Klein offers another way to think about how we eat:
I'm convinced that how you lay out your kitchen changes both how you eat and how you cook. For now, I've got two main principles. First, you don't eat what you can't see, both for good and for bad. Second, calories should take work to access.

A year or so back, I was reporting out a story on the behavioral economics papers influencing the Obama administration. One of the sources for that story offhandedly mentioned a study that showed men eat far more fruits and vegetables if they're stored on the same shelf as the beer. Similarly, I've come across studies showing that storing fruits and vegetables at eye-level does more to increase consumption than subsidizing them. The irony of the crisper drawer at the base of the fridge is that it keeps produce fresh for longer, but since you also forget that the produce exists, it makes it more likely that it goes bad altogether. I've lost more produce that way than I'd like to admit. Good produce. In my fridge, fruits and vegetables go on the top drawers. The crisper area is going to get sauce overflow, or maybe bread.

Similarly, I like to snack. And I don't have much self-control, or really any self-control, when I'm around snack food. Worse, in my new place, I'll be a whole lot nearer to the fridge than I was in the large, rambling group house I previously inhabited. To keep myself from gaining a gut, I'm trying avoid storing much food that can be instantly eaten. Aside from fruits and vegetables, I'm trying to make the calories in my kitchen difficult to access: That means storing food I have to cook before it becomes edible. In my experience, the desire to not boil water is stronger than the desire to snack. That means crackers, chocolate chips, granola, and cereals are out.

In the last place, my pantry was a mess. I stored dried foods on three shelves of a fairly high, fairly deep, cabinet. Cleaning it out was a sad reminder of how much good food had disappeared beneath other foods, left to grow stiff, stale, and inedible. It was a good learning experience, though. In the new place, I had to choose between a standalone pantry or some sort of cabinet for dishes and cups. I went with a standalone pantry, as it meant I could leave it open. This one, in fact. The fact that visitors will see it ensure I'll keep it neat. The fact that I'll see it ensures I'll know what's there. At least that's the hope, anyway.

Tuesday, October 27, 2009

There's More to Life Than Marriage

I was very disappointed in the coverage of The Shriver Report, Maria Shriver and the Center for American Progress's massive investigation into the status of women today. Despite a Time cover story and a daily segment all last week on the NBC Nightly News, not to mention all the other reports, I didn't find anything said to be remotely unexpected or even all that interesting (though the charts in Time were cool). Most of it was already known, if you looked. I was more intrigued by the reports, both recently and in the spring, on women's happiness; that was where the real issues came to light, in trying to theorize and explain the results.

Today, I watched a segment from Good Morning America featuring Meghan McCain and Maria Shriver discussing the Women's Conference and, by extension, the report. Normally I would not bother, since I tend to dislike daytime television in general, and despite the banner on the GMA website proclaiming that the show has won the Daytime Emmy for Best Morning Broadcast three years in a row, I was not convinced I was going to see anything spectacular. But I was mildly interested, mainly because I like both Meghan McCain and Maria Shriver.

Both women were as expected, Meghan earnest and excited, the First Lady of California very professional, but the piece was unremarkable. Until Diane Sawyer (future World News anchor!) had to ask the dumbest question of all, following Maria Shriver’s comments on choices:
Can you have a completely fulfilled life without marrying, just as a career woman?
She then points out Meghan's age, 25, and notes that the average age of first marriage for women is 25-26, implying that Meghan, who as far as the public knows is not heading down the aisle any time soon, is facing that possibility.

What an incredibly stupid and insulting question.

Of course the answer is "of course," which is what Meghan gave—and to her credit, she answered it directly, though with a stone face.

Why is this question still being asked? Why is it only asked of women? I have never once heard someone ask a man this question. We ask men, "Do you think you'll get married?", or "Would you like to get married someday?", or "When do you think you'll get married?", always implying there's an option. But for women, it is an either/or question. Why is the assumption made that all women want marriage, that it's a good thing for any particular individual, that it is the only right choice—and that "career woman", that 1950s term, is the only alternative? Or that you can't be both? The whole point Maria Shriver had made 15 seconds before was that women can make different choices and that they should not be demonized for their decisions!

Some marriages are terrible. Some people don't want to get married. Some people shouldn't get married. People can live their lives the way they want to, and they shouldn't be forced to conform to a set of outdated standards that supposedly promise fulfillment. Asking this question, no matter how it is answered, only reaffirms the outdated thinking behind it. If the whole point of A Woman's Nation was to spotlight how women actually think, what they want, and how they live, they can start by asking some new questions.

"The Response Has Been a Collective Shrug"

David Brooks and Bob Herbert's recent conversation is about the war in Afghanistan, and they argue that for Americans to really feel they have a tangible stake in this war, there needs to be sacrifice.

President Obama has said the war in Afghanistan is a war of necessity, essential to the security of Americans here at home. If that’s the case, then I think an awful lot of us should be doing an awful lot more.

In the Second World War, those who did not serve in uniform nevertheless endured shortages of fuel, certain types of food and material goods. The nation took great collective pride in the fight against the Nazis and the Japanese. Major industries were converted to war production. Bonds were sold. Taxes were raised. There was very much a sense that “we’re in this together.”

I have always said that this needs to be true for a vast majority of Americans to really care about the war. As Brooks says, everyone supports the troops—and that's all well and good, but unless you are truly touched, feel that you are actively doing something on a sustained basis for those overseas, and have a real stake in the outcome, it's really hard to feel invested in what's happening.

Without a reason or an innate interest, people don't care. That's true of a lot of things, including politics. Healthcare reform is getting traction because it's become such a big thing, affecting everyone's wallets and choices, and those who aren't paying attention are asleep at the wheel. Why do people suddenly become invested in a subject? Often because it has directly affected them. That's why people start supporting research into a disease they'd never heard of before a friend got sick, try to quit smoking when a family member is diagnosed with lung cancer, and pay attention to school board elections when their child starts kindergarten.

Brooks offers some suggestions, one of which he calls a "civilian nation-building academy", which would train people in the various ways they could help rebuild countries. It sounds like something out of the nonprofit sector. While there has been a resurgence of coverage of community service and volunteering (cover stories in Time, a national volunteer week, high-profile broadcast campaigns), that's not the same as having a mobilized country or workforce. With high unemployment, volunteer numbers are up, but that doesn't mean it's easy to find a spot helping out for a few hours. And avenues like the Peace Corps and AmeriCorps aren't for everyone, and they reach only a relatively small number of people.

But without sacrifice, a large mobilized movement with tangible consequences that affects a great number of people, this collective shrug won’t turn into a fiery stance. We’ve seen how healthcare reform has mobilized people, gotten them to talk and to think seriously about the issue; we just need that to happen with the war.

Saturday, October 24, 2009

Why are there not more female pundits?

I've never thought of this before, at least in a conscious way.

My new favorite blogger, Brooklyn-based and Midwestern-transplant-by-way-of-DC Elizabeth Nolan Brown, tackles the question.

1) Being a political journalist/columnist, or a serious national affairs/sociocultural-type reporter/freelancer, has got to be hard (both in terms of skill level and opportunities to break in). Very hard, regardless of gender. It’s not something any writer/reporter can just do. But women, I think, have a lot more options when it comes to the range of topics, in general, they can write about and still be “journalists.” There are many, many more (paying) outlets for fashion/beauty/entertainment/sex/relationship writing than political writing. In my own fantasies of the joys/horrors of ever trying to strike it as a freelance writer, I’ve browsed through all the how-to-query sheets on media bistro, and sometimes wondered why the heck I wasn’t trying to write the fluff stuff seriously.

So for the kind of person who starts out with mild pretensions of being a serious journalist, or even just a daily news reporter, or a mildly authentic storyteller, and finds it daunting/hard/unremunerative, etc., there’s a lot easier ‘out,’ I think, for women than for men, who, for the most part, don’t have the option of writing about health/beauty/fashion/etc. It’s kind of the same psychology that I think is often under-valued when explaining why women ‘opt out’ of the workplace—work can suck! It’s sometimes hard, and sometimes boring, and for people who don’t find themselves in a perfect situation, staying home with the kids full-time can seem like a socially acceptable way to ‘fail,’ to give up—one that more men would avail themselves of, too, if they could as easily.

2) Another thing is that there are very few separate “men’s issues” in politics, or media, but there are separate “women’s issues”—things like reproductive rights, gender discrimination, the politics of motherhood, media sexism, etc., just to name a few. While these should *theoretically* be things of concern to both genders, they’re not, and I can’t entirely blame men for not taking them as seriously (while I pay attention to, say, race issues, it’s not—for better or worse—something I tend to spend much time exploring in depth or writing about or anything like that; also, why would a male writer want to carve out a niche in writing about sexism, or gender discrimination, or reproductive rights? There’s always going to be a woman writer who can claim more authenticity, and some who even feel offended by a male writing about these things, so there’s totally a disincentive for them to even consider doing so).

Women have had to carve out their own spaces in the blogosphere—places like Broadsheet, Feministe, Feministing, XX Factor, Jezebel, (Ladyblog!)—to discuss these issues, separate from the “real” political issues, like military endeavors, campaigns, taxes, etc. Again, this is understandable; there are a few Big General Political Issues, the sorts that get talked about at the major political blogs and magazines, the hard news stuff, and then all sorts of non-gendered softer stuff – education, race issues, food politics – have to carve out their own separate spaces as well. There’s nothing inherently wrong with any of this. It’s just that … well, a lot of very smart, very political women writers/bloggers/pundits are naturally going to be attracted to reading about issues that directly affect them. Which means less time keeping up with the Big General Political Issues. There are only so many hours that can be devoted to keeping up with blog conversations per day, and every minute spent reading Shakesville or the Independent Women’s Forum blog means less time that can be devoted to, say, Andrew Sullivan or Matthew Yglesias. It’s impossible to keep up with it all.

I’m not someone who’s ever had any aspirations to being a Serious Political Blogger (clearly), but as someone who does want to participate in whatever small way in the conversation, who lives in DC, who hangs out with a lot of journalists and writers, and who just generally wants to be well-informed about what’s going on … even I find it daunting. So I think, yeah, this is certainly a disadvantage for women writers/bloggers who do aspire to really be out there—either you’ve got to just do the women’s stuff, or just do the Big Political Issues, and that’s got to be a hard call to have to make. [And, again, the socialization thing, but I think women who show an interest in politics/sociology/media, etc., are still often encouraged more to focus on social issues than on horserace politics, economics, or foreign affairs.]

3) A lot of who-writes-for-where-and-about-what is driven by editors. And if an editor has two people, a man and a woman, who can write about some economic issue, but only the woman can credibly write an article about, say, the 'opt out revolution,' they're going to assign the either/or story to the guy so they can assign the women's-only story to the girl. That's certainly not sexist. But it does work against more women writing about the Serious General Political Issues.

Taken as a whole, I think women actually have many more opportunities than men to make a career out of being writers/journalists/bloggers. Just not necessarily writing about the kinds of things they may want to write about, or the kinds of things on which we place a premium as Serious Issues.

A couple of comments:

Yes, men can and do write about non-political matters: the mastheads of Esquire, Details, and Men's Fitness are made up of men. Granted, these positions are few and far between, but it's not totally out of the question. Entertainment journalism has plenty of men, too.

I especially second the point about why women opt out. Besides the fact that it's more socially acceptable for a mother to quit her job to care for children, if women have the means to do so, and work isn't that important to them, then why not? Depending upon the woman, this may not be an agonizing decision but an easy one.

But overall, very astute analysis. The blogosphere has changed things somewhat--and I still believe things will change in the future. But even up-and-comers like Dana Goldstein still mix in social issues and psychology; they just aren't the "true" pundits. Personally, I like my commentators to be as well-rounded as possible: I like them to comment on social issues, mix in a popular song or two, discuss the cultural significance (or not) of The Good Wife. And that's done by men, too.

(Hat tip Conor Friedersdorf)

Sunday, October 18, 2009

On Feminism

I just have to spotlight this wonderfully funny, very well-written piece on how Spanx illustrates the physical illusions women go through to look attractive.
But the truth is that I love glamour. I love coquettish lingerie. I also love Häagen-Dazs, and making out, and that red polka-dot swing dress I can't quite fit into right now, and comfort, and male attention, and sometimes I think the real trick of womanhood (of adulthood, probably) is toggling back and forth between those desires without losing yourself in any one. Of course I would love to be the woman who slips on that dress and looks fabulous without ancillary assistance, but let me tell you I did give that a whirl, and I looked possibly pregnant. And while the gentleman caller may or may not have cared, I know that I cared, desperately, that I would spend the whole evening at an otherwise enjoyable get-together tugging and twisting and turning at improbable angles. And so the Spanx gave me a jolt of confidence, a license to swing my hips lustily and allow strangers' eyes to linger over my body without fidgeting and land surprise make-outs with gentleman callers, and that is a pretty smashing bargain for $10 at Target. (emphasis mine)
And that's how I've come to view feminism. The more I read--through all the critiques and criticisms, the hand-wringing and the angst--the more it seems to come down to balancing, as best as possible, your own beliefs and the necessary compromises. Because feminism alone, as a theory, isn't practical, isn't sustainable in the day-to-day, and there is always a tension between, say, wanting men's attention and not wanting to want men's attention. Or in popular parlance, the Madonna-whore dichotomy. Women are largely both, just in different spheres, at different times.

Thursday, September 17, 2009

"If I want to walk into my editor’s office and tell him I think he’s a bozo, I can.”

David Carr on journalism:

Journalists, for all their self-importance, are often a little naïve about the way the real world works. Sure, being a newsie is a grind, the hours are not great and the public holds us in lower esteem than the women who work the poles at Satin Dolls down the road from the Tick Tock in Lodi, but it beats working by a mile. Every day is a caper, and most reporters are attention-deprived adrenaline junkies who care only for the next story. Journalists are like cops, hugging the job close and savoring the rest of their life as they can.

The skills of finding out what is not known and rendering it in comprehensible ways have practical value in other parts of the economy, but the thrill of this thing of ours is not a moveable feast. The difference between a reporting job and other jobs is the difference between working for The Man and being The Man, a legend, at least, in your own mind.

Wednesday, September 16, 2009

"Wave of Incivility"

From the Washington Post's "On Faith" blog, connecting the outbursts from Kanye West, Representative Joe Wilson, and Serena Williams (emphasis mine):

All of these stories are rooted in the same basic fact: speakers who think it's all about them. And if it isn't about them, they seem to think it must be about some other individual who is even more important than they are. Apparently though, it's beyond any of the offenders' ability to appreciate that civility is about all of us.

Civility is about creating a culture of mutual respect, not simply making sure that the biggest celebrity in the room has their moment. But Serena doesn't get that, and neither do Kanye or Joe. And that's why they cannot or will not offer meaningful apologies for their bad behavior.

[...]

Wilson sees the president the way West sees Taylor Swift, i.e., another star whose moment he stole. It's a personal thing, Wilson seems to think, so why bother apologizing to his colleagues? Were this attitude not so pervasive in our culture, it would be hard to believe that one could so misunderstand the moment as Rep. Wilson does.

He just doesn't get it. Wilson doesn't appreciate that House rules which ban screaming out things like, "You lie!" are not simply about protecting the man at the mic, they are about creating a culture which encourages the free exchange of ideas. When that culture goes off the rails we all suffer and that's why Joe Wilson owes his colleagues and the nation an apology.


This is just another example of how narcissistic our culture has become. Apologies have become de rigueur for any sort of gaffe, but they're usually meaningless. The offenders apologize because they have to, and are rarely contrite. Kanye's outburst was stupid, and his point--that award shows should be based on real merit--was lost. All three were disrespectful, but we are used to saying what we want in whatever forum, since we gotta express ourselves. That's our excuse; we don't mean to personally offend, you see, but we need to be heard.

Tuesday, September 15, 2009

Seriously

How is "this guy I know from Facebook" a credible source? How the hell did this get published on MSNBC???

I didn't even finish reading the article, I was so shocked.

Friday, September 4, 2009

Thursday, July 30, 2009

How Can We Reform America to Make It Healthier?

With a new health care policy in the news, and the usual talk of obesity as the nation’s number one threat, I wonder, as I drive home after another day sitting at my desk, how we can actually change American society so that we are healthier.

Aside from the constant stream of new drugs on the market, and the new movement for locally grown food, how can many Americans—sitting in front of screens all day and all night—actually change their habits? It’s one thing to constantly carp on “calories in, calories out”, that you need to eat less and exercise more, but that’s a simple statement for a complicated problem. How can American society truly change? How can we structure work, our way of life, so that this is feasible?

I’m not talking about better health insurance, less availability of junk food, or even funky workplace amenities like gyms and treadmill stations, although those things will help. But how do Americans, with their time-strapped, hectic lives, actually go about making changes?

Many workplaces now offer incentives, or wellness initiatives, to get their employees healthy. And that’s a start. But so many people get up, drive to work to sit at a desk for too many hours, then drive home and watch TV, because that’s what they need to relax. Fitting in even a 30-minute workout (it’s never just 30 minutes) is tough when you don’t have the space, it’s cold, rainy, and dark outside, and you have to make dinner and take care of the kids.

Working out during the day often isn’t feasible either, and while riding a bike to work is the hip green thing to do, it’s largely impractical for a huge number of people. There is usually, simply, too much work to do to tear yourself away from the computer for long enough, and then our own leisure reading of the news, checking email, and online banking keep us in front of the screen even longer.

People talk about changing corporate culture, but that is often very difficult to do and based on a lot of factors outside of a person’s control, especially if they are a junior employee. If everyone eats at their desk, and you don’t, it can look like you’re slacking, even if they are just checking Facebook.

A lot of the initiatives to change Americans' working habits will take a lot of time, especially if that includes redesigning the country’s transportation system. And while I am all for a reorganization of the country’s priorities regarding food subsidies, that doesn’t necessarily mean that things will change all that much (especially as fruit spoils in a vending machine). But how do we put in place things that make it easier for people to move more during the day?

The key is to make it natural, not forced, and not mandatory, because people should feel free to eat a hamburger, smoke, or sit in front of screens all day long if that is what they want. But institutional, societal changes need to happen, and in America it is often policy that pushes the rest of the country in a direction.

Even if, over time, better quality food is more equitably distributed and the country becomes less dependent on cars as a main form of transportation, we will still be captive to the screens. And yes, of course there are plenty of people who do not have to sit at a desk all day to work—teachers, construction, retail and restaurant workers, to name a few—but more and more of our jobs are sedentary, physically rote. What will happen in the future? How can we stem this tide? How can we change our environment?

I had these questions in mind when I read Megan McArdle’s interview with Paul Campos, the author of The Obesity Myth. He argues that much of the current wisdom on obesity as the country’s leading healthcare crisis is wrong, and that the focus on obesity is harmful and ineffective when it comes to reforming care. McArdle’s interview was partly prompted by a recent study in Health Affairs (with which I am familiar) that says a growing share of our healthcare costs is due to obesity; prevention is understood by many leaders to be one of the ways healthcare costs can be reduced.

At first read, a lot of what Campos says sounds blasphemous. Of course there is an obesity epidemic! How can you argue that? Just look around! Costs have soared, the rates of people on chronic medications have gone up, there's an ongoing debate about making airplane seats bigger...every day there's new evidence of how heavy and unhealthy America is.

One of his major points is the destructive mistake people make in failing to distinguish between being healthy and being thin. Despite the perception that getting thinner means getting healthier, thinness is often the product of disordered or disruptive eating, a symptom of a real problem rather than a truly desired result, and the equation of thinness with health has morphed into a true moral panic.

I wonder when this started. There’s long been a historical association, at least within Western Europe, between body type and the availability of food, as with cultural perceptions of beauty: when food was scarce, being plump and voluptuous was the height of fashion (as was being pale, since it showed that you did not have to physically toil for your livelihood), whereas now thinness is a measurement of self-control and class. “Thin” all too often equates to healthy, but many people (especially young people) have no conception that this will not last forever, and that they will eventually pay for it.

Campos is right that BMI is a flawed system, a fact that has not infiltrated the popular consciousness yet. Just like the old food pyramid, it is a distortion that is widely accepted and can actually be harmful.

Too much media coverage focuses on fitness and being healthy as losing weight, and it infiltrates down to become fact. Everyone is under constant assault about the nature of their bodies. Why are you eating that? Why are you doing this? It’s not enough/it’s too much/you’re too thin/you’re too fat/and on and on and on.

As we get older, our bodies change—as a result of age, pregnancy, stress, the environment, hormones, medication, lifestyle—and there’s only so much we can do to prevent it. It’s silly for the media to point to celebrities or athletes, because they have the resources—time and money—to afford the best care, the personal attention, the babysitter, chef, housekeeper, trainer, assistant. We can’t be Madonna, and honestly, most of us wouldn’t want to be, because we don’t want to be a slave to some figure that’s close to impossible to attain (and maintain). Even shows like "The Biggest Loser" don’t return to the contestants afterward, because it’s exceedingly difficult to go back to a normal life and sustain a major change without the help afforded to them on the show from trainers and chefs, without the unlimited time to only focus on their body and their health.

But I do question Campos on one thing about this obesity myth: What about the soaring rates of diseases like diabetes? That’s not the result of a newfangled calculation or the overprescribing of medications, and it is serious. We can argue about whether or not all those statin drugs are necessary, but I know that I do not want to be on them thirty years from now.

Public health remedies have focused on incentives to get people to adopt healthier behaviors, which is where the idea of taxing junk food and unhealthy substances like tobacco took root. But while this may work in theory, it seems counterintuitive for those windfalls to go back into merely preaching and prescribing the proper healthy behaviors, constantly reinforcing the cycle. Some smokers find this galling, targeted as they are for behaviors they choose to engage in; others welcome it as a deterrent. As the population of smokers dwindles, such taxes seem like a good bargain to the rest of us, because they don’t affect us. But widespread taxes will.

Campos returns several times to the point that the culture, with the government behind it, demonizes fat people, that the government is “abusive” when it puts these conditions in place. But we, as a society, demonize all weights. How do we stop doing that? It’s an essential part of human nature: we criticize, we gawk, we comment.

He also makes excellent points on physical activity as it’s tied to weight. While it’s true that most publicized success stories do feature lines about feeling healthier, stronger, and more energized, they’re usually accompanied by other major (physical) changes. A success story that didn’t involve much, if any, weight loss isn’t interesting, because the change wasn’t physical.

McArdle does point out that no one encourages Americans to get married, get religion, or move to the country, though that’s not quite true; it just happens on a small scale, and is nowhere near as pervasive as the confluence of weight, body image, and health.

It’s sadly true that fatness has become associated with “poverty and lack of self-control”, even though we are all powerless when it comes to certain foods. But self-control is too prized, and too tied with restrictive eating, so that it becomes less about discipline than deprivation.

And that’s where food porn comes in.

As I finished reading Megan McArdle’s post, I was lured into the living room where my father and brother were watching “Best Places to Pig Out” on the Travel Channel. They were calling for me, but since I heard “New Brunswick” and “food”, I was already on my way. The segment was on Rutgers University’s famous grease trucks. These sandwiches (which I’ve never had the opportunity to taste) are monstrous concoctions of fried foods stuffed in a sub: chicken fingers, eggs, bacon, burgers, gyro meat, mozzarella sticks, and of course, French fries. They are meant to give you indigestion, and are only supremely palatable to college students, who do not have to fear an upset stomach, which my father was getting just by watching. This was followed by The Heart Attack Grill, where burgers are named after coronary procedures and the waitresses are hot “nurses” who will wheel you out if you finish a triple or a quadruple. If you weigh over 350 lbs., your meal is free. (They have a scale.)

The restaurant was actually started by a nutritionist. But while the amount of lard used horrified me (I ate half a potato chip fried in lard last week and it tasted like industrial metal), I was more annoyed with the senior patrons, who commented on their meal by merely announcing that they would take an extra cholesterol pill that night, as if those pills would neutralize the effect a 1,600-calorie lunch has on their arteries.

These meals are okay once in a while, but these giant-sized gastro-gymnastics are a reaction to what’s seen as a crackdown on pleasurable eating (despite the many such outlets for anyone who’s interested in food). To say that Americans are getting healthier when so much evidence points to the contrary is misleading. Americans may have more pressure, more awareness of what they should do to be healthy than ever before, but they are also thwarted by both human nature and their own environment.

Thursday, July 23, 2009

The "Neg"

I’ve been reading a bunch of Conor Friedersdorf, who blogs over at True/Slant and is one of the staffers filling in for Andrew Sullivan while he’s on vacation. His post today, called out on The Daily Dish, is about the pick-up artist scene and the very controversial “neg”, a negative statement used to pick on the girl in question as a way to lower her defenses. Conor spotlights a blog a week, and he is fascinated by one Sebastian Flyte, a 23-year-old Libra who blogged regularly about his escapades picking up women.

I am familiar enough with the popularity of this scene, partly because my brother is something of a disciple. I have flipped through Neil Strauss’ The Game (known as “The Bible” to some, and it could pass for it, bound in black leather), read (and loved) I Hope They Serve Beer in Hell, and have had a few conversations on the topic. I’ve also been on the receiving end of quite a few negative statements.

Most people hearing about the technique for the first time are appalled. Of course it’s horrible! Any sort of dating trick—and the use of deception, which we all use, whether we characterize it that way or not—can be seen as terrible, immoral even. If dating is a game and everyone wants to play, of course you are out to win! Conor understands this:

I suspect that often our judgments about kosher behavior depend as much on who is involved as the specific scenario in question. A friend comes to us for advice about how to handle an awkward situation wherein she's inadvertently scheduled two dates for the same day -- and knowing she is generally an upstanding person, we laugh, sympathize, and help her formulate a solution, whereas if we were on a date with a woman who deceived us about having another date immediately following ours -- or even worse, a guy our sister was dating pulled the same stunt -- the whole moral situation would seem to us entirely different.
People tend not to flip circumstances and examine how they would behave if things were different. That’s because doing so often forces black-and-white situations into a gray zone, or undercuts the justifications for your own behavior, because you wouldn’t want to deal with this crap if it were foisted on you. But people react out of anger, spite, and selfishness, which is why so many rarely look at other angles.

But, I can sort of see why the neg works. Sometimes. People, when faced with a criticism, will often try to change it (if it can be changed), in order to prevent the issue from occurring, even if they do not like the person making the comment. The negative statement will reverberate back, insidiously creeping into our consciousness at random times. It doesn’t necessarily matter how true the comment is, or even if we disregard the statement—sometimes it comes back. If we are told we look angry, we will immediately try to soften our look, to prove the other person wrong.

Monday, July 20, 2009

I Wish I Could Write as Well as Emily Gould

"Vision of Love": If only I could write about my thoughts and experiences so they sound quietly revelatory, as hers do here; "Why I Write For Free" analyzes and critiques a situation and an essay.

When parsing advertorials, I wonder: Do those writers feel their souls dying, slowly? Or do they just hurry it along: just another paycheck, gotta get it done? Do they even care? Some people like writing press releases and advertising copy; it can be fun, if you make it so. I often made games out of silly assignments just to amuse myself.

Tuesday, July 14, 2009

Back In the Saddle Again

It is quite embarrassing that I haven’t posted anything in over two months. It seems, just looking at my output, that my enthusiasm has waned in 2009. This isn’t the case—I am a person whose thoughts on a given subject far outpace any action related to it—and it is something I am always trying to rectify.

Those I know personally who read this blog know that I suffered from overwork, exhaustion, and pains in my hands and arms that essentially forced me to stop blogging for the sake of my health. I was no real writer, as I did not sacrifice my precious down time to spend it on the computer.

But I was also deeply embarrassed by my previous comments on Twitter.

I didn’t even want to write about Twitter again, having another post tagged under it. But I feel I need to redress previous comments. Over the past two months, I grew to hate the service. It overtook my life. Companies demand to know how many tweets on a given topic are posted on a particular day, and compiling those numbers is, in a nutshell, overwhelming. I’ve read all the positive press, from Steven Berlin Johnson’s cover story in Time (which I would have known about in advance had I not been so ridiculously busy June 4, as I follow him), to, well, pretty much any mainstream story that appeared on Google News. And I am just so fucking sick of fucking Twitter.

I tried it out. It’s too short for my liking, too much information too fast, and not a reliable way to filter through. Unlike checking email and blogs and Facebook and all our other online “chores”, I didn’t want to invest the time in it, and so I didn’t. It’s like a pet—if you love animals and reap real benefit, great. But if you have no desire to spend your time and resources on it, then don’t.

I’ve also grown to dislike the way certain industries tout the service, and how it’s become a necessity for interaction, a requirement. I want to opt out! I don’t want to be forced to take part!

I had a conversation with a friend a few weeks ago about Twitter (this was before I had grown to full-on hate it, when I was still in ambivalent mode), and we both found blogging to be far more useful. Twitter is too manic for her, an assault of nonsensical, mundane thoughts strewn with links. Blogs were thoughtful, occasionally insightful, and filled with information and humor.

Of course, in the interim between this post and my last, plenty of stories have been written about this: how many bloggers have moved on to other mediums or can’t find the time, and yet, in every publication imaginable, how beneficial the service is and why you need to be on it.

People use the service for different reasons—for youngsters as a way to have private conversations online, when Facebook becomes too crowded, or to find jobs, or sources of stories—but I find it an inept social tool, and I vastly prefer forms that let me wax on, connect, and share without limits or distractions. Unfortunately, as much as I want Twitter to die a quick death, it probably won’t happen. I can hope that it becomes MySpace—passé, off-putting, occasionally worth a peek for its public properties, but otherwise an ailing media property that has cash-flow problems and is too loud for most people.
***

So what else could have been blog-worthy?

I found out about Michael Jackson’s death relatively early—a coworker blasted through, announcing it. I went on Google News immediately, found nothing but cardiac arrest, and demanded proof. “TMZ! TMZ! Check it!” Still very skeptical, I did—and was met with a three-sentence item followed by “more to come.”

So the most interesting thing for me, throughout the entire excessive coverage of this exceedingly bizarre person, was the timing and accuracy of the information: many people, myself included, didn’t believe the story until it was confirmed by more traditional outlets. As the Los Angeles Times put it,

“Few people expect TMZ or Drudge or the National Enquirer to get things right or to report on issues of substance. When they do, at least so far, it’s a bit of an anomaly. So the consequences for getting it wrong among such sites do not seem terribly high. If CNN, Fox … got such things wrong, the consequences would likely be higher.”
As much as people love to gloat over the death of the mainstream media, we still rely on it heavily for trusted information, for confirmation and access, no matter the story. Yes, our trust has eroded over the decades, each successive scandal lowering it further, but online hoaxes spread quickly, and Twitter and its ilk are just as rife with hype, rumor, and misinformation as the high school prom. Still, as much as I dislike TMZ and the ever-larger paparazzi mill, they are becoming a trusted source in their field.

Sunday, May 3, 2009

Effin’ Twitter, or The Personal Revolution

This is the second of two entries on Twitter.



“Since when has Twitter become the big thing?” my brother asked me the other week, in reference to the Ashton Kutcher/CNN “battle” that came to a head two Thursdays ago. Twitter, which has been around for a few years, was having its Best Week Ever.

I have been debating whether to join Twitter for months, even before Clive Thompson’s fantastic article last summer, which definitely put me in the “no way” camp. I have enough technology ruling my life, and I am always struck by the difference between the rat race of the internet and the slowed pace of those who just don’t give a damn. But in the last couple of months, it seemed inevitable that I would join.

I already read certain people’s Twitter feeds. It was interesting to see their thoughts on a topic, however brief, and some people were genuinely interesting. It also offered an unfiltered look at certain stars and their lives, far more real than any documentary could show, because they were the ones speaking, rather than publicists, agents, or interviewers. Many people, from Julia Allison to Emily Gould to Ashton Kutcher himself (5:00), have commented on this: the ability to write your own story, to create your own world, a historical record if you will, without others defining you. That is a real draw, to have an authentic self out there.

But in this day and age, with “authentic” and “brand” nearly always in the same sentence, one has to practically be a brand to get any traction. Job seekers are told they have to market themselves, to think of themselves as a product or service that someone needs, and that they stand for something. Twitter takes this further: each person’s tweets, an extension of themselves, make up their essence, and that essence has to be sold. Britney Spears has an account, but it is not just her, it is the Britney brand. John Mayer is John Mayer, and while some could argue he is a brand, he’s just doing his thing. Having a Twitter, like having one’s own webpage, is considered by many to be an essential part of one’s brand.

But we don't all need to be brands, and this segues into the way consumerism has infiltrated every part of our lives. Brands can evolve, but they really don't. People are constantly in flux, unformed. Much is said about the constraints of growing up online, and we see all the time how someone needs to take back something said or an image presented in the past, simply because it doesn't fit them anymore. Seeing discarded or old identities online is funny yet sad, a nostalgia instantly available. I wonder about all the digital graves I will leave in my life: email addresses and webpages discarded after they are no longer useful, friends and relationships that no longer have the glue they once did, merely a thumbnail reminder that you do, in fact, know them, that you were once someone else. I think about the future of social networks (Twitter included) all the time: Will we, as a generation, tire of Facebook and its ilk as we grow older, finding it too time-consuming? Will we get tired of being constantly connected, and will a new movement to go off the grid start? Will it merely be another aspect of the web that everyone has, like email, or will it grow into its own subculture, just another thing that some people do and others don't?

The 140-character limit, and the loss of grammar and complete thought, is another criticism of Twitter. It is hard to write compellingly in such a short space, and indeed I have had to, wincingly, use abbreviations and netspeak that I normally avoid. But like anything else, Twitter is what you make of it. I see Twitter as a place to share information. It's different from a Facebook status update in that it isn't some musing blasted out to 200 of your friends, but to a group of people you may not necessarily know, and it can be tracked and categorized so that strangers can read what you're thinking. Companies, including Twitter's founders, Biz Stone and Evan Williams, are working on monetizing this, since companies like Dunkin' Donuts and JetBlue have become success stories using the service, showing marketing and business people how to interact with their customers and drive brand loyalty. I personally do not care about such things; even links to coupons would just embolden me to spend money on things I don't need or want, and to get caught up in unnecessary chatter.

“Unnecessary chatter” is how those who denigrate the service would describe it. It is very true. Everyone wants to be listened to, but no one has the patience to listen to others. Following hundreds or thousands of people is time-consuming, sure, and the importance of the information received varies, yet we all want others to take us seriously, even if it’s just in jest. I often wonder, since I follow a lot of journalists, how the hell they manage to get any work done. I know I don’t, and I’ve been on the service for only a short time.

The hype has made a whole bunch of folks rush out and create an account, trying to see if they can figure out the service and maybe garner some love. Yet at times it's ridiculous, as Brian Williams pointed out on The Daily Show a few weeks ago:

[Embedded video: The Daily Show with Jon Stewart, Brian Williams clip]

How did John McCain go from being technologically illiterate to a functional Twitterer? How in the world did congressmen not realize that snarking on the president when he's about to give an important speech would be seen as a stupid thing? Dude, I'm already conscious that I can be found on Twitter and that if I say something wrong, it will get back to me, and I'm not an elected representative. But I made that choice, the choice to promote myself (it is very much a marketing and promotional tool), and decided, after much handwringing, to say fuck it and do it.

I agree with many, many of the arguments against Twitter, and Samantha Bee and The Daily Show, as usual, summed it up perfectly:

[Embedded video: The Daily Show with Jon Stewart, "Twitter Frenzy" clip with Samantha Bee]

Media frenzies, especially when you are somewhat involved, even peripherally, are hard to escape. But I've noticed that my enthusiasm has waned. It would never occur to me to list my boring activities for the day; those are better meant for people whom they actually interest. It is very much a broadcasting service, but not in the fashion of the witty away message that so dominates college life. Not enough people try to be witty on Twitter, and in some ways it's a great stalking tool, since people you don't know will reference who they are hanging out with and where.

Numbers-wise, Twitter isn't anything like Facebook, especially in terms of early adopters. Young people aren't flocking to Twitter; they may be getting their feet wet now, but it was mostly business and tech people who called it home at first, since they were the ones to grab onto it as a marketing platform back when we were all figuring out what the heck a newsfeed was. Young people are getting the credit, but the teenagers aren't helping us out, since they're still on Facebook and MySpace. But it's simply assumed that a technological fad must have been started by young, tech-savvy people, since those two adjectives are now best friends.

There was the little blip on the radar that Oprah now Twitters, though she made quite a faux pas on her first day (calling the twitterati "Twitters" instead of "Twitterers" and posting in all caps, the latter inexcusable), which drew a giant groan from the rest of the world, who know that Oprah = massive mainstream takeover. All the mothers who don't already blog will now be running to Twitter, the thinking goes. But it turns out that large numbers of people abandon the service within a month, and Twitter has what many see as a shockingly low retention rate of 40%. Twitter does take getting used to, and it does have a bit of a bad rap; in addition, its web site sucks, and while the idea behind Twitter is simple, mastering the language and the apps and the whole culture is confusing as hell. (Hashtags, anyone?)

While it's great for passing along information, sometimes getting too much credit as a form of new journalism, it is also ripe for misinformation. Twitter can be just another RSS feed (or a series of them), or it can be a note-taking device, a sort of journal of your world, an incredibly long, incredibly complex system of notes on your life: a version of what went down when and where, what you were willing to expose and to whom, what anguished you and enraged you and filled you with joy, hope, and laughter. What you loved and lost, cried over and found. Who you were, at any given moment in time.

If it were possible to encapsulate your life from every bit of recorded online activity, all the reminders, questions, and problems would add up to another sort of log. This scares a lot of people and excites others; it's all about who controls the information, and the limits of the controls that are placed on the user. We used to look back at the past through letters and photographs, but now we can add status and away messages to the litany of LiveJournal-like musings that take up any one of our days.

But the personal revolution of information is not just based on observations and randomness—two words that can describe the web—but on how we shape what we want to know. Facebook has predicated many of its recent redesigns on this premise, so that we get updated news reports on the Mets next to photo albums of our friends. Twitter takes this to the next level, with us following people who hand off information that we’re interested in. We’re our own personal wire service, disseminating information strictly related and of importance to ourselves. The personal revolution.

This, of course, has wide-ranging implications across all sorts of industries, from the ascent of iTunes singles to newspapers giving way to our own personal online mashup of news. Of course, it is not necessary to embrace the entire spectrum of the personal revolution; clearly, those who mock Twitter endlessly do not see it on the same continuum as picking and choosing which news sources and stories to follow. Twitter is merely another tool in today's information-gathering box.
***
As my thoughts on Twitter evolved over the past couple of months and weeks, I saw its real value shift from being about promotion to one about conversation. I could follow people all I want, read their tweets without having an account. But responding, and hoping that maybe someone you think is really cool will respond to you and maybe follow you too, is the way to engage. It sounds so incredibly cliché, and it is, but it’s about choosing to “participat[e] in a large public square...to be part of a broad dialogue,” as danah boyd points out. So yeah, maybe I did want a larger audience to be subjected to my incredibly witty observations, but I also wanted to talk to and engage with those who I thought were cool for one reason or another, to see what would happen, to have my voice heard, if only by a few people on the larger issues of the day.

So to those haters, of which I was once a part: Yes, Twitter is dumb. Yes, it is information overload. But if you acknowledge that Twitter has some real uses and has spawned real knowledge and awareness, you can't laud the service only when it fits that purpose. Meaning, you cannot have it be a source for youth protests without many of the same users chattering about how hot or cold the weather is. People tend to talk about Twitter in its extreme forms—either as a watercooler news source for stories that are just breaking, or as a way for the bored and lonely to pretend that people really care about what they are eating for breakfast. Yes, those examples exist, but the vast majority of tweets fall in between, and people are genuinely trying to connect to someone, even if it is under the ostensible guise of broadcasting to the world that you loved last night's episode of House.

Tuesday, April 28, 2009

This is why we should have diversity on the court

The only sane voice, according to Dahlia Lithwick, is Ruth Bader Ginsburg, not coincidentally the only woman on the court:

Nobody but Ginsburg seems to comprehend that the only locker rooms in which teenage girls strut around, bored but fabulous in their underwear, are to be found in porno movies. For the rest of us, the middle-school locker room was a place for hastily removing our bras without taking off our T-shirts.
I was horrified and outraged when reading this piece, seeing how complete the judges were in their lack of empathy. The administrators went on rumor, didn't think it worth calling the parents before embarking on the humiliating enterprise, and all for something that wasn't worth it in the least. Zero-tolerance policies, which took root after the Columbine school shootings a decade ago last week, were always an overreaction to anything bad a kid did, and this case just brings to light how insane and inappropriate the policy has become. Many articles discussing Savana Redding's story point out the discrepancy in the laws on the books: teenagers who willingly send provocative photos of themselves to others can be prosecuted under child pornography laws, yet it's perfectly legal for them to be forcibly strip-searched in front of strange adults for nebulous reasons, a point that is hammered on again and again because it's just so out of whack. Both situations cause emotional turmoil, but it is the school's responsibility to ensure it doesn't sanction humiliation and emotional strife at the hands of its employees.

Twitter, Celebrity, and Being Ambiently Aware

The genesis of this post is from the New York Times Magazine story on Twitter that ran in September. Most of it, including everything on Julia Allison, was written following its publication. This is part one out of a two-parter on Twitter; I felt that the second entry was incomplete without this one. I have tried to keep the spirit of my thoughts from several months ago intact, and tried not to reference anything that has since taken place.

A few weeks ago, I contemplated joining Twitter. I liked the idea of posing a question and getting responses, and it’s supposed to be great for business. But then I read “I’m So Totally, Digitally Close to You”, and said absolutely no way.

Although I've previously disparaged the service—a cross between Facebook status updates and AIM away messages—as a little too connected, a little too much work for me, I toyed with the idea. It's great for soliciting opinions and finding information, and a lot of businesses are using it this way. In certain industries, often involving the media (marketing, PR, advertising, journalism), it's touted as a way for journalists to get a feel for what's out there. Of course, at this point, I know one person on Twitter, and her account is an outgrowth of her job. It's one of those things that people will join only if others are on it. But I still grumbled, sounding like a crotchety old fogey, "Why would anyone want to constantly update their profiles every five minutes with what they're thinking or doing? Who has that kind of time?" It would also take stalking to a whole new level.

But Thompson’s piece, which discusses how social networking sites, specifically Twitter, are creating a whole new type of intimacy, made me think of the status message in a whole new way.
At the basic level, it is a version of intimacy—a version that often feels so real it's hard to remember that it's not true intimacy. Who doesn't wish for some updated profile, an away message, something, to tide you over when you want to talk to a specific person and they aren't there?

The overarching point isn't new; most people find that social sites make keeping in touch with friends far easier, a way of keeping everyone up to date. They're also great for networking, for keeping "weak ties"—those people you had class with, long-ago coworkers and neighbors—within reach.

I know that for me, online contact has made my relationships richer. In addition to blogs, texting, phone calls, and face time, I've been able to see what my friends have been thinking. That sounds like I always know everything, but it's far from the case, as very few of these channels are used frequently by any one of my friends, and they rarely overlap.

Social networking sites have been a godsend to me. As a kid, I was terrible about keeping in touch—I thought about my friends, I wanted to talk to them, but translating that to action, to write a letter or to call a home line and go through parents, was the hard part. It shouldn’t be, but the privacy of the technological revolution, of everyone having their own email address, Facebook profile, and mobile line, also made it easier to have a private conversation.

Yet the biggest question of all is the future, how our generation (and future ones) will react to having most or all of their life documented. How do you erase those memories when they are up for everyone to see? Before the pictures would be stashed in drawers or albums, if not thrown away—only looked at when stumbled upon, or necessitating a move or some cathartic curiosity. Can you ever get over anyone if you are in constant touch, if their picture or profile is so readily available? Thompson touches on this, using a very common example of a break-up.

I remember when a friend of mine officially announced her relationship on Facebook. I woke up one Sunday morning, logged on, and saw the news. It was inevitable, both that they would make it official and that one day I would receive major news via Facebook first, but I was disappointed that I hadn't been told in person before it was online for the world to see. I still feel this way, but I've realized that finding out personally first is a rarity now; the first thing most people do when they have major news (especially of a romantic variety) is broadcast it on Facebook. After all, we later realized that they had put up the notice immediately after having the conversation themselves. It's the easiest way; it saves time. Rather than telling all your close friends personally and letting the news filter through, everyone knows at more or less the same time.

But cutting off ties online isn't so easy; you cannot erase information about other people or force it to disappear just because you are angry, unlike the age-old image of ripping up an old photo. People grow and change, move on and move away…yet you are still connected, still able to follow the rough outlines of their lives without their knowing.

Sociologists call this “ambient awareness”, being aware through constant contact, but in a sort of passive way. We don’t have to actually see people in person, talk to them on the phone; we can just read their updates and “know” them. But we don’t really know them at all; even online contact with good friends is a poor substitute for real contact, as anyone who’s misunderstood an AIM message can attest.

But online interactions open one up to the world. Feeling bored, lonely, left out? Join an online community—there are millions, and at least one is guaranteed to pique your interest. Seriously. Sounds like a kind of heaven, doesn't it? People who are willing to talk to you about anything, anytime, sometimes even in real time!

People suddenly seem to have more friends. Quantifying relationships would be a depressing and frustrating exercise—who goes into what category?—but luckily, social networking sites do the heavy lifting for you. The biggest benefit to all of these new relationships is that you suddenly realize that you “know” a lot of people.

Twitter uses followers, not friends, delineating that even though these people are interested in what you say, they don't know you; you follow information about them, like a favorite star, because they are funny, but you don't know them personally. It's celebrity on an incredibly micro level.

But this constant self-disclosure, the openness about the mundanity of life, can define you. Twitter, as much as any other social-networking tool, can be used to foster your identity: essentially, to create yourself as you want others to see you, in all its trite detail.

This brings me to Julia Allison.

I first heard of Julia Allison when she made the cover of Time Out New York’s Valentine’s Day issue. She’s holding up a paper that says “Call me!” and the phone number underneath is her actual number. This fact alone got a lot of press, though apparently she was already somewhat well-known to a type of New York media/tech/web/gossip follower.

Some have called Allison the real-life, Gen-Y version of Carrie Bradshaw, others a type of Paris Hilton, since she’s essentially famous for no reason.

Allison is fascinating and repulsive at the same time, because she exhibits the exhibitionism and narcissism that are a hallmark of our generation and of the underbelly of our current culture. Her genius, as explored in the August issue of Wired, is that she marketed herself. She wanted to be famous—excuse me, a "cult figure"—so she used the tools at her disposal, mainly the web, to get it. Although she has written for AM New York and Time Out New York and is some sort of consultant for Star magazine, these are merely footnotes in her biography. What matters more is the relationships she's exploited to become famous. She's dated a lot of powerful media and tech types, and has written about this in detail in a way that is compulsively horrifying, then added commentary upon commentary on others' criticism of her relationships. It gets very meta, very confusing, in the way that is so wonderful and awful about the Internet.

As writer Jason Tanz put it, “Allison’s greatest accomplishment isn’t the volume of content she creates; it’s that she gets anyone to care about it. Her trick, she says, is to think of herself as the subject of a magazine profile, with every post or update adding dimensions to her as a character.”

Wired's piece, along with another fantastic New York Times Magazine feature, this time by Emily Gould, discusses how the Internet blurs reality: how you can get so caught up with going online that your real life outside the computer no longer feels real. It, essentially, takes over your life. The computer becomes a compulsion, a poor substitute for real human contact. It has saved and helped numerous people immeasurably, but it has also been used for much harm and pain, and we often do it to ourselves. Our little corner can get bigger and bigger, until it engulfs us, and we feel it's the entire world, the only thing that matters.

Internet hype, internet celebrity, does that. At the end of every season, turning on the TV or opening a paper, it feels like American Idol is the only thing going on, yet in a few weeks the names will have faded, and in a few years those same names will be reduced to trivia answers. Parlaying internet notoriety into something lasting is a hell of a lot harder than it sometimes seems, because the nature of the beast is that information moves fast, too fast for most people to keep up.

But that is the way of the world today, and like they said in the '60s, you can "turn on, tune in, drop out." Dropping out never seems to last for long, as both Gould and Allison can attest; each was suckered back in after vowing to keep her life private. But this break should be more accurately called a respite, since that's what it is; they never fully extricate themselves from the past they have written, and even if they did, their past is still there for anyone to find.