Bonfire of the Inanities

The story arrived in November of 1992—more than a year after the video for Nirvana’s “Smells Like Teen Spirit” premiered on MTV’s “120 Minutes.” It was nine months after the Toronto Star asked: “Why is Seattle the rock capital of the world?” It was two months after the St. Petersburg Times told everyone’s grandparents that “the scene is dead.” That’s the moment that the New York Times finally went big on grunge—a trend that reporter Rick Marin called “a musical genre, a fashion statement, a pop phenomenon.”

In “Grunge: A Success Story,” Marin summed it all up:

This generation of greasy Caucasian youths in ripped jeans, untucked flannel and stomping boots spent their formative years watching television, inhaling beer or pot, listening to old Black Sabbath albums and dreaming of the day they would trade in their air guitars for the real thing, so that they, too, could become famous rock-and-roll heroes.

But the real absurdity, Marin suggested, lay in the fact that the entire “trend” of grunge was a fabrication, and he carefully unpacked the ways in which the media had built up the story of a trend.

A Plug for the Leek

Pity the poor onion. It is ubiquitous but always in the background, a key but supporting player in nearly every savory dish. It is the bassist of the food world: without onion, food tastes tinny and lacking, but nobody really wants to eat it by itself. This is a mistake, because the onion and all its allium relatives have a flexibility that few other vegetables have: a texture that can be either crisp or luxuriously soft, a flavor that can range from pointedly savory to sugary sweet, and an unusual physical structure that can be molded into whatever shape the cook desires. Of the alliums, my favorite, an underused sandy gem of the vegetable kingdom, is the leek.

Leeks aren’t fantastically popular here, but they should be, because their flavor and texture are like a refined, grown-up version of regular onions. Their individual leaves are thin and delicate, almost like noodles, and they can add onion flavor to dishes that would simply end up tasting like onion if a typical yellow or red onion were used. That’s why they’re often used with mild primary ingredients like potatoes and eggs; they augment rather than overpower. But they have abilities far beyond the supplemental.

Leeks are ridiculously hardy plants; their season is just beginning now, and some say their flavor will peak sometime in January, when most other plants, like most New Yorkers, have given up and are just Seamlessing falafel every other night. Leeks also grow in an interesting and very artificial way: though they look like stems, the part we eat is actually a tightly curled bunch of leaves, kind of like Brussels sprouts. As they grow, to cope with the garbage winter weather they love so much, the part of the leek exposed to the elements becomes tough, hard, and inedible—so farmers have to keep topping it with soil, leaving only an inch or two of leaf exposed to the air, to maximize the amount of leek that remains underground, pale white and delicious. Interesting, right? Bring that cool fun fact up at your next party.

The Internet’s Invisible Sin-Eaters

In this month’s Wired, Adrian Chen visits the Philippines to speak with professional content moderators—the people who scrub all the dick pics and beheadings from the world’s biggest sites before they reach users’ eyes. It’s a job that, he says, “might very well comprise as much as half the total workforce for social media sites.” Sarah Roberts, a media studies scholar at the University of Western Ontario focusing on commercial content moderation, is quoted in the piece. They caught up over chat.

AC: One thing I would have liked to include in my piece was how you got interested in studying content moderation.

SR: Well, it’s a pretty simple story. I was perusing the NYT one day and there was a very small story in the Tech section about workers in rural Iowa who were doing this content screening job. They were doing it for low wages, essentially as contractors in a call center in a place that, a couple generations ago, was populated by family farms. I call it “Farm Aid Country.” I say this as a born and raised Wisconsinite, from right next door.

So this was a pretty small piece, but it really hit me. The workers at this call center, and others like it, were looking at very troubling user-generated content (UGC) day in and day out. It was taking a toll on them psychologically, in some cases. I should say that I’ve been online for a long time (over twenty years) and, at the time I read this, was working on my Ph.D. in digital information studies. I was surrounded at all times by really smart internet geeks and scholars. So I started asking my peers and professors, “Hey, have you ever heard of this practice?” To my surprise, no one—no one—had.

This was in the summer of 2010. Right there, I knew that it wasn’t simple coincidence that no one had heard of it. It was clear to me that this was a very unglamorous and unpleasant aspect of the social media industries and no one involved was likely in a rush to discuss it. As I interrogated my colleagues, I realized that many of them, once they were given over to think about it at all, immediately assumed that moderation tasks of UGC must be automated. In other words, “Don’t computers/machines/robots do that?”

The New York City Tourism Association Thanks You for Visiting the Apple Store

People drop things on the Internet and run all the time. So we have to ask. In this edition, Slate Assistant Editor Miriam Krule tells us more about intergenerational information transfer at the Apple store in Grand Central Terminal.

Working from Apple store in Grand Central where a teen is teaching old ladies how to use a comp. Tourist just walked by and took photos.

— Miriam Krule (@miriamkrule) October 8, 2014

Miriam! So what happened here?

I was heading to Connecticut to celebrate the festival of huts out in the wilderness. My ride had fallen through, so I was taking the train, but the only one that worked for various boring logistical reasons was essentially midday. My parents live in New Jersey, so even though I grew up in New York, I’ve spent very little time in Grand Central Station and didn’t exactly think things through, figuring I could work from there in the morning. I found a nice quiet corner, only to realize that there’s no magical train station Wi-Fi (coincidentally this was as news of “Wi-Fi is a human right” was blowing up). Just as I was about to cave and pay for it (aka, look for a Starbucks), I saw an Apple Wi-Fi option and basically searched for a strong connection and ended up in the Apple Store, which I had no idea existed. (For future reference, it’s on this majestic balcony overlooking the main floor. Also, it’s impossible to miss.)

The Shopping Games

Eat Spinach, Not Kale

Recently, the general public, especially younger people in the cities, has begun to embrace strong flavors previously thought of as icky, like bitterness, fermentation, funk, fat, and umami, all of which are now prized. This is good. But Americans, as always, are unable to do anything in moderation, and, hypnotized by the constant racket of food television, food blogs, restaurant blogs, and have-you-tried-this, insist that if strong flavors can be good, then even stronger flavors must be better. This is why we can’t have a hoppy IPA; we have to have the hoppiest quadruple-IPA science can concoct. We can’t have a normal bowl of chili; we have to bump up the savory flavor with umami-heavy ingredients like Marmite, soy sauce, and anchovies, and who cares if those flavors work together? And we can’t use spinach anymore, because there are greens that are stronger and more bitter, and thus better, like kale. Eating spinach is something your parents would do. Eating kale—stringy, bitter, aggressive kale—is the mark of an adventurous, flavor-forward connoisseur.

Intern Deluxe: The Rise of New Media Fellowships

My first unpaid media internship was in the summer of 2010. Like most college students, I’d spent previous semesters whiffing on applications, so landing one felt like a reward, regardless of pay—I’d move to New York and even have the chance to write (mostly) professionally. The “unpaid” part always loomed, but my friends and I made it work through varying levels of cost-cutting and couch-crashing. Besides, we were all believers in that age-old internship axiom: As stressful as working for free was, we’d be getting the experience and exposure needed to compete for real, paid jobs. The problem with “climbing up to minimum wage” as an employment strategy never really crossed our minds.

Unpaid internships, long a dues-paying rite of passage for college students, became entrenched as a stopgap solution for employers with spots to fill but without the money to properly fill them. This was (and is) very bad. In cases where full-time work was carried out under the auspices of internship programs, it was also illegal. And, as the ways that many unpaid internships violated labor laws became common knowledge, former interns began taking their employers to court. The earliest lawsuits, filed around late 2011, challenged the argument that interns weren’t technically employees and didn’t qualify for protections like minimum wage because they were getting educational or professional benefits by being in the office. After a federal judge ruled in June 2013 that Fox Searchlight Pictures had illegally used unpaid interns on the movie Black Swan—the first major ruling against unpaid internships—a wave of lawsuits followed against media companies like Conde Nast, NBC Universal and Gawker Media. (A similar case against the Hearst Corporation, filed in 2012, is currently under appeal.)

The media industry adapted swiftly: Slate began paying its interns in December 2013; Conde Nast shuttered its intern program entirely; and the Times ended its sub-minimum wage internships in March. But other high-profile employers have turned to a new way to temporarily employ students or recent grads: fellowships.

Do Not Roast the Squash

Trends and memes may be on the side of fall and winter squash—I dare you to find a single vendor without some variety of pumpkin foodstuff between September and December—but I rue the transition from light, delicate, and fresh summer squash, like zucchini, to heavy, sugary, and starchy winter squash, like acorn, delicata, butternut, and, of course, pumpkin. The most common way to eat winter squash, the one I see at potlucks and on restaurant menus alike, is actually the worst: a simple PC&R (peel, cube, and roast).

This is a very good way to cook almost any vegetable, but a bad way to cook winter squash. Summer squashes are typically eaten young, while the seeds and skins are still soft and edible—even raw—while winter squashes have been allowed to grow to a mature stage, so they are hardier; their flesh is dense and sweet and their skin tough and sometimes warty. This makes them very resistant to winter temperatures, but their texture makes people think they can be treated like potatoes or sweet potatoes, with a PC&R. Nope.

I have tried every possible way to PC&R winter squash: I have par-boiled; I have sous-vided; I have covered it in aluminum foil; I have experimented with every possible temperature, timing, size, shape, and amount of oil. My final conclusion is that there is no good way to PC&R a butternut squash or pumpkin. The pleasure of a roasted starchy vegetable is in the crispy exterior and pillowy interior, but this does not happen with winter squash—the only thing it does well in the oven is turn to mush.

None of this is to say that there are no good ways to eat winter squashes. That very tendency to turn to mush can be embraced. The squash is mush. Let it be mush. This means transforming it into soups, sauces, and purees, where the winter squash’s mushiness and heaviness become creaminess and richness. Here’s how to cook them properly.

Perfume Genius’s Tacoma Sadcore

I once drove Bea Arthur to a radio interview in my Honda Civic, and reveled in the fact that I had her (good) ear for forty-five minutes. She didn’t appreciate it when I asked if she had been part of vaudeville; apparently my years were way off.

I opted out of personally driving the celebrity I was interviewing this time, a musician who some would argue is just as gender-confounding as Ms. Arthur. I selected an UberBlack (that’s their “high-end sedan”) to drive me and Mike Hadreas, AKA Perfume Genius, to the Chateau Marmont, the most clichéd celebrity interview spot in Los Angeles. Something about placing an unassuming homegrown artist like Hadreas in that absurd environment appealed to me. It didn’t fit Hadreas, but it might one day. Last week, he made his first appearance on Letterman, performing his hit single “Queen,” which Slate named the gay anthem of the year. (I also had him make a Grindr profile, above.)

Hadreas asked if we would see Lana Del Rey at the Chateau; she had just played at the Hollywood Forever Cemetery two nights earlier. I mentioned that Del Rey’s music has been referred to as “Hollywood Sadcore,” which one MTV journalist described as “what you get when you cross a woman who looks like a ’60s Playboy bunny with a song that sounds a little bit like Chris Isaak’s ‘Wicked Games’ sung through a PJ Harvey/Lykke Li filter…” How might one describe Hadreas? Perhaps what you get when you cross a man who looks like a boy who dresses like a female executive with songs that sound like longing, despair, and, most recently, power. That’s Perfume Genius’s Tacoma Sadcore.

You live in Washington State but you chose to record Too Bright in Bristol, England. You also love British musicians like Kate Bush and PJ Harvey. What is it about Britain?

Shot Through the Heart

Last Thursday night, the governor of New York State and the mayor of New York City announced that the first case of Ebola had been diagnosed at Bellevue Hospital. The man—a doctor who had recently returned from treating Ebola patients in West Africa—had fallen ill that morning, after a night of bowling in Williamsburg, they said. I live in Greenpoint, less than a mile away from the bowling alley he had been in just twenty-four hours earlier.

Hearing this struck fear in my heart. Not because I thought there was any real risk of me getting Ebola: I trusted the information the CDC reported, that Ebola can only be contracted from a person with active symptoms, and that even if a very sick person came into casual contact with me, it would be relatively hard to contract. I am a fairly pragmatic person, capable of talking myself through the logical ends of various what-if scenarios. I have faith in modern medicine.

The fear wasn’t about me, though: It was for my nine-month-old daughter. The what-if scenarios, though only momentary, were extreme. For just one second, it seemed absolutely certain to me that she would somehow, devastatingly, skirt the odds and come down with Ebola.

A thing I have learned about myself-as-parent: When my child is involved, it takes some extra arguing with my brain for rationality to prevail.

How Amazon Solved the Problem of Work

On Wednesday, October 8th, the Supreme Court heard oral arguments in the case of Integrity Staffing Solutions v. Busk. The case pits warehouse workers Jesse Busk and Laurie Castro against their former employer. The issue at hand is time: Should minutes spent waiting to be screened at the end of the workday—Integrity manages warehouses that fulfill online shopping orders—be counted as work? If so, then shouldn’t workers be paid?

Supreme Court cases that feel ethically simple are often legally complicated; similarly, cases that make it that far and yet appear legally tidy are often ethically difficult. This case seems to fall into the former category: you have decades of opaque labor legislation through which the definition of work must be read and in the shadow of which it must be revised; you also have a specific situation in which workers reach the end of their shifts and are then effectively detained at their workplaces for up to 25 minutes, without pay, in order to be checked for stolen merchandise.

One way to understand this post-work/pre-departure limbo is in terms of incentives: If this time counted as work, it would cost Integrity Staffing Solutions a lot of money, so Integrity Staffing Solutions would be motivated to minimize it. But if this extra time doesn’t count as work, there is no direct incentive to fix anything. In that situation, Integrity’s objectives are to make sure workers aren’t stealing merchandise, and to do so at the minimum possible cost. It does not need to worry about workers’ time, because that time, which is valuable to Integrity’s efforts to prevent theft, costs it virtually nothing. Meanwhile, the value of this time to the employees has not changed. They’re not home. They’re not at their other jobs. They’re not seeing friends. They are, as far as everyone else in their lives is concerned, still at work.