Whenever I have an idea for something funny to write on the Internet, I have to make sure that it isn’t just something I’ve subconsciously ripped off from writer/webmistress Mallory Ortberg. If there is a joke to be made about anything, chances are Mallory’s already made it, in a way both subtle and absurd that will seep into your brain and stick with you for months.
On November 4th, Henry Holt is publishing Texts from Jane Eyre—a collection based on the series in which Mallory sums up the entire canon of Western literature in a few textual exchanges with great accuracy and even greater lols. Believe it or not, Texts From was spawned on THIS VERY HERE SITE. Buy a copy, then read this interview. Or read this interview and buy a copy. Buy a copy, read this interview, then buy another copy for best results. Anything else you were planning to do today can wait. It was probably dumb anyway.
Mallory! Hi hi hi!
Are you READY? For some harrowing questions that will make you look deep within yourself? Let’s DO THIS. I’m ready to get controversial.
Last Thursday night, the governor of New York State and the mayor of New York City announced that the first case of Ebola had been diagnosed at Bellevue Hospital. The man—a doctor who had recently returned from treating Ebola patients in West Africa—had fallen ill that morning, after a night of bowling in Williamsburg, they said. I live in Greenpoint, less than a mile away from the bowling alley he had been in just twenty-four hours earlier.
Hearing this struck fear in my heart. Not because I thought there was any real risk of me getting Ebola: I trusted the information the CDC reported, that Ebola can only be contracted from a person with active symptoms, and even in cases of a very sick person coming in casual contact with me, it would be relatively hard to contract Ebola. I am a fairly pragmatic person, capable of talking myself through the logical ends of various what-if scenarios. I have faith in modern medicine.
The fear wasn’t about me, though: It was for my nine-month-old daughter. The what-if scenarios, though only momentary, were extreme. For just one second, it seemed absolutely certain to me that she would somehow, devastatingly skirt the odds and come down with Ebola.
A thing I have learned about myself-as-parent: When my child is involved, it takes some extra arguing with my brain for rationality to prevail.
In this month’s Wired, Adrian Chen visits the Philippines to speak with professional content moderators—the people who scrub all the dick pics and beheadings from the world’s biggest sites before they reach users’ eyes. It’s a job that, he says, “might very well comprise as much as half the total workforce for social media sites.” Sarah Roberts, a media studies scholar at the University of Western Ontario focusing on commercial content moderation, is quoted in the piece. They caught up over chat.
AC: One thing I would have liked to include in my piece was how you got interested in studying content moderation.
SR: Well, it’s a pretty simple story. I was perusing the NYT one day and there was a very small story in the Tech section about workers in rural Iowa who were doing this content screening job. They were doing it for low wages, essentially as contractors in a call center in a place that, a couple generations ago, was populated by family farms. I call it “Farm Aid Country.” I say this as a born and raised Wisconsinite, from right next door.
So this was a pretty small piece, but it really hit me. The workers at this call center, and others like it, were looking at very troubling user-generated content (UGC) day in and day out. It was taking a toll on them psychologically, in some cases. I should say that I’ve been online for a long time (over twenty years) and, at the time I read this, was working on my Ph.D. in digital information studies. I was surrounded at all times by really smart internet geeks and scholars. So I started asking my peers and professors, “Hey, have you ever heard of this practice?” To my surprise, no one—no one—had.
This was in the summer of 2010. Right there, I knew that it wasn’t simple coincidence that no one had heard of it. It was clear to me that this was a very unglamorous and unpleasant aspect of the social media industries and no one involved was likely in a rush to discuss it. As I interrogated my colleagues, I realized that many of them, once they were given over to think about it at all, immediately assumed that moderation tasks of UGC must be automated. In other words, “Don’t computers/machines/robots do that?”
JEN VAFIDIS: HI JANE. There is a new Taylor Swift album out today, and it is already totally undeniable. The first single is a #1 hit, the second single was #1 on iTunes within 10 minutes of its release, and Taylor has been teasing us via Instagram about these new songs for what seems like years. It’s only been a few weeks, but still. I love her, you love her, let’s talk about her.
JANE HU: When I tell people that 1989 is going to get me through the rest of 2014, I’m 100% not exaggerating. Even though the three pre-releases have really sent some MIXED SIGNALS about the feel of the album, T-Swift has never let me down before. I adore this album, but the leading track actually had me a little worried for a moment!
VAF: I hate the first song on this album, and I have a feeling you also don’t love it. But maybe I am wrong?
My first unpaid media internship was in the summer of 2010. Like most college students, I’d spent previous semesters whiffing on applications, so landing one felt like a reward, regardless of pay—I’d move to New York and even have the chance to write (mostly) professionally. The “unpaid” part always loomed, but my friends and I made it work through varying levels of cost-cutting and couch-crashing. Besides, we were all believers in that age-old internship axiom: As stressful as working for free was, we’d be getting the experience and exposure needed to compete for real, paid jobs. The problem with “climbing up to minimum wage” as an employment strategy never really crossed our minds.
Unpaid internships, long a due-paying rite of passage for college students, became entrenched as a stopgap solution for employers with spots to fill but without the money to properly fill them. This was (and is) very bad. In cases where full-time work was carried out under the auspices of internship programs, it was also illegal. And, as the ways that many unpaid internships violated labor laws became common knowledge, former interns began taking their employers to court. The earliest lawsuits, filed around late 2011, challenged the argument that interns weren’t technically employees and didn’t qualify for protections like minimum wage because they were getting educational or professional benefits by being in the office. After a federal judge ruled that Fox Searchlight Pictures was illegally using unpaid interns on the movie Black Swan in June 2013—the first major ruling against unpaid internships—a wave of lawsuits followed against media companies like Conde Nast, NBC Universal and Gawker Media. (A similar case against the Hearst Corporation, filed in 2012, is currently under appeal.)
The media industry adapted swiftly: Slate began paying its interns in December 2013; Conde Nast shuttered its intern program entirely; and the Times ended its sub-minimum wage internships in March. But other high-profile employers have turned to a new way to temporarily employ students or recent grads: fellowships.
Originally published April 30, 2012.
I am a 30-year-old woman with an arts degree and some geographic commitment issues, so for much of my adult life, I’ve been in situations where I’ve earned unimpressive amounts of money, but have needed (or wanted) to fly to places semi-regularly. As a result, I’ve become a sort of unabashed, salivating fangirl for airline miles, and something of an expert when it comes to accumulating them. I offer here a primer on how you might join me in this rewarding hobby.
Not to be a scold right off the bat, but this method involves credit cards, so it may not be for everyone. You’ll need to have good credit, and pretty high levels of self-discipline for it to work right. If you’re the type who sees access to credit as an invitation to spend recklessly, I’m sorry, but this is not for you. You know that show on TLC about “Extreme Couponing” that is both inspiring and repulsive and you don’t know whether to pity the couponers or to cheer them on? This advice is going to be kind of like that, but for airline miles, so if you’re squeamish, don’t read any further.
The first ghost story I ever heard was from my mother. She described how once, while sleeping in an upstairs bedroom in her sister’s house, she woke to the feeling of twin icicles curling around her ankles. They were hands, but she didn’t see a body, exactly. More like an abstract interpretation of a body, female, crouched at the foot of the bed. It yanked once, hard, and she opened her pink teenaged mouth and screamed, causing it to let go and vanish. The details shift uneasily when she retells this story—sometimes there is a horrible, unseasonal rainstorm beating the roof, sometimes she is 15, or 17. But these two details remain the same: The bed belonged to a dead woman, and she never went into that portion of the house again.
There’s a lot of paranormal activity in my family. Whether we have more than most other families is hard to say, but it certainly seems that way. During holidays and family events, after the adults wander into the kitchen to drink coffee or head off to bed, us cousins gather in some remote part of the house and talk about the things that go bump in the night. These are our heirlooms, a series of signals and omens that help us make sense of each other and our shared family history, which is by turns strange, mysterious, and murky. These stories open up a portal to the parts of life that don’t seem to make much sense but are still just as real as the rest of it. Over the years, I’ve come to realize that a ghost isn’t always a ghost. Sometimes, telling a ghost story is a way to talk about something else present in the air, taking up space beside you. It can also be a manifestation of intuition, or something you’ve known in your bones but haven’t yet been able to accept. But sometimes a ghost is exactly what it is—a seriously fucking scary spirit.
On Wednesday, October 8th, the Supreme Court heard oral arguments in the case of Integrity Staffing Solutions v. Busk. The case pits warehouse workers Jesse Busk and Laurie Castro against their former employer. The issue at hand is time: Should minutes spent waiting to be screened at the end of the workday—Integrity manages warehouses that fulfill online shopping orders—be counted as work? If so, then shouldn’t workers be paid?
Supreme Court cases that feel ethically simple are often legally complicated; similarly, cases that make it that far and yet appear legally tidy are often ethically difficult. This case seems to fall into the former category: you have decades of opaque labor legislation through which the definition of work must be read and in the shadow of which it must be revised; you also have a specific situation in which workers reach the end of their shifts and are then effectively detained at their workplaces for up to 25 minutes, without pay, in order to be checked for stolen merchandise.
One way to understand this post-work/pre-departure limbo is in terms of incentives: If this time counted as work, it would cost Integrity Staffing Solutions a lot of money, so Integrity Staffing Solutions would be motivated to minimize it. But if this extra time doesn’t count as work, there is no direct incentive to fix anything. In that situation, Integrity’s objectives are to make sure workers aren’t stealing merchandise, and to do so at the minimum possible cost. It does not need to worry about workers’ time, because that time, which is valuable to Integrity’s efforts to prevent theft, costs them virtually nothing. Meanwhile, the value of this time to the employees has not changed. They’re not home. They’re not at their other jobs. They’re not seeing friends. They are, as far as everyone else in their lives is concerned, still at work.
Transparent, Amazon’s foray into the Netflix-infested waters of quality internet binge watching, is deservedly the most critically lauded show of this fall television season (and was just renewed for a second season). Created by writer/director Jill Soloway (a writer and producer on Six Feet Under and The United States of Tara, and writer/director of Afternoon Delight, which won a directing award at Sundance in 2013), the show centers on the Pfefferman family, an affluent Jewish LA clan whose patriarch Mort (Jeffrey Tambor) comes out as transgender and begins to live as Maura in her late 60s.
Directed mostly by Soloway herself (three episodes are credited to Nisha Ganatra), the show’s direction is strong and incredibly consistent, marked by what Emily Nussbaum, in her New Yorker piece on the show, calls the “mildly funky pacing” of the current era of indie film/TV stylistic crossover we are seeing particularly in comedy, with shows like Girls and Louie. However, a key difference between Transparent and those other shows is that Soloway is not a character, in either physical or representational form. Rather, Soloway knows all of her characters extremely well—she knows them like family—and, in the way one knows family, she allows them to speak for themselves and expose their own flaws. She is not at all precious about her characters, and at times early in the series she can be downright misanthropic, allowing the whole ensemble (minus the consistently heartbreaking, inspiring, astonishing Maura) to tread deeper and deeper toward the brink of unlikability.
People drop things on the Internet and run all the time. So we have to ask. In this edition, Slate Assistant Editor Miriam Krule tells us more about intergenerational information transfer at the Apple store in Grand Central Terminal.
Working from Apple store in Grand Central where a teen is teaching old ladies how to use a comp. Tourist just walked by and took photos.
— Miriam Krule (@miriamkrule) October 8, 2014
Miriam! So what happened here?
I was heading to Connecticut to celebrate the festival of huts out in the wilderness. My ride had fallen through, so I was taking the train, but the only one that worked for various boring logistical reasons was essentially midday. My parents live in New Jersey, so even though I grew up in New York, I’ve spent very little time in Grand Central Station and didn’t exactly think things through, figuring I could work from there in the morning. I found a nice quiet corner, only to realize that there’s no magical train station Wi-Fi (coincidentally this was as news of “Wi-Fi is a human right” was blowing up). Just as I was about to cave and pay for it (aka, look for a Starbucks), I saw an Apple Wi-Fi option and basically searched for a strong connection and ended up in the Apple Store, which I had no idea existed. (For future reference, it’s on this majestic balcony overlooking the main floor. Also, it’s impossible to miss.)
Trends and memes may be on the side of fall and winter squash—I dare you to find a single vendor without some variety of pumpkin foodstuff between September and December—but I rue the transition from light, delicate, and fresh summer squash, like zucchini, to heavy, sugary, and starchy winter squash, like acorn, delicata, butternut, and, of course, pumpkin. The most common way to eat winter squash, the one I see at potlucks and on restaurant menus alike, is actually the worst: a simple PC&R (peel, cube, and roast).
This is a very good way to cook almost any vegetable, but a bad way to cook winter squash. Summer squashes are typically eaten young, while the seeds and skins are still soft and edible—even raw—while winter squashes have been allowed to grow to a mature stage, so they are hardier; their flesh is dense and sweet and their skin tough and sometimes warty. This makes them very resistant to winter temperatures, but their texture makes people think they can be treated like potatoes or sweet potatoes, with a PC&R. Nope.
I have tried every possible way to PC&R winter squash: I have par-boiled; I have sous-vided; I have covered it in aluminum foil; I have experimented with every possible temperature, timing, size, shape, and amount of oil. My final conclusion is that there is no good way to PC&R a butternut squash or pumpkin. The pleasure of a roasted starchy vegetable is in the crispy exterior and pillowy interior, but this does not happen to winter squash—the only thing it does well in the oven is turn to mush.
This is all not to say that there are no good ways to eat winter squashes. That very tendency to turn to mush can be embraced. The squash is mush. Let it be mush. This means transforming it into soups, sauces, and purees, where the winter squash’s mushiness and heaviness become creaminess and richness. Here’s how to cook them properly.