Shot Through the Heart

Last Thursday night, the governor of New York State and the mayor of New York City announced that the city’s first case of Ebola had been diagnosed at Bellevue Hospital. The man—a doctor who had recently returned from treating Ebola patients in West Africa—had fallen ill that morning, they said, after a night of bowling in Williamsburg. I live in Greenpoint, less than a mile from the bowling alley he had been in just twenty-four hours earlier.

Hearing this struck fear into my heart. Not because I thought there was any real risk that I would get Ebola: I trusted the information the CDC reported, that Ebola can be contracted only from a person with active symptoms, and that even casual contact with a very sick person would make my catching it relatively unlikely. I am a fairly pragmatic person, capable of talking myself through the logical ends of various what-if scenarios. I have faith in modern medicine.

The fear wasn’t about me, though: It was for my nine-month-old daughter. The what-if scenarios, though only momentary, were extreme. For just one second, it seemed absolutely certain to me that she would somehow, devastatingly, defy the odds and come down with Ebola.

A thing I have learned about myself-as-parent: When my child is involved, it takes some extra arguing with my brain for rationality to prevail.

How Amazon Solved the Problem of Work

On Wednesday, October 8th, the Supreme Court heard oral arguments in the case of Integrity Staffing Solutions v. Busk. The case pits warehouse workers Jesse Busk and Laurie Castro against their former employer. The issue at hand is time: Should minutes spent waiting to be screened at the end of the workday—Integrity manages warehouses that fulfill online shopping orders—be counted as work? If so, then shouldn’t workers be paid?

Supreme Court cases that feel ethically simple are often legally complicated; conversely, cases that make it that far and yet appear legally tidy are often ethically difficult. This case seems to fall into the former category: you have decades of opaque labor legislation through which the definition of work must be read, and in the shadow of which it must be revised; you also have a specific situation in which workers reach the end of their shifts and are then effectively detained at their workplaces for up to 25 minutes, without pay, in order to be checked for stolen merchandise.

One way to understand this post-work/pre-departure limbo is in terms of incentives: If this time counted as work, it would cost Integrity Staffing Solutions a lot of money, so the company would be motivated to minimize it. But if this extra time doesn’t count as work, there is no direct incentive to fix anything. In that situation, Integrity’s objectives are to make sure workers aren’t stealing merchandise, and to do so at the minimum possible cost. It does not need to worry about workers’ time, because that time, which is valuable to Integrity’s efforts to prevent theft, costs it virtually nothing. Meanwhile, the value of this time to the employees has not changed. They’re not home. They’re not at their other jobs. They’re not seeing friends. They are, as far as everyone else in their lives is concerned, still at work.

Hotels Are People Too

It’s increasingly hard to escape the sensation that the primary proprietors of the so-called sharing economy don’t so much share as take—from their users, from their contracted workers, and from the localities in which they operate, whose infrastructure they use without contributing to it. It’s everybody else who shares.

The New York State Attorney General’s initial report on Airbnb in New York City, which analyzed full-apartment bookings (crucially, not room shares) with the service from 2010 until this past June, feels fairly conclusive in this regard. Even if you absolutely do not care at all that, according to the attorney general, seventy-two percent of the private bookings on Airbnb are technically illegal, or that real hotel operators are losing out on hundreds of millions of dollars in bookings, or that the city has lost tens of millions of dollars in taxes to Airbnb and its hosts, it’s frankly easy, as a renter in New York City (I mean, Jesus), to feel supremely agitated that last year, more than four-and-a-half thousand apartments listed on Airbnb were booked as short-term rentals for three months of the year or more, and of those, nearly half were booked for half the year or more—meaning apartments that could and should have been on the market were largely being used as hotels. (These apartments accounted for thirty-eight percent of the revenue to Airbnb and its hosts from units booked as private short-term rentals, according to the attorney general.)

Provocations of a Bad Jingle Writer

It’s Thursday afternoon in late August. I am recording a dismal power-metal jingle for CBS Sports and the NFL. Football: a sport that should have died 65 million years ago. To record this jingle, I am using my iPhone’s GarageBand app. This isn’t composing; this is clicking. I am assembling a loop of sludgy, charmless instrumental samples. “Dark and Heavy Riff 06.” “Indie Rock Riffing 02.” “Double Punk Drumset 01.” I am 30 years old, and a songwriter. A singer-songwriter. Multi-hyphenate. But since my music is virtually unknown outside a narrow circle of Chicagoans and South American women, and since there’s about five thousand dollars left in the entire music industry, I’m also a composer for advertising.

I freelance for three agencies. Every week or so, I get an email from a music supervisor. It will start with: “we have been tapped to find the just-right song” or “we have a new spot that needs some rad music.” It will end with: “we need this in two days.” There will be a brief description of the commercial or, if I’m lucky, an attached script. Sometimes the client or advertising agency will be named. Occasionally the client will be left ambiguous. A “big box retailer.” An “automotive company.” In the early stages of an advertising campaign, either the brand, the ad agency, or, more often, the director will become eye-wateringly fixated on a pop song. This song will be used temporarily while filming. However, usually for budgetary or ego reasons, it will be unlicensable. So, a knockoff version is requested. That’s when a music agency is contacted, and I receive an email. I’m often told the music should be “almost exact to the references.” At best, this is a creative process lacking creativity. At worst, it’s plagiarism.

I’m not always asked to steal melodies from contemporary songs. Sometimes a music supervisor will allow a little creative freedom. It’s like finding a few inches of space in a feedlot. In these rare moments, the music brief will say: “looking for songs that are heartwarming in a folk/pop way” or “looking for something upbeat and happy.” Empty descriptions. Once these original, or originalish, songs are submitted, the client will request changes. “Good start, but we dig this new Black Keys song. Can we get something almost similar to that?” For Redd’s Apple Ale, I submitted several songs from my own record, Delicate Parts. My lyrics were “too challenging.” The client also wanted the word “Red” in the lyrical hook. So my words and voice, everything essential and human and exclamatory, were removed from the mix. Throwaway lines jammed with “red” were dashed off. The songs were edited into 30-second clips, and a female singer recorded over them. My music became part karaoke, part evisceration. And I permitted it.

How did this happen? How did I become a jingle man?

The Internet’s Invisible Sin-Eaters

In this month’s Wired, Adrian Chen visits the Philippines to speak with professional content moderators—the people who scrub all the dick pics and beheadings from the world’s biggest sites before they reach users’ eyes. It’s a job that, he says, “might very well comprise as much as half the total workforce for social media sites.” Sarah Roberts, a media studies scholar at the University of Western Ontario focusing on commercial content moderation, is quoted in the piece. The two caught up over chat.

AC: One thing I would have liked to include in my piece was how you got interested in studying content moderation.

SR: Well, it’s a pretty simple story. I was perusing the NYT one day and there was a very small story in the Tech section about workers in rural Iowa who were doing this content screening job. They were doing it for low wages, essentially as contractors in a call center in a place that, a couple generations ago, was populated by family farms. I call it “Farm Aid Country.” I say this as a born and raised Wisconsinite, from right next door.

So this was a pretty small piece, but it really hit me. The workers at this call center, and others like it, were looking at very troubling user-generated content (UGC) day in and day out. It was taking a toll on them psychologically, in some cases. I should say that I’ve been online for a long time (over twenty years) and, at the time I read this, was working on my Ph.D. in digital information studies. I was surrounded at all times by really smart internet geeks and scholars. So I started asking my peers and professors, “Hey, have you ever heard of this practice?” To my surprise, no one—no one—had.

This was in the summer of 2010. Right there, I knew that it wasn’t simple coincidence that no one had heard of it. It was clear to me that this was a very unglamorous and unpleasant aspect of the social media industries, and that no one involved was likely in a rush to discuss it. As I interrogated my colleagues, I realized that many of them, once they paused to think about it at all, immediately assumed that moderation of UGC must be automated. In other words, “Don’t computers/machines/robots do that?”

The New York City Tourism Association Thanks You for Visiting the Apple Store

People drop things on the Internet and run all the time. So we have to ask. In this edition, Slate Assistant Editor Miriam Krule tells us more about intergenerational information transfer at the Apple Store in Grand Central Terminal.

Working from Apple store in Grand Central where a teen is teaching old ladies how to use a comp. Tourist just walked by and took photos.

— Miriam Krule (@miriamkrule) October 8, 2014

Miriam! So what happened here?

I was heading to Connecticut to celebrate the festival of huts out in the wilderness. My ride had fallen through, so I was taking the train, but the only one that worked for various boring logistical reasons was essentially midday. My parents live in New Jersey, so even though I grew up in New York, I’ve spent very little time in Grand Central Station and didn’t exactly think things through, figuring I could work from there in the morning. I found a nice quiet corner, only to realize that there’s no magical train station Wi-Fi (coincidentally this was as news of “Wi-Fi is a human right” was blowing up). Just as I was about to cave and pay for it (aka, look for a Starbucks), I saw an Apple Wi-Fi option and basically searched for a strong connection and ended up in the Apple Store, which I had no idea existed. (For future reference, it’s on this majestic balcony overlooking the main floor. Also, it’s impossible to miss.)

Why My Baby Doesn’t Eat Animals

Having a child means that you, as a parent, wield incredible power. You can dress your baby exclusively in green, or never let her hear Simon & Garfunkel (as if) or Iggy Azalea (oops, I wish). Arguably the greatest power arrives with the introduction of “solid food” into your baby’s mouth, around the time they are six months old. I thought for a very long time, even talking it over with friends, about what Zelda’s first food should be. I was told by my doctor to start with something naturally mushy. I settled on a daily vacillation between the avocado and the banana.

Zelda didn’t want to wait until she was six months old. By the time she was four-and-a-half months old, she was trying to grab food from my hands, or off of my plate. So, one afternoon, in a less momentous fashion than I had imagined, I mashed up both an avocado and a banana and offered them to her, minutes apart. She took the spoon from me and hoisted it into her mouth herself. She made a face, but she was also “chewing” as she handed the spoon back to me for a refill. A lot of what I gave her on the spoon fell out of her mouth and onto the floor, where the dog was anxiously waiting. But Zelda clearly understood the ritual: The next day, when I fed her sweet potato, which I had peeled, steamed, and pureed, more went in—and stayed in. In less than a week, she’d been introduced to green beans, peas, carrots, and leeks (which I steamed with a small piece of potato and pureed for her).

Now, at eight months old, with just two teeth, Zelda can chomp down anything you hand over, in smallish chunks. She likes her food pureed or not, warm or not. Toast, strawberries, steamed broccoli, pasta noodles. She eats a lot, usually feeding herself, and often sharing with the dog. The one thing Zelda has never tasted, however, is an animal.

The Ten-Year Anniversary of the Time My Wedding Announcement Was Not Accepted by the Paper of Record

Margery Miller and Dan Shanoff

Margery Ilana Miller and Daniel Shanoff are to be married this evening[1] at The Plantation at Amelia Island, Florida. Rabbi David Kaiman is to officiate.[2]

1. Ten years ago today. You see: This notice was submitted to the New York Times for inclusion in its Sunday Styles “Weddings” sub-section for October 3, 2004. After not hearing anything for weeks/months leading up to the scheduled day, I opened the paper that morning earnestly hoping for the best, but instead received a wedding present of inexplicable rejection, which is clearly an off-registry gift.

2. So nice!

Mrs. Shanoff, 30, is a third-year law student at Fordham, where she is a Senior Notes and Articles Editor of the Law Review. She will begin working next fall as an associate at the law firm of Davis Polk & Wardwell[3] in New York.[4] She graduated magna cum laude from Harvard.

3. After clerking for a judge (a NYT Wedding staple detail), she eventually left the firm for a smaller one, then left that firm for a quasi-governmental regulatory group. Still a lawyer, though.

4. If we hadn’t lived in New York at the time we were getting married, I wouldn’t even have bothered submitting the announcement. We eventually moved out, as so many couples who make the Weddings cut inevitably do, because of kids, exhaustion, or some combination of the two.

Intern Deluxe: The Rise of New Media Fellowships

My first unpaid media internship was in the summer of 2010. Like most college students, I had spent previous semesters whiffing on applications, so landing one felt like a reward, regardless of pay—I’d move to New York and even have the chance to write (mostly) professionally. The “unpaid” part always loomed, but my friends and I made it work through varying levels of cost-cutting and couch-crashing. Besides, we were all believers in that age-old internship axiom: As stressful as working for free was, we’d be getting the experience and exposure needed to compete for real, paid jobs. The problem with “climbing up to minimum wage” as an employment strategy never really crossed our minds.

Unpaid internships, long a due-paying rite of passage for college students, became entrenched as a stopgap solution for employers with spots to fill but without the money to properly fill them. This was (and is) very bad. In cases where full-time work was carried out under the auspices of internship programs, it was also illegal. And, as the ways that many unpaid internships violated labor laws became common knowledge, former interns began taking their employers to court. The earliest lawsuits, filed around late 2011, challenged the argument that interns weren’t technically employees and didn’t qualify for protections like minimum wage because they were getting educational or professional benefits by being in the office. After a federal judge ruled in June 2013 that Fox Searchlight Pictures had illegally used unpaid interns on the movie Black Swan—the first major ruling against unpaid internships—a wave of lawsuits followed against media companies like Condé Nast, NBC Universal, and Gawker Media. (A similar case against the Hearst Corporation, filed in 2012, is currently under appeal.)

The media industry adapted swiftly: Slate began paying its interns in December 2013; Condé Nast shuttered its intern program entirely; and the Times ended its sub-minimum-wage internships in March. But other high-profile employers have turned to a new way to temporarily employ students or recent grads: fellowships.

Do Not Roast the Squash

Trends and memes may be on the side of fall and winter squash—I dare you to find a single vendor without some variety of pumpkin foodstuff between September and December—but I rue the transition from light, delicate, and fresh summer squash, like zucchini, to heavy, sugary, and starchy winter squash, like acorn, delicata, butternut, and, of course, pumpkin. The most common way to eat winter squash, the one I see at potlucks and on restaurant menus alike, is actually the worst: a simple PC&R (peel, cube, and roast).

This is a very good way to cook almost any vegetable, but a bad way to cook winter squash. Summer squashes are typically eaten young, while the seeds and skins are still soft and edible—even raw—whereas winter squashes have been allowed to grow to a mature stage, so they are hardier; their flesh is dense and sweet and their skin tough and sometimes warty. This makes them very resistant to winter temperatures, but their texture makes people think they can be treated like potatoes or sweet potatoes, with a PC&R. Nope.

I have tried every possible way to PC&R winter squash: I have par-boiled; I have sous-vided; I have covered it in aluminum foil; I have experimented with every possible temperature and timing and size and shape and amount of oil. My final conclusion is that there is no good way to PC&R a butternut squash or pumpkin. The pleasure of a roasted starchy vegetable is in the crispy exterior and pillowy interior, but this does not happen with winter squash—the only thing it does well in the oven is turn to mush.

None of this is to say that there are no good ways to eat winter squashes. That very tendency to turn to mush can be embraced. The squash is mush. Let it be mush. This means transforming it into soups, sauces, and purees, where the winter squash’s mushiness and heaviness become creaminess and richness. Here’s how to cook them properly.

A Field Guide to the True American Diner

Hello, I am an American from New Jersey and I care about diners.

The True American Diner is a casual sit-down restaurant that serves breakfast, lunch, and dinner—all three meals—all day, often for all twenty-four hours of it. Time has no meaning in the presence of eggs, steak and hash browns. Portions are large but not obscene; sides are available with nearly everything. The food is sturdy and simple, a few strong flavors and techniques. Nothing in a True American Diner couldn’t be made by a moderately skilled cook in their own kitchen: corned beef hash, club sandwiches, and a variety of scrambles.

Menus are oversized and presented as a single, huge laminated page with unavailable items taped over, or in a leather-bound binder. Everything in the “diet” section of the menu contains cottage cheese or is steamed. There are daily specials, and they come with soup or salad. Chicken Parmesan and mozzarella sticks must be available. Ketchup is served in bottles, not packets. Coffee is available, and drunk, at every meal; cups may even be set out on the table before patrons arrive. Refills are free and assumed to be always wanted, unless you indicate you want no more by turning the coffee cup over. Dessert is pie, and if displayed in a glass case at the end of the counter, it must rotate. We did not free ourselves from England’s cruel yoke to have static pie.