Julian Baggini writes in a thoughtful essay that high-end restaurants in the United Kingdom have thrown out the idea of “artisan” espresso and bought Nespresso machines, which use factory-sealed capsules of precision-ground coffee and can be operated with the push of a button. In fact, as Baggini discovered in a blind taste test, Nespresso is consistently better, or at least more consistently good, than “artisan” espresso made by hand. But, he asks, is a cup of coffee just a cup of coffee — just the momentary pleasure it gives us, a mere utilitarian instrument? Or is it something more — the sum of its relationships to other things?
Suppose you want to eat less processed food. Given how and what most Americans eat, that impulse is probably a good one. But once you go beyond the obvious (cheese curls, sugar cereal, hot dogs), you find yourself down the rabbit hole. What about that bottle of salad dressing you use to perk up your unprocessed salad? Is hot sauce okay? What boxed cereal can you eat? You start squinting over ingredient lists, blocking the grocery aisle with your empty cart. You accept an invitation to a potluck and sit horror-struck by the potential dangers lurking in the dishes, feeling your appetite slipping away like blue cheese dressing off a greasy wing. To bolster your flagging courage, you read endless blog posts about why the things you’ve given up are killing other people’s children. You develop an evangelical zeal, gnawed by the fear that your friends will make fun of you the moment you step out of the room. You begin to wonder if you should get new friends.
And then you throw up your hands and dive into a bag of Doritos.
Now, I am the last person to advocate eating most of the food available in American supermarkets. I make my own jam and pickles, I bake bread, I cook practically every meal from scratch, I shop at farmers’ markets. After twenty years of living and eating like this, industrially processed food no longer really tastes like food. Forget health concerns; it just isn’t particularly satisfying.
But having lived this way for twenty years — and having put a great deal of thought into it during that time, and having done a lot of research on how foods were historically prepared — I’m painfully aware that any notion of purity about this business is foolishness. Cooking is, after all, processing, and humans have been doing that for what, fifty thousand years? We’ve been grinding grain into meal for five thousand years, and we’ve been processing and selling food commercially (mainly as grain, oil, and spices) for probably four thousand. I can, if I try, justify the natural origins of practically any edible substance — or find fault with the freshest of fruits. (What the heck is “food-grade wax”?)
Obviously, any sane and sensible person is going to draw a line somewhere. But any line we draw will to some extent be arbitrary; any principle we set will inevitably include some things that seem thoroughly unnatural and exclude others we can’t manage without. I’m going to consider some possible standards, suggest an alternative that’s (you won’t be surprised to learn) largely historical, show how difficult it is to apply even that comparatively objective standard — and then draw some conclusions about navigating this mess sensibly. It’s a long piece, but hit-and-run easy answers are exactly what we need to avoid.
An article in last month’s National Geographic examines the loss of genetic diversity in the world’s crops, and this infographic, in particular, has been making the rounds of the Internet, at least in the corners where foodies and activists lurk. It shows the decline in diversity of common American garden vegetables between 1903 and 1983: more than 90 percent of the varieties in existence at the turn of the twentieth century are now long gone. That loss of diversity has consequences beyond our inability to sample the flavor of a long-lost apple: with so little genetic stock available, changes in climate or a new disease might easily wipe out an entire crop, such as wheat, and we’d have no way to rebuild it.
It’s a lovely graphic, well designed and (if you aren’t already familiar with the issue) appropriately shocking. Like too many such graphics, though, this one doesn’t inspire much beyond despair. What can I, or anybody, do about it? The accompanying article gives the answer: I don’t have to do anything, because there are institutional “seed banks” working to preserve the genetic stock still remaining on the world’s farms. I’ve been shocked and then duly comforted; no need to get out of my reading chair. Let the experts handle it.
Except that this isn’t the right answer, or at least isn’t enough of one. Seed banks, valuable and worthwhile as they are, can only preserve the remaining — let’s say, as a round number — ten percent of the genetic diversity that once existed. But that ten percent is dangerously little. And institutions and experts can’t rebuild the lost ninety percent, because they didn’t build it in the first place.
Doubtless some readers will have been puzzled yesterday by my use of scrapple as a model of purity. But, you know, there is scrapple, and then there is scrapple. There’s country scrapple and city scrapple, as the distinction used to be drawn, back when country people, or at least country butchers, still made their own. There was country panhaas, made by Pennsylvania Germans, and there was its bastard cousin Philadelphia scrapple.
My main thought on all this horror over “pink slime” is that it doesn’t sound any worse than any food-like product I’d expect to come out of a factory. I mean, what do you expect? The goal of the U.S. food industry is to produce substances that are chemically compatible with the maintenance of human life and that are aesthetically and culturally palatable to American consumers, all at the greatest possible margin of profit. Pink slime, duly flavored with extracts, shaped into a patty, topped with half the contents of the refrigerator and eaten by a model with juice dribbling down her chin, pretty much nails it.
But I get tired of reading only people I agree with, so I went looking for contrary arguments.
Mark Bittman writes in this Sunday’s New York Times (“Finally, Fake Chicken Worth Eating”) that he has decided, at last, to endorse fake meat, because he believes that Americans ought to eat less meat and because certain new soy- and mushroom-based fake meat products are, in certain circumstances, nearly indistinguishable from industrially produced chicken breast.
On its own, Brown’s “chicken” — produced to mimic boneless, skinless breast — looks like a decent imitation, and the way it shreds is amazing. It doesn’t taste much like chicken, but since most white meat chicken doesn’t taste like much anyway, that’s hardly a problem; both are about texture, chew and the ingredients you put on them or combine with them. When you take Brown’s product, cut it up and combine it with, say, chopped tomato and lettuce and mayonnaise with some seasoning in it, and wrap it in a burrito, you won’t know the difference between that and chicken.
Bittman’s uncritical acceptance of the way Americans consume chicken breast, moreover — which is to say, mechanically — is disappointing from a man who has done as much as anyone to teach Americans how to cook and eat real food in simple, practical ways. There’s no indication that the product tastes good, only that it isn’t terrible. Nor does promoting it in this fashion aid the cause of good cooking or of thoughtful, intelligent consumption. To embrace the consumption of “meatlike stuff” produced by a “thingamajiggy” is, I believe, to embrace the error at the root of modern industrial agriculture, and therefore, in the long run, to worsen its effects.
Intrigued by Thomas Jefferson’s calendar of the Washington city market (see the previous post) and liking the design, I decided to use it as a model for mapping produce available right here, right now. So with some help from Erin Kauffman, market manager for the Durham Farmers’ Market, I compiled a produce calendar for Durham, North Carolina, 2011.
An article in today’s New York Times examines yet another case of Americans taking a fundamentally sound idea — mindful eating — and driving it to extremes. Having just concluded a draft of my book with an epilogue in which I urged not only mindful eating but (especially) mindful cooking, it pains me to say this, but, seriously, people: lighten up.
Time to get serious, now. Thanksgiving is only a day away, and if you haven’t started your preparations yet, you’d best get cracking. I don’t mean brining the turkey or kneading bread dough: I mean being thankful. The point of setting this day aside isn’t just to eat. And yet, of course, to show our gratitude, we hold a feast. How, exactly, is a feast supposed to make us thankful?
I was thinking about this question after reading my local newspaper last week, which wants me to breathe easier about Thanksgiving.
You have, no doubt, come here hoping to learn of some radical old-fashioned method for preparing cranberry sauce, some cabalistic ritual of autumn berrying well known to the ancients but lost to our rational age, the merest taste of which will produce shivers of delight claimed in one long-lost poem (once decoded and translated from the Coptic) to last three full days and create breezes that resonate in the distant tropics. Some search for wisdom, others truth or beauty: you, my friend, seek cranberry sauce.