… I recall that when Damon Knight asked me back in the ’60s whom I was reading I wrote back and said “J.R.R. Tolkien, G.K. Chesterton and Marks’ Engineers’ Handbook.”
I’ve been meaning to write a short essay on Gene Wolfe, the last great American writer, who died last month. I don’t know when I’ll get it done, though, so here are some notes and quotes instead.
I don’t remember which was the first Wolfe story I read. It might have been “Trip, Trap” in an early Orbit anthology. But the novella “The Fifth Head of Cerberus” made it clear to me that he operated on a level far beyond Asimov or Clarke in skill, imagination and depth. His stories improved with re-reading. His name in the table of contents was sufficient reason to purchase any anthology, and I bought every book of his as soon as it appeared in paperback.
… I’d argue that SF represents literature’s real mainstream. What we now normally consider the mainstream—so-called realistic fiction—is a small literary genre, fairly recent in origin, which is likely to be relatively short-lived. When I look back at the foundations of literature, I see literary figures who, if they were alive today, would probably be members of the Science Fiction Writers of America. Homer? He would certainly belong to the SFWA. So would Dante, Milton, and Shakespeare. That tradition is literature’s mainstream, and it has been what has grown out of that tradition which has been labeled SF or whatever label you want to use.
Another chapter argues we’re already living through a “soft singularity” mediated by the Internet and ubiquitous computing and communication devices. Humans with access to these technologies think and work in ways they could hardly have imagined even five years ago. When I’m putting together one of these posts, it’s not unusual that I’ll have as many as thirty browser tabs open in four or more windows for online resources which didn’t exist or were a major project to find when I joined Ricochet in 2010, and were science fiction in 1990. Our tools are changing us already, and maybe faster than many appreciate. We are, in some ways, intellectually more than human as defined even ten years ago when we use them. What if the singularity happened and nobody noticed?
I have been writing since 2006 that it is more likely than not that we’re living in a simulation. This is a hypothesis we may be able to test: it’s unlikely any simulation will be perfect, and by precision investigation of physics we may be able to discover round-off errors and shortcuts in the simulation which aren’t apparent at first glance. Indeed, there are a number of nagging little discrepancies in physics and astronomy which are precisely the kinds of things we’d expect to see if living in a simulation implemented with the attention to detail we’ve come to expect from Microsoft. No red pill required, just Redmond slapdash quality!
I recently came across mention of the clinesterton beademungen. It reminded me of an old James Blish story, which is available online. Don’t move, count the seconds, and everything will be rodalent. (As I recall, Damon Knight wrote an analysis of the story that was stranger than the story itself.)
Dear [Beautiful but Evil Space Princess],
Every time I capture the hero, I get this overwhelming urge to spill the entire plan, including the way out. How can I stop myself from giving it all away?
Evil Underlord who can’t quite make the big leagues
Oh, Sweetie. This is a compulsion written into you by the author. You must use aversion therapy. Have one of your underlings dress up as the hero, and when you start spilling things, force yourself to do something really distasteful. I don’t know, pet a puppy or give sweets to children or something, until you break the compulsion.
It’s all right. If you manage to cure yourself, you can blend the puppies into a nice smoothie afterwards and it will make you feel much better.
I’m not a professional political scientist or sociologist. Then again, neither were Washington, Adams, Jefferson and that crowd ….
The election of Trump is, in many senses, stupid. However, it is far, far wiser and more in keeping with the idea that we, the people, are the defenders of the Republic to elect Trump than to elect someone who is beloved of Harvard. On the scale of errors one can make in a Republic, electing an arrogant and impulsive side-show barker is far preferable to electing someone whose fundamental goal is making elections irrelevant.
… humans have never had to deal with the problems that come from too much food and too much free time to consume it. We really have no idea what will come from it and how it will hurt or help society. There could very well be a huge upside to having lots of fat people. Perhaps when the zombie apocalypse comes, the zombies will eat the fat people and be satisfied, leaving the rest of us to regroup.
I’ll never forget when John Updike reviewed a book on how FDR’s policies lengthened the Great Depression. Updike basically said that because FDR cared, and was trying, that was worth more than shortening the Depression.
One food arena where the US used to be the best in the world and is now near the bottom of the pack is cider (i.e., alcoholic, fermented cider).
Back in the Revolutionary War era cider was the #1 drink in the nation, far surpassing beer or wine or hard liquor. And people had planted the right kind of apple trees all over the country (as it existed then), so there was always a big supply of the raw material.
In fact, Johnny Appleseed didn’t go around planting edible apple trees — he went around planting cider apple trees! A detail that is now lost to most people’s imaginations of history.
“But wait,” you’re saying, “there’s a difference between edible apples and cider apples?”
Yes indeed. There are three fundamental “types” of apples:
“Sweet apples,” which is what we now think of simply as “apples” — the big crunchy sweet kind that you can eat.
“Sour apples,” now mostly known as “crabapples,” which are mostly useless except for making things with their pectin.
“Bitter apples,” now mostly unknown in the US, but still planted widely in France and England. THESE are the apples you are supposed to make true cider out of. As the name implies, they’re slightly too bitter to eat, but their chemical makeup is absolutely perfect for fermenting a delicious kind of apple cider, a process during which the bitterness goes away.
If you’ve ever tasted true cider made from bitter apples (which is what they serve you in Somerset and Normandy), you’ll know that cider made from sweet apples is atrocious by comparison.
And that’s the tragic part of our story.
Because of the arrival of so many German and Bohemian and Polish immigrants in the second half of the 19th century in the US, beer started to surpass cider in popularity nationwide, and then when Prohibition hit, cider production stopped entirely. And what happened was that ALL — or almost all — the bitter apple trees in the United States were left to die or were torn out to make room for more useful trees.
So that by the time Prohibition ended, there was no longer any way to make true cider in any quantity, and as a result beer took over the casual drinking market almost 100%. Wine only started to make inroads in the ’60s and ’70s. But cider remained completely forgotten by then.
That is, until about eight years ago, when the “small batch cider” renaissance started in the US, with small startups making cider from apples.
Sweet apples, that is — because that’s all that we have in the US anymore! Yuck!
Cider made from sweet apples is just wrong to a true cider aficionado. So no matter how much effort these American cider microbreweries put into their product, it will never match up to French and British ciders.
In fact, until just a couple years ago, most American cidermakers didn’t even know about the existence of bitter apples and didn’t know they were doing it fundamentally wrong.
Finally a few people have wised up, and they’ve started planting bitter apple trees in the US again, but it will still be several years before they are up and producing in sufficient numbers to create enough true cider for the masses.
Until then, we must suffer with an inferior American product! Frowney face!
I’m not especially worried about Skynet taking over the world any time soon given that the current state of the art in AI, with all of the best minds and the most resources behind it, is the autocorrect feature on my phone.
Today is the centenary of the birth of possibly the most original and imaginative writer of the twentieth century, R.A. Lafferty. I’ve been collecting his books ever since I read “Continued on Next Rock” in one of the Carr/Wollheim anthologies back in ancient times. I could try to explain why Lafferty is extraordinary, but it’s easier just to refer you to the short stories that are available online.
Lafferty was his favorite author in the world, he said. “His stories brimmed with ideas that no one had ever thought before. The use of language was uniquely his own — a Lafferty sentence is instantly utterly recognizable,” Gaiman wrote of Lafferty, in an introduction to the story in Martin H. Greenberg’s My Favorite Fantasy Story. “The cockeyed, strange, and wonderful world he painted in his tales often seems nearer to our own, more joyful and more recognizable than many a more worthy or more literal account by other authors the world stopped to notice.”
When he was 19, Gaiman dug Lafferty’s address out of the back of a library book and wrote to him, asking for advice on becoming an author. Tulsa, thanks to Lafferty, is for him a place of literary magic. “He told me how to become an author, and his advice was very good advice, and so I did. It left me quite certain that the finest literary advice in the world came from Tulsa, Oklahoma, for it did in my case,” Gaiman said.
Of all the mysteries in Mouretsu Pirates, the most puzzling, and the least likely to be satisfactorily explained, are the Sailor Moon shout-outs. This Princess Serenity is anything but a ditzy airhead.
By the way, it is impossible to watch just one episode of Shingu.
Whatever Carl Woese writes, even in a speculative vein, needs to be taken seriously. In his “New Biology” article, he is postulating a golden age of pre-Darwinian life, when horizontal gene transfer was universal and separate species did not yet exist. Life was then a community of cells of various kinds, sharing their genetic information so that clever chemical tricks and catalytic processes invented by one creature could be inherited by all of them. Evolution was a communal affair, the whole community advancing in metabolic and reproductive efficiency as the genes of the most efficient cells were shared. Evolution could be rapid, as new chemical devices could be evolved simultaneously by cells of different kinds working in parallel and then reassembled in a single cell by horizontal gene transfer.
But then, one evil day, a cell resembling a primitive bacterium happened to find itself one jump ahead of its neighbors in efficiency. That cell, anticipating Bill Gates by three billion years, separated itself from the community and refused to share. Its offspring became the first species of bacteria, and the first species of any kind, reserving their intellectual property for their own private use. With their superior efficiency, the bacteria continued to prosper and to evolve separately, while the rest of the community continued its communal life. Some millions of years later, another cell separated itself from the community and became the ancestor of the archaea. Some time after that, a third cell separated itself and became the ancestor of the eukaryotes. And so it went on, until nothing was left of the community and all life was divided into species. The Darwinian interlude had begun.
The Darwinian interlude has lasted for two or three billion years. It probably slowed down the pace of evolution considerably. The basic biochemical machinery of life had evolved rapidly during the few hundreds of millions of years of the pre-Darwinian era, and changed very little in the next two billion years of microbial evolution. Darwinian evolution is slow because individual species, once established, evolve very little. With rare exceptions, Darwinian evolution requires established species to become extinct so that new species can replace them.
Now, after three billion years, the Darwinian interlude is over. It was an interlude between two periods of horizontal gene transfer. The epoch of Darwinian evolution based on competition between species ended about ten thousand years ago, when a single species, Homo sapiens, began to dominate and reorganize the biosphere. Since that time, cultural evolution has replaced biological evolution as the main driving force of change. Cultural evolution is not Darwinian. Cultures spread by horizontal transfer of ideas more than by genetic inheritance. Cultural evolution is running a thousand times faster than Darwinian evolution, taking us into a new era of cultural interdependence which we call globalization. And now, as Homo sapiens domesticates the new biotechnology, we are reviving the ancient pre-Darwinian practice of horizontal gene transfer, moving genes easily from microbes to plants and animals, blurring the boundaries between species. We are moving rapidly into the post-Darwinian era, when species other than our own will no longer exist, and the rules of Open Source sharing will be extended from the exchange of software to the exchange of genes. Then the evolution of life will once again be communal, as it was in the good old days before separate species and intellectual property were invented.