
a/b

You can think of them as the Fry and Laurie of malevolent synthetic intelligences that are going to murder you.

In a fortuitous coincidence, this video – a collection of communications from SHODAN, antagonist of the classic System Shock 2,

and this video, of GLaDOS's spoken dialogue from the first Portal,

… are both about 14 and a half minutes long.

You should listen to them both at the same time.

I may revisit this later. Consider this a late draft. I’m calling this done.

“Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration.” — Stan Kelly-Bootle

Sometimes somebody says something to me, like a whisper of a hint of an echo of something half-forgotten, and it lands on me like an invocation. The mania sets in, and it isn’t enough to believe; I have to know.

I’ve spent far more effort than is sensible this month crawling down a rabbit hole disguised, as they often are, as a straightforward question: why do programmers start counting at zero?

Now: stop right there. By now your peripheral vision should have convinced you that this is a long article, and I’m not here to waste your time. But if you’re gearing up to tell me about efficient pointer arithmetic or binary addition or something, you’re wrong. You don’t think you’re wrong and that’s part of a much larger problem, but you’re still wrong.

For some backstory, on the off chance anyone still reading by this paragraph isn't an IT professional of some stripe: most computer languages – including C/C++, Perl, Python, some (but not all!) versions of Lisp, and many others – are "zero-origin" or "zero-indexed". That is to say, in an array A with 8 elements in it, the first element is A[0], and the last is A[7]. This isn't universally true, though, and other languages from the same (and earlier!) eras are sometimes one-indexed, going from A[1] to A[8].
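If you've never had to care, here's what that looks like in practice – a minimal sketch in C, with invented values, just for illustration:

```c
#include <stdio.h>

int main(void) {
    int A[8] = {10, 20, 30, 40, 50, 60, 70, 80};   /* 8 elements */

    /* Zero-origin: valid subscripts run from 0 through 7. */
    printf("first: A[0] = %d\n", A[0]);   /* 10 */
    printf("last:  A[7] = %d\n", A[7]);   /* 80 */

    /* A one-origin language - Lua, MATLAB, classic Fortran - would
       call these same two slots A[1] and A[8] instead. */
    return 0;
}
```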

While it's a relatively rare practice in modern languages, one-origin arrays certainly aren't dead; there's a lot of blood pumping through Lua these days, not to mention MATLAB, Mathematica and a handful of others. If you're feeling particularly adventurous, Haskell apparently lets you pick your poison at startup, and in what has to be the most lunatic thing I've seen on a piece of silicon since I found out the MIPS architecture had runtime-mutable endianness, Visual Basic (up to v6.0) featured the OPTION BASE flag, letting you flip that coin on a per-module basis. Zero- and one-origin arrays in different corners of the same program! It's just software, why not?
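It is, after all, just an offset. If you want a rough picture of what a per-module origin switch has to do under the hood, here's a hedged sketch in C – the AT macro and the values are my own, purely for illustration, and not a claim about how any particular compiler actually implemented it:

```c
#include <stdio.h>

/* One-origin access over zero-origin storage: every subscript quietly
   pays for a "- 1" somewhere. That subtraction is the offset a
   one-indexed language has to manage on your behalf. */
#define AT(a, i) ((a)[(i) - 1])

int main(void) {
    int storage[8] = {10, 20, 30, 40, 50, 60, 70, 80};

    printf("AT(storage, 1) = %d\n", AT(storage, 1));   /* 10, the first element */
    printf("AT(storage, 8) = %d\n", AT(storage, 8));   /* 80, the last element  */
    return 0;
}
```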

All that is to say that starting at 1 is not an unreasonable position at all; to a typical human, thinking about the zeroth element of an array doesn't make any more sense than trying to catch the zeroth bus that comes by, but we've clearly ended up here somehow. So what's the story there?

The usual arguments involving pointer arithmetic and incrementing by sizeof(struct) and so forth describe features that are nice enough once you've got the hang of them, but they're also post-facto justifications. This is obvious if you take the most cursory look at the history of programming languages; C inherited its array semantics from B, which inherited them in turn from BCPL, and though BCPL arrays are zero-origin, the language doesn't support pointer arithmetic, much less data structures. On top of that, other languages that antedate BCPL and C aren't zero-indexed. Algol 60 uses one-indexed arrays, and arrays in Fortran are arbitrarily indexed – they're just a range from X to Y, and X and Y don't even need to be positive integers.
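(For the record, this is what that folklore argument is actually pointing at, sketched in modern C purely so we're all arguing about the same thing; none of this machinery existed when the decision in question got made.)

```c
#include <stdio.h>

struct record { int id; double value; };

int main(void) {
    struct record table[4] = {{1, 1.5}, {2, 2.5}, {3, 3.5}, {4, 4.5}};
    struct record *p = table;                 /* points at table[0] */

    /* a[i] is defined as *(a + i): the subscript is a pointer offset,
       so zero-origin means a[0] sits right at the base address. */
    printf("%d %d\n", table[2].id, (*(table + 2)).id);    /* 3 3 */

    /* "Incrementing by sizeof(struct)": p + 1 advances by one whole
       record - sizeof(struct record) bytes - not by one byte. */
    p = p + 1;
    printf("%d\n", p->id);                                 /* 2 */
    printf("stride = %zu bytes\n", sizeof(struct record));
    return 0;
}
```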

So by the early 1960’s, there are three different approaches to the data structure we now call an array.

  • Zero-indexed, in which the array index carries no particular semantics beyond its implementation in machine code.
  • One-indexed, identical to the matrix notation people have been using for quite some time. It comes at the cost of a CPU instruction to manage the offset; usability isn’t free.
  • Arbitrary indices, in which the range is significant with regards to the problem you’re up against.

So if your answer started with “because in C…”, you’ve been repeating a good story you heard one time, without ever asking yourself if it’s true. It’s not about *i = a + n*sizeof(x) because pointers and structs didn’t exist. And that’s the most coherent argument I can find; there are dozens of other arguments for zero-indexing involving “natural numbers” or “elegance” or some other unresearched hippie voodoo nonsense that are either wrong or too dumb to rise to the level of wrong.

The fact of it is this: before pointers, structs, C and Unix existed, at a time when other languages with a lot of resources and (by the standard of the day) user populations behind them were one- or arbitrarily-indexed, somebody decided that the right thing was for arrays to start at zero.

So I found that person and asked him.

His name is Dr. Martin Richards; he's the creator of BCPL, now almost 7 years into retirement; you've probably heard of one of his doctoral students, Eben Upton, creator of the Raspberry Pi. I emailed him to ask why he decided to start counting arrays from zero, way back then. He replied that…

As for BCPL and C subscripts starting at zero. BCPL was essentially designed as typeless language close to machine code. Just as in machine code registers are typically all the same size and contain values that represent almost anything, such as integers, machine addresses, truth values, characters, etc. BCPL has typeless variables just like machine registers capable of representing anything. If a BCPL variable represents a pointer, it points to one or more consecutive words of memory. These words are the same size as BCPL variables. Just as machine code allows address arithmetic so does BCPL, so if p is a pointer p+1 is a pointer to the next word after the one p points to. Naturally p+0 has the same value as p. The monodic indirection operator ! takes a pointer as it’s argument and returns the contents of the word pointed to. If v is a pointer !(v+I) will access the word pointed to by v+I. As I varies from zero upwards we access consecutive locations starting at the one pointed to by v when I is zero. The dyadic version of ! is defined so that v!i = !(v+I). v!i behaves like a subscripted expression with v being a one dimensional array and I being an integer subscript. It is entirely natural for the first element of the array to have subscript zero. C copied BCPL’s approach using * for monodic ! and [ ] for array subscription. Note that, in BCPL v!5 = !(v+5) = !(5+v) = 5!v. The same happens in C, v[5] = 5[v]. I can see no sensible reason why the first element of a BCPL array should have subscript one. Note that 5!v is rather like a field selector accessing a field in a structure pointed to by v.
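If the BCPL notation is unfamiliar, the same relationships fall straight out of C, which – as he says – copied the approach. A small sketch of exactly the identities Richards lists, nothing more; the array contents are made up:

```c
#include <stdio.h>

int main(void) {
    int words[6] = {100, 101, 102, 103, 104, 105};
    int *v = words;

    /* BCPL's monadic !(v+i) is C's *(v + i); the dyadic v!i is C's v[i]. */
    printf("%d\n", *(v + 0));   /* 100: p+0 is the same place as p */
    printf("%d\n", v[5]);       /* 105 */

    /* Because v[i] means *(v + i), and addition commutes,
       v[5] == *(v + 5) == *(5 + v) == 5[v], in C just as in BCPL. */
    printf("%d\n", 5[v]);       /* 105, same element */
    return 0;
}
```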

This is interesting for a number of reasons, though I’ll leave their enumeration to your discretion. The one that I find most striking, though, is that this is the earliest example I can find of the understanding that a programming language is a user interface, and that there are difficult, subtle tradeoffs to make between resources and usability. Remember, all this was at a time when everything about the future of human-computer interaction was up in the air, from the shape of the keyboard and the glyphs on the switches and keycaps right down to how the ones and zeros were manifested in paper ribbon and bare metal; this note by the late Dennis Ritchie might give you a taste of the situation, where he mentions that five years later one of the primary reasons they went with C’s square-bracket array notation was that it was getting steadily easier to reliably find square brackets on the world’s keyboards.

“Now just a second, Hoye”, I can hear you muttering. “I’ve looked at the BCPL manual and read Dr. Richards’ explanation and you’re not fooling anyone. That looks a lot like the efficient-pointer-arithmetic argument you were frothing about, except with exclamation points.” And you’d be very close to right. That’s exactly what it is – the distinction is where those efficiencies take place, and why.

BCPL was first compiled on an IBM 7094 – here's a picture of the console, though the entire computer took up a large room – running CTSS – the Compatible Time Sharing System – that antedates Unix much as BCPL antedates C. There's no malloc() in that context, because there's nobody to share the memory core with. You get the entire machine and the clock starts ticking, and when your wall-clock time block runs out that's it. But here's the thing: in that context none of the offset-calculations we're supposedly economizing are calculated at execution time. All that work is done ahead of time by the compiler.

You read that right. That sheet-metal, “wibble-wibble-wibble” noise your brain is making is exactly the right reaction.

Whatever justifications or advantages came along later – and it’s true, you do save a few processor cycles here and there and that’s nice – the reason we started using zero-indexed arrays was because it shaved a couple of processor cycles off of a program’s compilation time. Not execution time; compile time.

Does it get better? Oh, it gets better:

IBM had been very generous to MIT in the fifties and sixties, donating or discounting its biggest scientific computers. When a new top of the line 36-bit scientific machine came out, MIT expected to get one. In the early sixties, the deal was that MIT got one 8-hour shift, all the other New England colleges and universities got a shift, and the third shift was available to IBM for its own use. One use IBM made of its share was yacht handicapping: the President of IBM raced big yachts on Long Island Sound, and these boats were assigned handicap points by a complicated formula. There was a special job deck kept at the MIT Computation Center, and if a request came in to run it, operators were to stop whatever was running on the machine and do the yacht handicapping job immediately.

Jobs on the IBM 7090, one generation behind the 7094, were batch-processed, not timeshared; you queued up your job along with a wall-clock estimate of how long it would take, and if it didn’t finish it was pulled off the machine, the next job in the queue went in and you got to try again whenever your next block of allocated time happened to be. As in any economy, there is a social context as well as a technical context, and it isn’t just about managing cost, it’s also about managing risk. A programmer isn’t just racing the clock, they’re also racing the possibility that somebody will come along and bump their job and everyone else’s out of the queue.

I asked Tom Van Vleck, author of the above paragraph and also now retired, how that worked. He replied in part that on the 7090…

“User jobs were submitted on cards to the system operator, stacked up in a big tray, and a rudimentary system read, loaded, and ran jobs in sequence. Typical batch systems had accounting systems that read an ID card at the beginning of a user deck and punched a usage card at end of job. User jobs usually specified a time estimate on the ID card, and would be terminated if they ran over. Users who ran too many jobs or too long would use up their allocated time. A user could arrange for a long computation to checkpoint its state and storage to tape, and to subsequently restore the checkpoint and start up again.

The yacht handicapping job pertained to batch processing on the MIT 7090 at MIT. It was rare — a few times a year.”

So: the technical reason we started counting arrays at zero is that in the mid-1960’s, you could shave a few cycles off of a program’s compilation time on an IBM 7094. The social reason is that we had to save every cycle we could, because if the job didn’t finish fast it might not finish at all and you never know when you’re getting bumped off the hardware because the President of IBM just called and fuck your thesis, it’s yacht-racing time.

There are a few points I want to make here.

The first thing is that as far as I can tell nobody has ever actually looked this up.

Whatever programmers think about themselves and these towering logic-engines we've erected, we're a lot more superstitious than we realize. We tell and retell this collection of unsourced, inaccurate stories about the nature of the world without ever doing the research ourselves, and there's no other word for that but "mythology". Worse, by obscuring the technical and social conditions that led humans to make these technical and social decisions, by talking about the nature of computing as we find it today as though it's an inevitable consequence of an immutable set of physical laws, we're effectively denying any responsibility for how we got here. And worse than that, by refusing to dig into our history and understand the social and technical motivations for those choices, by steadfastly refusing to investigate the difference between a motive and a justification, we're disavowing any agency we might have over the shape of the future. We just keep mouthing platitudes and pretending the way things are is nobody's fault, and the more history you learn and the more you look at the sad state of modern computing, the more pathetic and irresponsible that sounds.

Part of the problem is access to the historical record, of course. I was in favor of Open Access publication before, but writing this up has cemented it: if you’re on the outside edge of academia, $20/paper for any research that doesn’t have a business case and a deep-pocketed backer is completely untenable, and speculative or historic research that might require reading dozens of papers to shed some light on longstanding questions is basically impossible. There might have been a time when this was OK and everyone who had access to or cared about computers was already an IEEE/ACM member, but right now the IEEE – both as a knowledge repository and a social network – is a single point of a lot of silent failure. “$20 for a forty-year-old research paper” is functionally indistinguishable from “gone”, and I’m reduced to emailing retirees to ask them what they remember from a lifetime ago because I can’t afford to read the source material.

The second thing is how profoundly resistant to change or growth this field is, and apparently has always been. If you haven’t seen Bret Victor’s talk about The Future Of Programming as seen from 1975 you should, because it’s exactly on point. Over and over again as I’ve dredged through this stuff, I kept finding programming constructs, ideas and approaches we call part of “modern” programming if we attempt them at all, sitting abandoned in 45-year-old demo code for dead languages. And to be clear: that was always a choice. Over and over again tools meant to make it easier for humans to approach big problems are discarded in favor of tools that are easier to teach to computers, and that decision is described as an inevitability.

This isn't just Worse Is Better, this is "Worse Is All You Get Forever". How many off-by-one disasters could we have avoided if the "foreach" construct that existed in BCPL had made it into C? How much more insight would all of us have into our code if we'd put the time into making Michael Chastain's nearly-omniscient debugging framework – PTRACE_SINGLESTEP_BACKWARDS! – work in 1995? When I found this article by John Backus wondering if we can get away from Von Neumann architecture completely, I wondered where that ambition to rethink our underpinnings went. But the fact of it is that it didn't go anywhere. Changing how you think is hard and the payoff is uncertain, so by and large we decided not to. Nobody wanted to learn how to play, much less build, Engelbart's Violin, and instead everyone gets a box of broken kazoos.

In truth maybe somebody tried – maybe even succeeded! – but it would cost me hundreds of dollars to even start looking for an informed guess, so that’s the end of that.

It’s hard for me to believe that the IEEE’s membership isn’t going off a demographic cliff these days as their membership ages, and it must be awful knowing they’ve got decades of delicious, piping-hot research cooked up that nobody is ordering while the world’s coders are lining up to slurp watery gruel out of a Stack-Overflow-shaped trough and pretend they’re well-fed. You might not be surprised to hear that I’ve got a proposal to address both those problems; I’ll let you work out what it might be.

Yesterday on the subway I watched a man write “KEY INSIGHTS” at the top of a page in his Moleskine, and then just stare at the page unmoving for the next six stops. He hadn’t budged when I stepped off to switch trains; I have to admit that as the minutes ticked by, I struggled not to start laughing right there. “ZOMG Thought Leadership Liek Woah”, I was thinking.

This morning I realized I’d been staring at an email window with a “To:” line, a title, and a cursor blinking away in an otherwise empty editor for at least five minutes, maybe more.

Sorry, key-insights-on-the-subway-guy. The inside of my head could have been a little more sympathetic, it turns out.

On A Certain Island

Maya and I have been playing through Windwaker together; she likes sailing, scary birds and remembering to be brave, rescuing her little brother and finding out what’s happening to Medli and her dragon boat.

She’s the hero of the story, of course.

It’s annoying and awkward, to put it mildly, having to do gender-translation on the fly when Maya asks me to read what it says on the screen. You can pick your character’s name, of course – I always stick with Link, being a traditionalist – but all of the dialog insists that Link is a boy, and there’s apparently nothing to be done about it.

Well, there wasn't anything to be done about it, certainly not anything easy, but as you might imagine I'm not having my daughter grow up thinking girls don't get to be the hero and rescue their little brothers.

This isn’t particularly user-friendly; you’ll need to download the Dolphin emulator and find a Windwaker .GCM, the Gamecube disk image with this SHA-1 hash:

Original: 6b5f06c10d50ebb4099cded88217eb71e5bfbb4a

and then you’ll need to figure out how to use xdelta3 to apply a binary patch to that image.

This patch.

When you’re done the resulting disk image will have the following SHA-1 hash:

Result: 6a480ffd8ecb6c254f65c0eb8e0538f7b30cfaa7

… and all the dialog will now refer to Link as a young woman, rather than as a young man.

I think I’ve gotten this right – this was all done directly on the original disk image with a hex editor, so all the changes needed to be the same byte-for-byte length, in-place. I haven’t had time to play through the whole game to test it yet, and some of the constructions aren’t perfect. I’ve borrowed Donaldson’s “Swordmain” coinage to replace “Swordsman”, for example, and there’s lots of “milady” replacing “my lad” and “master”, because I couldn’t find a better way to rewrite them in exactly the amount of space allotted. If you come up with something better, I’m all ears.

I’m going to audit it shortly, and may update this post to reflect that. For now, though, here you go.

FemLink or you’re doing it wrong.

More Underground

You may have heard that the FunnyJunk website – no link, but it’s your typically garish stolen-content-to-sell-ads web-hovel – have tried to extort one of the people they stole stuff from, to the tune of about $20k.

The Tubes Were Displeased:

“I really did not expect that he would marshal an army of people who would beseech my website and send me a string of obscene emails,” he says.

“I’m completely unfamiliar really with this style of responding to a legal threat — I’ve never really seen it before,” Carreon explains. “I don’t like seeing anyone referring to my mother as a sexual deviant,” he added, referencing the drawing Inman posted.

[...]

In the meantime, Inman is trying to figure out how to explain that he needs to withdraw over $100,000 so that he can photograph it next to a drawing of someone’s mother attempting to sweet-talk a bear.

He raised the $20k for charity in one hour and four minutes. That number currently stands at just shy of $120,000.

God, I love the internet.

UPDATE: The situation has escalated.

Don't Interrupt

We took Arthur’s Science Fair Trouble out of the library for Maya the other day, and let me tell you: I had always suspected that most of what adults tell you is bullshit, but children’s books live at some horrible Venn overlap of Moore and Sturgeon’s respective Laws where 90% of everything is not only crap but getting twice as crappy every year and a half or so.

I had to go over this book carefully with Maya after I read it, to explain to her why every single part of it is wrong. The description from the dust cover reads:

Arthur has to do a science fair project, but all of the good ideas are taken: Buster is building a rocket, Muffy is growing crystals, and Francine is making a bird feeder. Arthur learns a valuable lesson when he finds his father’s old solar system project in the attic and tries to use it for his own science fair project.

That's right: Arthur's in a pickle, because all the good science ideas have been done by other children doing wholly original work. But when Arthur instead decides to update his father's old solar system project (repainting it) and present that, he feels, we are told, terribly guilty, finally breaking down after winning first prize to admit the work wasn't wholly his. He is suitably chastised, of course.

I don’t think Maya understood my rant about why verifying old assumptions was incredibly valuable, not merely per se but particularly in light of Pluto’s redefined status and the inclusion of Eris and Ceres in the “Dwarf Planet” category as well.

I had to explain to her that Arthur was explaining the evolution of cosmology by repurposing and updating older (handmade by his father!) demonstration materials, which is not only great on its own, but vastly better scientific and expository work than that of his classmates, who showed no insight into why assembling premanufactured toys might not count as science.

“Maya, the people harassing Arthur for this are lazy, ignorant people saying dumb things to make Arthur feel bad, and Arthur is wrong to feel bad about his work. Building on top of each others’ work is the only reason we have this world of incredible, miraculous wonder we live in, and don’t let anyone tell you otherwise.”

I don’t think it stuck, but I’ll keep repeating it.

I was thinking about this today when this quote from Mark Twain on plagiarism started making the rounds:

Mark Twain, letter to Helen Keller, after she had been accused of plagiarism for one of her early stories (17 March 1903), published in Mark Twain’s Letters, Vol. 1 (1917) edited by Albert Bigelow Paine, p. 731:

Oh, dear me, how unspeakably funny and owlishly idiotic and grotesque was that "plagiarism" farce! As if there was much of anything in any human utterance, oral or written, except plagiarism! The kernel, the soul — let us go further and say the substance, the bulk, the actual and valuable material of all human utterances — is plagiarism. For substantially all ideas are second-hand, consciously and unconsciously drawn from a million outside sources, and daily used by the garnerer with a pride and satisfaction born of the superstition that he originated them; whereas there is not a rag of originality about them anywhere except the little discoloration they get from his mental and moral calibre and his temperament, and which is revealed in characteristics of phrasing. When a great orator makes a great speech you are listening to ten centuries and ten thousand men — but we call it his speech, and really some exceedingly small portion of it is his. But not enough to signify. It is merely a Waterloo. It is Wellington's battle, in some degree, and we call it his; but there are others that contributed. It takes a thousand men to invent a telegraph, or a steam engine, or a phonograph, or a photograph, or a telephone or any other important thing—and the last man gets the credit and we forget the others. He added his little mite — that is all he did. These object lessons should teach us that ninety-nine parts of all things that proceed from the intellect are plagiarisms, pure and simple; and the lesson ought to make us modest. But nothing can do that.

Which is all to say: Constant vigilance!

Zooming

It's an old joke, with that wonderful undercurrent of bigoted misogyny that so many old jokes have: some creepy old dude propositions a young woman by asking if she'd sleep with him for a million dollars, which she concedes she would. He follows that up by asking if she'd sleep with him for a nickel; she replies, of course not, what kind of person do you think I am?

“We’ve established that”, he replies. “Now we’re just haggling about the price.”

The sort of horrid old joke told by horrid old people, to be sure, but there’s a tiny kernel of capital-T Truth in there: we should be honest with ourselves, at the very least, about when we’re talking about matters of principle or when we’re dickering over the price tag, and what that means about us.

Exhibit 1: George Lucas testifying before Congress in 1988 about copyright and the importance of artistic integrity.

“The destruction of our film heritage, which is the focus of concern today, is only the tip of the iceberg. American law does not protect our painters, sculptors, recording artists, authors, or filmmakers from having their lifework distorted, and their reputation ruined. If something is not done now to clearly state the moral rights of artists, current and future technologies will alter, mutilate, and destroy for future generations the subtle human truths and highest human feeling that talented individuals within our society have created.”

“[...] People who alter or destroy works of art and our cultural heritage for profit or as an exercise of power are barbarians, and if the laws of the United States continue to condone this behavior, history will surely classify us as a barbaric society. The preservation of our cultural heritage may not seem to be as politically sensitive an issue as “when life begins” or “when it should be appropriately terminated,” but it is important because it goes to the heart of what sets mankind apart. Creative expression is at the core of our humanness. Art is a distinctly human endeavor. We must have respect for it if we are to have any respect for the human race.”

“These current defacements are just the beginning. Today, engineers with their computers can add color to black-and-white movies, change the soundtrack, speed up the pace, and add or subtract material to the philosophical tastes of the copyright holder. Tomorrow, more advanced technology will be able to replace actors with “fresher faces,” or alter dialogue and change the movement of the actor’s lips to match. It will soon be possible to create a new “original” negative with whatever changes or alterations the copyright holder of the moment desires. The copyright holders, so far, have not been completely diligent in preserving the original negatives of films they control. In order to reconstruct old negatives, many archivists have had to go to Eastern bloc countries where American films have been better preserved.”

“In the future it will become even easier for old negatives to become lost and be “replaced” by new altered negatives. This would be a great loss to our society. Our cultural history must not be allowed to be rewritten.”

Exhibit 2: Dancing to "I'm Han Solo" in the Kinect Star Wars video game, a rewritten version of Jason Derulo's "Ridin' Solo".

I’m feeling like a star,
You can’t stop my shine.
I’m lovin’ Cloud City,
My head’s in the sky.

I’m solo, I’m Han Solo.
I’m Han Solo.
I’m Han Solo, Solo.

Yeah, I’m feelin’ good tonight,
Finally feelin’ free and it feels so right, oh.
Time to do the things I like,
Gonna see a Princess, everything’s all right, oh.
No Jabba to answer to,
Ain’t a fixture in the palace zoo, no.
And since that carbonite’s off me,
I’m livin’ life now that I’m free, yeah.

Told me to get myself together,
Now I got myself together, yeah.
Now I made it through the weather,
Better days are gonna get better.
I’m so happy the carbonite is gone,
I’m movin’ on.
I’m so happy that it’s over now,
The pain is gone.

I’m puttin’ on my shades
to cover up my eyes.
I’m jumpin’ in my ride,
I’m heading out tonight

I’m solo, I’m Han Solo.
I’m Han Solo.
I’m Han Solo, Solo.

I’m pickin’ up my blaster,
Put it on my side,
I’m jumpin’ in my Falcon,
Wookie at my side.

I’m solo, I’m Han Solo.
I’m Han Solo.
I’m Han Solo, Solo.

Possibly the worst part being that this is actually an inoffensive, blandly-rehashed second-order derivative of a parody MC Chris did better.

UPDATE: Scroll down. But they’re still finished, make no mistake about that.

I’ve mentioned in the past that RIM’s fundamental problem is that they’ve been shipping the same goddamn device, over and over again, since at least 2004. But check this out: on the heels of Blackberry’s recent announcements of collapsing financials and a management purge, I’ve just been informed that a new simulator is available for the upcoming Blackberry 9220, for developers to test on.

Noteworthy features include:

  • 320 x 240 resolution, 164 dpi
  • Memory: 512 MB Internal Persistent Storage, 512 MB RAM
  • 2 MP Camera, 5 X digital zoom
  • FM Radio

It apparently will play video, though at a maximum of 15 frames per second.

It’s got more memory, and adds wireless-N to the B/G (and, woo, an FM radio) but that’s the same screen and camera resolution that shipped in the Blackberry Curve 8320.

That shipped in 2007.

What. I. Do. Not. Even.

So, if you have RIM stock and haven’t gotten rid of it already, get out now.

UPDATE – Brought to me by Jasper in the comments: holy crap, check this out. The first mention of the 9220, dated 2008. Given that those specs made perfect sense in 2008, this is craziness – as Jasper rightly observes, it's either been in development hell for four years, or they've just found a warehouse full of them somewhere and they've got to figure out how to get them out the door before RIM goes belly up for good.

I’m pretty sure – judging from the last of the comments – that this is just a numerical overlap. The first 9220 looks like the one they’re preparing to ship now, and the “9220 Curve” mentioned later in the comments (with specs that are significantly better than the 9220 of today, bizarrely) simply doesn’t exist.

Nevertheless – what a gong show.

Kevin Gildea is hard to google.

He's an English professor in the Ottawa area, part-time (from what I can tell) at both Ottawa U and Carleton. When I was in his class a decade ago, he never gave you the sense of being self-aggrandizing enough to have a web presence, much less the fan base he should have. He's the only professor I've ever had who in a single lecture managed to completely dismantle and rebuild my sense of self and place, and change the whole direction my life has taken.

He had Nietzsche’s help to do it, but hey, backstory time; I was having a shitty year at the tail end of a series of shitty years, partway through a degree I didn’t know if I wanted or cared about or not, staring down a future I didn’t know if I wanted or not, and not really having a sense of what I could do, or if there was anything I could do, about any of it. And what he said, approximately, was this:

Suppose, for a moment, that space is finite. Space is finite, matter is finite. Time is infinite. Take that as your axioms. What does that mean? A lot of things, but one of the things it means is that there's a finite number of ways that the matter in all the space can all fit together. So eventually, everything will repeat itself: all of us are going to be here in this exact room, having this exact conversation, again. And again. And then he said, who's responsible for your situation? Not whose fault is it, but who is responsible for it? If not you, who else could it be? And if you're unhappy, what's keeping you unhappy, if not your choice to remain where you are?

If you don’t like where you are, what’s stopping you from changing that except you?

Nietzsche said it like this:

“What if a demon crept after thee into thy loneliest loneliness some day or night, and said to thee: “This life, as thou livest it at present, and hast lived it, thou must live it once more, and also innumerable times; and there will be nothing new in it, but every pain and every joy and every thought and every sigh, and all the unspeakably small and great in thy life must come to thee again, and all in the same series and sequence-and similarly this spider and this moonlight among the trees, and similarly this moment, and I myself. The eternal sand-glass of existence will ever be turned once more, and thou with it, thou speck of dust!”- Wouldst thou not throw thyself down and gnash thy teeth, and curse the demon that so spake? Or hast thou once experienced a tremendous moment in which thou wouldst answer him: “Thou art a God, and never did I hear anything so divine!”

It doesn’t matter if the postulates are scientifically true or not, because you are here, now. This is a way of thinking; what would you have to do, who would you have to become to own your choices without remorse or fear or nagging doubt, to be able to say honestly that you don’t fear being here in this moment, again and again, forever?

Longtime readers will note that I’ve mentioned it once or twice before; it made quite an impression. But I’ve occasionally had the sense that much like Nietzsche’s abyss, as I try to embrace the ideal, the ideal tries also to embrace me.

Wife has sudden back pains? Surprise early trip to the hospital? Yeah, I know this game.
Posted by mhoye at 11:56 PM – 22 Feb 12

That was Wednesday night: surprise, we’re doing it all over again. Except this time we’re a week pre-term.

At about 9:00 Wednesday night, Arlene went from feeling mildly uncomfortable to agonizing contractions in the space of twenty minutes. We called the hospital, and they say that when they start coming five minutes apart and lasting for twenty seconds, you should come in. So we start timing them, and for the next twenty minutes they’re three minutes apart and lasting for thirty seconds. We call them back, and this time we’re not asking. Maya’s mercifully asleep, and we called a friend over to keep an eye on the house while we pile the bags into the car and roll off to the hospital. Parking lot, wheelchair, triage, nice and chilly, keep moving.

Suppose for a moment that bed space is finite and time is of the essence. They booked us into the same room we were in last time, room 707.

Really world, I was thinking, that’s how we’re going to play this? Really?

Ok.

I’ve done this before; I know I’m tooled up for this work. If that’s how it’s going to be, let’s get started.

Bring it.

I mentioned this last time: when I say things like that, the universe hears me. And that night, the universe obliged. In the broad strokes, we did the whole thing again: sudden onset labor, epidural, low progression, instrumentation, c-section, all of it. But to my surprise all of that happened on what looked for lack of a better term like Easy Mode; slower, more predictably, better-managed and almost entirely crisis-free.

I didn’t hear the words “crash”, “emergency” or “distress” mentioned even once.

We got triaged quickly and cleanly, on a night where the hospital wasn’t clearly overloaded and threatening to go off the rails. The epidural went in on the first try instead of the eighth. The labor took a long time, but didn’t fail out dramatically at any point. I got to track Arlene’s progress on the printed readouts over the day, and talk to the staff about hour-over-hour trends instead of hearing them mutter nervously about the last five minutes; one nurse complimented me on that, which was nice. The decision to go in for a caesarean section was made in a calm room full of people with the time to give the question its due consideration.

The long wait alone in the room after Arlene had been wheeled into the operating room to get prepped was about three times as long as the last one, and it was a hard wait; how could it be anything else? But it wasn’t the bone-charring nightmare fuel I very, very seriously expected. I have a lot of confidence in the East York General staff now; I know we’re getting through this.

Carter came out looking like a slightly smallish, slightly beat-up and otherwise utterly normal kid. I had him lying on my chest in one hand and my wife's hand in the other when the only real excitement of the day started and my wife got very pale and started shaking uncontrollably. Because the Eternal Recurrence of the Same has a checklist and a schedule apparently, and time was moving on so chop chop let's get it all in we have a deadline people.

So, funny story. And by funny I mean fuck you, universe.

The operating surgeon paged the anesthesiologist, but he was already answering another emergency page. So I’m the only person on this side of the curtain who can look at the relevant instrumentation, an appliance the size of a vending machine. As you might imagine this is the first time I’ve ever seen anything like it; the surgeon is asking for a blood pressure reading, everyone else in the room is busy helping her sew my shaking wife’s guts back together and here I am with my newborn son in one hand, my eyes skittering around the controls as I try to learn how to operate a brand new machine that’s wired directly into my wife’s anatomy through, fuck you universe, a series of tubes.

As moments go, it was perfect.

The old reflexes die hard, and the old intuitions never go away. I’ve never done this before, but I’ve done this before. I’m tooled up for this; let’s get started.

“Can we get pulse and a BP reading?”

“The, um… one second. The… The cuff’s deflated, I think. There’s a null reading on the screen where it says BP. Hang on… OK, there’s one button here that says NIBP; I’m pushing it, one moment.” *click*.

I'd guessed "non-something blood pressure", and the button was just below the null readout I was looking at. The cuff started inflating immediately. Anyone who tells you that user interface design doesn't matter is a fool; this stuff saves lives.

“Pulse is… 109, BP looks like it’s going to take a minute.”

“Ok, thank you. Good work.”

“I told you we should keep him around to look at stuff”, one of the nurses said.

Black Eye

Carter Alan Hoye, born 6:30 or so Thursday, February 23rd, and if he looks like he’s been in a fight, that’s because he’s been in a fight.

Arlene was wheeled back to our recovery room, and after treating her shakes with some drugs and a heated blanket, she's made a shocking recovery. She was lucid in hours, able to walk and eat solid foods in a day. She and Carter are back at the hospital today to treat him for some jaundice, but both of them are recovering from the ordeal surprisingly well. Carter is a cause for mild concern, because he's lost a bit of weight since his birth, but my own belief is that's only because he's losing fluids as the swelling subsides; the poor guy was bruised all over from the protracted labor. He looks much different now, and I'll have more pictures on the way soon.

I’d honestly forgotten they make them that small.

Maya has been struggling a bit; she seems to like Carter but dislike not being the center of attention. And she got a bit scared during our absence at the hospital, so we’ve got to make it up to her over the next few days. We’ll have to figure that out, but we’ve got time.

I've received a lot of messages, via Twitter and email, wishing us well. I'm grateful for all of them; they mean a lot to me. When the universe decides to try to knock you around some, there's no better feeling than knowing you've got great friends.

16:11 < colleague> if they do a sequel I so dearly hope ben stein and charlie sheen aren't invited
16:11 < mhoye> "... Drugs?"
16:11 < mhoye> I think they have to be.
16:14 < second_colleague> why no ben stein?
16:14 < other_colleague> cause he's gone INSANE
16:16  * mhoye thinks they should swap roles.
16:16 < colleague> yeah, ben stein took a leap off the pier of reason a few years ago
16:16 < colleague> what with that anti-evolution movie, etc.
16:17 < other_colleague> "who stole ben stein's brain?"
16:19 < mhoye> A beat down, leather-clad, exhausted looking Ben Stein, sitting in a police station, turns his bruised hangover towards Jennifer Grey, and mutters "... Drugs?"
16:19 < colleague> perfect
16:25 < mhoye> Earlier in the movie a pale, drawn Charlie Sheen, his skin drumhead-taut from years off staving off a sudden transformative collapse into becoming Keith Richards, stands in front of a class of middle-aged losers in an adult high-school trying desperately to act bored and boring and failing miserably. His eyes dart around the room like a cornered animals'; he practically vibrates in place, grinding his clenched teeth together as he slowly mutters the words "Beuller? Beuller? Beuller?" over and over, desperate to hear somebody, anybody say 'cut'.
16:28 < mhoye> Meanwhile in a trailer somewhere a resigned Jeffrey Jones sits with a half-empty bottle of rye, wearing a pre-tattered suit, a scorched bowtie and the black eye makeup grafted onto his cheeks three hours ago, waiting for the knock on the door that means he's going to get pulled through the thresher again.
16:29 < mhoye> Honestly, the making-of movie here could be far, far better than the movie itself.

Seriously. A documentary about the making of a middle-aged sequel to a much-loved teen movie has the potential to be some of the darkest comedy, the most grimly existential filmmaking the world has ever seen. “Ferris’ Wheel”, I’d call it, in the spirit of Jacob’s Ladder.

UPDATE: It's just a Super Bowl ad. That's about as saddening as possible.

“By the way, if anyone here is in marketing or advertising, kill yourself.”

– Bill Hicks.

Added the “losers”, “hate” and “fail” tags.