weird

Cuban Shoreline

I tried to explain to my daughter why I’d had a strange day.

“Why was it strange?”

“Well… There’s a thing called a cryptocurrency. ‘Currency’ is another word for money; a cryptocurrency is a special kind of money that’s made out of math instead of paper or metal.”

That got me a look. Money that’s made out of math, right.

“… and one of the things we found today was somebody trying to make a new cryptocurrency. Now, do you know why money is worth anything? It’s a coin or a paper with some ink on it – what makes it ‘money’?”

“… I don’t know.”

“The only answer we have is that it’s money if enough people think it is. If enough people think it’s real, it becomes real. But making people believe in a new kind of money isn’t easy, so what this guy did was kind of clever. He decided to give people little pieces of his cryptocurrency for making contributions to different software projects. So if you added a patch to one of the projects he follows, he’d give you a few of these math coins he’d made up.”

“Um.”

“Right. Kind of weird. And then whoever he is, he wrote a program to do that automatically. It’s like a little robot – every time you change one of these programs, you get a couple of math coins. But the problem is that we update a lot of those programs with our robots, too. When our scripts – our robots – run, his robots try to give our robots some of his pretend money.”

“…”

“So that’s why my day was weird. Because we found somebody else’s programs trying to give our programs made-up money, in the hope that this made-up money would someday become real.”

“Oh.”

“What did you do today?”

“I painted different animals and gave them names.”

“What kind of names?”

“French names like zaval.”

“Cheval. Was it a good day?”

“Yeah, I like painting.”

“Good, good.”

(Charlie Stross warned us about this. It’s William Gibson’s future, but we still need to clean up after it.)

Well, we have to get back to making jokes at some point. I bought new glasses from the internet.

It didn’t go exactly as I’d hoped.

I may revisit this later. Consider this a late draft. I’m calling this done.

“Should array indices start at 0 or 1? My compromise of 0.5 was rejected without, I thought, proper consideration.” — Stan Kelly-Bootle

Sometimes somebody says something to me, like a whisper of a hint of an echo of something half-forgotten, and it lands on me like an invocation. The mania sets in, and it isn’t enough to believe; I have to know.

I’ve spent far more effort than is sensible this month crawling down a rabbit hole disguised, as they often are, as a straightforward question: why do programmers start counting at zero?

Now: stop right there. By now your peripheral vision should have convinced you that this is a long article, and I’m not here to waste your time. But if you’re gearing up to tell me about efficient pointer arithmetic or binary addition or something, you’re wrong. You don’t think you’re wrong and that’s part of a much larger problem, but you’re still wrong.

For some backstory, on the off chance anyone still reading by this paragraph isn’t an IT professional of some stripe: most computer languages – including C/C++, Perl, Python, some (but not all!) versions of Lisp, and many others – are “zero-origin” or “zero-indexed”. That is to say, in an array A with 8 elements in it, the first element is A[0], and the last is A[7]. This isn’t universally true, though, and other languages from the same (and earlier!) eras are sometimes one-indexed, going from A[1] to A[8].
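(If you don’t write code, a small C example might make that concrete; the array contents here are made up for illustration.)

    #include <stdio.h>

    int main(void) {
        int a[8] = {10, 20, 30, 40, 50, 60, 70, 80};
        printf("first element: a[0] = %d\n", a[0]);  /* the first of 8 elements */
        printf("last element:  a[7] = %d\n", a[7]);  /* ...and the last */
        return 0;
    }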

While it’s a relatively rare practice in modern languages, one-origin arrays certainly aren’t dead; there’s a lot of blood pumping through Lua these days, not to mention MATLAB, Mathematica and a handful of others. If you’re feeling particularly adventurous, Haskell apparently lets you pick your poison at startup, and in what has to be the most lunatic thing I’ve seen on a piece of silicon since I found out the MIPS architecture had runtime-mutable endianness, Visual Basic (up to v6.0) featured the OPTION BASE flag, letting you flip that coin on a per-module basis. Zero- and one-origin arrays in different corners of the same program! It’s just software, why not?

All that is to say that starting at 1 is not an unreasonable position at all; to a typical human, thinking about the zeroth element of an array doesn’t make any more sense than trying to catch the zeroth bus that comes by, but we’ve clearly ended up here somehow. So what’s the story there?

The usual arguments involving pointer arithmetic and incrementing by sizeof(struct) and so forth describe features that are nice enough once you’ve got the hang of them, but they’re also post-facto justifications. This is obvious if you take even the most cursory look at the history of programming languages: C inherited its array semantics from B, which inherited them in turn from BCPL, and though BCPL arrays are zero-origin, the language doesn’t support pointer arithmetic, much less data structures. On top of that, other languages that antedate BCPL and C aren’t zero-indexed. Algol 60 uses one-indexed arrays, and arrays in Fortran are arbitrarily indexed – they’re just a range from X to Y, and X and Y don’t even need to be positive integers.

So by the early 1960’s, there are three different approaches to the data structure we now call an array.

  • Zero-indexed, in which the array index carries no particular semantics beyond its implementation in machine code.
  • One-indexed, identical to the matrix notation people have been using for quite some time. It comes at the cost of a CPU instruction to manage the offset; usability isn’t free. (There’s a short sketch of that cost just after this list.)
  • Arbitrary indices, in which the range is significant with regards to the problem you’re up against.
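To make that middle point concrete, here’s a minimal sketch in C – a hypothetical AT1 macro of my own invention, not anything from a real compiler – that simulates one-origin indexing on top of C’s zero-origin arrays. The “- 1” is the offset being managed; it has to be paid for somewhere, either at run time or, as we’ll see shortly, at compile time.

    #include <stdio.h>

    /* Hypothetical helper: one-origin subscripting layered over C's
     * zero-origin arrays. Every access carries the "- 1". */
    #define AT1(arr, i) ((arr)[(i) - 1])

    int main(void) {
        int a[8] = {10, 20, 30, 40, 50, 60, 70, 80};
        printf("%d\n", AT1(a, 1));  /* first element: prints 10 */
        printf("%d\n", AT1(a, 8));  /* last element:  prints 80 */
        return 0;
    }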

So if your answer started with “because in C…”, you’ve been repeating a good story you heard one time, without ever asking yourself if it’s true. It’s not about i = a + n*sizeof(x), because pointers and structs didn’t exist. And that’s the most coherent argument I can find; there are dozens of other arguments for zero-indexing involving “natural numbers” or “elegance” or some other unresearched hippie voodoo nonsense that are either wrong or too dumb to rise to the level of wrong.

The fact of it is this: before pointers, structs, C and Unix existed, at a time when other languages with a lot of resources and (by the standard of the day) user populations behind them were one- or arbitrarily-indexed, somebody decided that the right thing was for arrays to start at zero.

So I found that person and asked him.

His name is Dr. Martin Richards; he’s the creator of BCPL, now almost 7 years into retirement; you’ve probably heard of one of his doctoral students, Eben Upton, creator of the Raspberry Pi. I emailed him to ask why he decided to start counting arrays from zero, way back then. He replied that…

As for BCPL and C subscripts starting at zero. BCPL was essentially designed as a typeless language close to machine code. Just as machine code registers are typically all the same size and contain values that represent almost anything, such as integers, machine addresses, truth values, characters, etc., BCPL has typeless variables, just like machine registers, capable of representing anything. If a BCPL variable represents a pointer, it points to one or more consecutive words of memory. These words are the same size as BCPL variables. Just as machine code allows address arithmetic, so does BCPL, so if p is a pointer, p+1 is a pointer to the next word after the one p points to. Naturally p+0 has the same value as p. The monadic indirection operator ! takes a pointer as its argument and returns the contents of the word pointed to. If v is a pointer, !(v+i) will access the word pointed to by v+i. As i varies from zero upwards we access consecutive locations starting at the one pointed to by v when i is zero. The dyadic version of ! is defined so that v!i = !(v+i). v!i behaves like a subscripted expression, with v being a one-dimensional array and i being an integer subscript. It is entirely natural for the first element of the array to have subscript zero. C copied BCPL’s approach using * for monadic ! and [ ] for array subscription. Note that, in BCPL, v!5 = !(v+5) = !(5+v) = 5!v. The same happens in C: v[5] = 5[v]. I can see no sensible reason why the first element of a BCPL array should have subscript one. Note that 5!v is rather like a field selector accessing a field in a structure pointed to by v.
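You can check the C end of that equivalence yourself; this little program (mine, not Dr. Richards’) demonstrates that v[i] is defined as *(v + i), and that subscripting therefore commutes, exactly as he describes.

    #include <stdio.h>

    int main(void) {
        int v[8] = {10, 11, 12, 13, 14, 15, 16, 17};

        /* C's v[i] is defined as *(v + i), the direct descendant of
         * BCPL's v!i = !(v + i). */
        printf("%d\n", v[5]);      /* prints 15 */
        printf("%d\n", *(v + 5));  /* prints 15: the same access, spelled out */

        /* And because v + 5 == 5 + v, the subscript commutes: */
        printf("%d\n", 5[v]);      /* prints 15: legal, if deeply unidiomatic, C */
        return 0;
    }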

This is interesting for a number of reasons, though I’ll leave their enumeration to your discretion. The one that I find most striking, though, is that this is the earliest example I can find of the understanding that a programming language is a user interface, and that there are difficult, subtle tradeoffs to make between resources and usability. Remember, all this was at a time when everything about the future of human-computer interaction was up in the air, from the shape of the keyboard and the glyphs on the switches and keycaps right down to how the ones and zeros were manifested in paper ribbon and bare metal; this note by the late Dennis Ritchie might give you a taste of the situation, where he mentions that five years later one of the primary reasons they went with C’s square-bracket array notation was that it was getting steadily easier to reliably find square brackets on the world’s keyboards.

“Now just a second, Hoye”, I can hear you muttering. “I’ve looked at the BCPL manual and read Dr. Richards’ explanation and you’re not fooling anyone. That looks a lot like the efficient-pointer-arithmetic argument you were frothing about, except with exclamation points.” And you’d be very close to right. That’s exactly what it is – the distinction is where those efficiencies take place, and why.

BCPL was first compiled on an IBM 7094 – here’s a picture of the console, though the entire computer took up a large room – running CTSS, the Compatible Time Sharing System, which antedates Unix much as BCPL antedates C. There’s no malloc() in that context, because there’s nobody to share the memory core with. You get the entire machine and the clock starts ticking, and when your wall-clock time block runs out, that’s it. But here’s the thing: in that context, none of the offset calculations we’re supposedly economizing happen at execution time. All that work is done ahead of time by the compiler.

You read that right. That sheet-metal, “wibble-wibble-wibble” noise your brain is making is exactly the right reaction.

Whatever justifications or advantages came along later – and it’s true, you do save a few processor cycles here and there, and that’s nice – the reason we started using zero-indexed arrays was that it shaved a couple of processor cycles off of a program’s compilation time. Not execution time; compile time.

Does it get better? Oh, it gets better:

IBM had been very generous to MIT in the fifties and sixties, donating or discounting its biggest scientific computers. When a new top of the line 36-bit scientific machine came out, MIT expected to get one. In the early sixties, the deal was that MIT got one 8-hour shift, all the other New England colleges and universities got a shift, and the third shift was available to IBM for its own use. One use IBM made of its share was yacht handicapping: the President of IBM raced big yachts on Long Island Sound, and these boats were assigned handicap points by a complicated formula. There was a special job deck kept at the MIT Computation Center, and if a request came in to run it, operators were to stop whatever was running on the machine and do the yacht handicapping job immediately.

Jobs on the IBM 7090, one generation behind the 7094, were batch-processed, not timeshared; you queued up your job along with a wall-clock estimate of how long it would take, and if it didn’t finish it was pulled off the machine, the next job in the queue went in and you got to try again whenever your next block of allocated time happened to be. As in any economy, there is a social context as well as a technical context, and it isn’t just about managing cost, it’s also about managing risk. A programmer isn’t just racing the clock, they’re also racing the possibility that somebody will come along and bump their job and everyone else’s out of the queue.

I asked Tom Van Vleck, author of the above paragraph and also now retired, how that worked. He replied in part that on the 7090…

“User jobs were submitted on cards to the system operator, stacked up in a big tray, and a rudimentary system read, loaded, and ran jobs in sequence. Typical batch systems had accounting systems that read an ID card at the beginning of a user deck and punched a usage card at end of job. User jobs usually specified a time estimate on the ID card, and would be terminated if they ran over. Users who ran too many jobs or too long would use up their allocated time. A user could arrange for a long computation to checkpoint its state and storage to tape, and to subsequently restore the checkpoint and start up again.

The yacht handicapping job pertained to batch processing on the MIT 7090 at MIT. It was rare — a few times a year.”

So: the technical reason we started counting arrays at zero is that in the mid-1960’s, you could shave a few cycles off of a program’s compilation time on an IBM 7094. The social reason is that we had to save every cycle we could, because if the job didn’t finish fast it might not finish at all and you never know when you’re getting bumped off the hardware because the President of IBM just called and fuck your thesis, it’s yacht-racing time.

There are a few points I want to make here.

The first thing is that as far as I can tell nobody has ever actually looked this up.

Whatever programmers think about themselves and these towering logic-engines we’ve erected, we’re a lot more superstitious than we realize. We tell and retell this collection of unsourced, inaccurate stories about the nature of the world without ever doing the research ourselves, and there’s no other word for that but “mythology”. Worse, by obscuring the technical and social conditions that led humans to make these technical and social decisions, by talking about the nature of computing as we find it today as though it’s an inevitable consequence of an immutable set of physical laws, we’re effectively denying any responsibility for how we got here. And worse than that, by refusing to dig into our history and understand the social and technical motivations for those choices, by steadfastly refusing to investigate the difference between a motive and a justification, we’re disavowing any agency we might have over the shape of the future. We just keep mouthing platitudes and pretending the way things are is nobody’s fault, and the more history you learn and the more you look at the sad state of modern computing the more pathetic and irresponsible that sounds.

Part of the problem is access to the historical record, of course. I was in favor of Open Access publication before, but writing this up has cemented it: if you’re on the outside edge of academia, $20/paper for any research that doesn’t have a business case and a deep-pocketed backer is completely untenable, and speculative or historic research that might require reading dozens of papers to shed some light on longstanding questions is basically impossible. There might have been a time when this was OK and everyone who had access to or cared about computers was already an IEEE/ACM member, but right now the IEEE – both as a knowledge repository and a social network – is a single point of a lot of silent failure. “$20 for a forty-year-old research paper” is functionally indistinguishable from “gone”, and I’m reduced to emailing retirees to ask them what they remember from a lifetime ago because I can’t afford to read the source material.

The second thing is how profoundly resistant to change or growth this field is, and apparently has always been. If you haven’t seen Bret Victor’s talk about The Future Of Programming as seen from 1975 you should, because it’s exactly on point. Over and over again as I’ve dredged through this stuff, I kept finding programming constructs, ideas and approaches we call part of “modern” programming if we attempt them at all, sitting abandoned in 45-year-old demo code for dead languages. And to be clear: that was always a choice. Over and over again tools meant to make it easier for humans to approach big problems are discarded in favor of tools that are easier to teach to computers, and that decision is described as an inevitability.

This isn’t just Worse Is Better, this is “Worse Is All You Get Forever”. How many off-by-one disasters could we have avoided if the “foreach” construct that existed in BCPL had made it into C? How much more insight would all of us have into our code if we’d put the time into making Michael Chastain’s nearly-omniscient debugging framework – PTRACE_SINGLESTEP_BACKWARDS! – work in 1995? When I found this article by John Backus wondering if we can get away from Von Neumann architecture completely, I wondered where that ambition to rethink our underpinnings went. But the fact of it is that it didn’t go anywhere. Changing how you think is hard and the payoff is uncertain, so by and large we decided not to. Nobody wanted to learn how to play, much less build, Engelbart’s Violin, and instead everyone gets a box of broken kazoos.

In truth maybe somebody tried – maybe even succeeded! – but it would cost me hundreds of dollars to even start looking for an informed guess, so that’s the end of that.

It’s hard for me to believe that the IEEE’s membership isn’t going off a demographic cliff these days as it ages, and it must be awful knowing they’ve got decades of delicious, piping-hot research cooked up that nobody is ordering while the world’s coders are lining up to slurp watery gruel out of a Stack-Overflow-shaped trough and pretend they’re well-fed. You might not be surprised to hear that I’ve got a proposal to address both those problems; I’ll let you work out what it might be.

I’ll level with you: I’m not very good at reading code.

I had an interview the other day that featured the dreaded read-this-code segment that’s inevitable in modernity, and reading somebody else’s Python without context, with a regex or two thrown in for kicks… I know there are people who can do that really well, but man, I’m not one of them.

To try and atone for how that went, I’ve written a thing I’ve been meaning to get done for a while: a kind of high-level analysis tool for Git repositories that can give you some suggestions based on historical commit information. It’s called gitcoach, and it’s over on GitHub if you’re interested.

The idea is that it takes a look at a project’s whole commit history to see what files tend to get modified at the same time, and then looks at what you’re working on now; if you’re working on some file Foo, gitcoach can tell you that, historically, anyone who’s had to change Foo has also changed Bar 92% of the time and Baz 80% of the time – so, no guarantees, but maybe you should look at those too.
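If you’d like the flavor of the idea without reading the real thing, here’s a back-of-the-envelope sketch in C – emphatically not gitcoach’s actual code, and every size and name in it is made up – that shells out to git log, groups filenames by commit, and reports how often each file has historically changed alongside a target file. The flat arrays and linear search are just to keep the sketch short.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    #define MAX_FILES   4096   /* arbitrary limits, for illustration only */
    #define MAX_TOUCHED 512

    static char names[MAX_FILES][256];  /* every filename seen in the log */
    static int  together[MAX_FILES];    /* commits touching names[i] AND the target */
    static int  nfiles = 0;

    /* Map a filename to a small integer; linear search is slow but simple. */
    static int intern(const char *name) {
        for (int i = 0; i < nfiles; i++)
            if (strcmp(names[i], name) == 0) return i;
        if (nfiles == MAX_FILES) { fprintf(stderr, "too many files\n"); exit(1); }
        strncpy(names[nfiles], name, 255);
        return nfiles++;
    }

    int main(int argc, char **argv) {
        if (argc != 2) { fprintf(stderr, "usage: %s <target-file>\n", argv[0]); return 1; }
        const char *target = argv[1];

        /* POSIX popen; this listing prints one blank-line-separated
         * block of filenames per commit. */
        FILE *log = popen("git log --name-only --pretty=format:", "r");
        if (!log) { perror("git log"); return 1; }

        char line[4096];
        int  touched[MAX_TOUCHED], n = 0, target_commits = 0;

        for (;;) {
            char *got = fgets(line, sizeof line, log);
            if (got) line[strcspn(line, "\n")] = '\0';
            if (!got || line[0] == '\0') {      /* end of one commit's file list */
                int saw_target = 0;
                for (int i = 0; i < n; i++)
                    if (strcmp(names[touched[i]], target) == 0) saw_target = 1;
                if (saw_target) {
                    target_commits++;
                    for (int i = 0; i < n; i++)
                        if (strcmp(names[touched[i]], target) != 0)
                            together[touched[i]]++;
                }
                n = 0;
                if (!got) break;
                continue;
            }
            if (n < MAX_TOUCHED) touched[n++] = intern(line);
        }
        pclose(log);

        /* "Anyone who changed Foo also changed Bar NN% of the time." */
        for (int i = 0; i < nfiles; i++)
            if (together[i] > 0)
                printf("%5.1f%%  %s\n", 100.0 * together[i] / target_commits, names[i]);
        return 0;
    }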

There’s more you can do with that data, perhaps obviously – the nice thing about the general idea is that whenever I mention it to somebody, they think of some other thing you can do with that data that I hadn’t even considered.

So that’s something.

It’s not a finished product – there are some known bugs and missing features listed in the README, and I’m sure some others I don’t see yet. But there it is, and hopefully it will be useful for people trying to find their way around big or new projects.

Sorry about the regex question, dude.

Today I learned that “I did your mom” jokes are the summit of cultured humour.

DEMETRIUS: Villain, what hast thou done?
AARON: That which thou canst not undo.
CHIRON: Thou hast undone our mother.
AARON: Villain, I have done thy mother.

That’s from Shakespeare’s Titus Andronicus, of all things.

I miss Nick; he would have looked at me like I was dumb for not knowing this.

Bloor Station

Sufficiently advanced fashion is indistinguishable from cosplay.

The obvious corollary to that is: fashion that is easily distinguished from cosplay is insufficiently advanced.

I mentioned this to somebody in passing the other day; today, my goodness, the Internet Provides:

If you wear a white coat that you believe belongs to a doctor, your ability to pay attention increases sharply. But if you wear the same white coat believing it belongs to a painter, you will show no such improvement.

So scientists report after studying a phenomenon they call enclothed cognition: the effects of clothing on cognitive processes.

It is not enough to see a doctor’s coat hanging in your doorway, said Adam D. Galinsky, a professor at the Kellogg School of Management at Northwestern University, who led the study. The effect occurs only if you actually wear the coat and know its symbolic meaning — that physicians tend to be careful, rigorous and good at paying attention.

The findings, on the Web site of The Journal of Experimental Social Psychology, are a twist on a growing scientific field called embodied cognition. We think not just with our brains but with our bodies, Dr. Galinsky said, and our thought processes are based on physical experiences that set off associated abstract concepts. Now it appears that those experiences include the clothes we wear.

See also, of course:

“It is a well known psychological fact that people’s behavior is strongly affected by the way they dress.”

But here, I’m going to do you one better: have you heard of endosymbiotic theory? It’s the idea that the internal structures of cells – and not just the bacteria in your gut, but the cells that make up a You – evolved partly by absorbing other organisms and hosting their processes internally, a symbiosis that eventually made them functionally indistinguishable from a single organism. Sort of the way you, looking through your eyes at this screen, feel like you’re functionally a single organism.

But you’re not. You’re colonies of symbiotic colonies all the way down. The consciousness you think of as you is an emergent pattern on the outside edge of a fractal stack of organic Matryoshka dolls. A consciousness you can arbitrarily game with cosplay, letting you temporarily absorb the psychological practices of a different stack of Matryoshka colonies symbiotically into your own.

There’s no you. You don’t exist. It’s cosplay all the way up and colonies all the way down.

Dress up a little.

More Of The Same

The one thing that gives me more of that bone-chilling existential dread than anything else in the world, the thing that makes me question the fundamental physical underpinnings of the universe and fear the answers, is code that stops working as you’re staring at it, at the exact moment you realize that it should never have worked in the first place.

Not cool, universe. Not cool at all.

Meta, Circular

I took this picture of Maya taking a picture of a Skype session with her grandfather, in which the camera on my computer embedded a picture of her in the corner of the picture his computer took of him holding up a picture of me from when I was 12 years old, holding a camera – all while thinking to myself, privately, that Douglas Hofstadter was, on reflection, a bit of a simpleton.

It took me a few minutes to shake this moment off, let me tell you.

16:11 < colleague> if they do a sequel I so dearly hope ben stein and charlie sheen aren't invited
16:11 < mhoye> "... Drugs?"
16:11 < mhoye> I think they have to be.
16:14 < second_colleague> why no ben stein?
16:14 < other_colleague> cause he's gone INSANE
16:16  * mhoye thinks they should swap roles.
16:16 < colleague> yeah, ben stein took a leap off the pier of reason a few years ago
16:16 < colleague> what with that anti-evolution movie, etc.
16:17 < other_colleague> "who stole ben stein's brain?"
16:19 < mhoye> A beat down, leather-clad, exhausted looking Ben Stein, sitting in a police station, turns his bruised hangover towards Jennifer Grey, and mutters "... Drugs?"
16:19 < colleague> perfect
16:25 < mhoye> Earlier in the movie a pale, drawn Charlie Sheen, his skin drumhead-taut from years of staving off a sudden transformative collapse into becoming Keith Richards, stands in front of a class of middle-aged losers in an adult high school trying desperately to act bored and boring and failing miserably. His eyes dart around the room like a cornered animal's; he practically vibrates in place, grinding his clenched teeth together as he slowly mutters the words "Bueller? Bueller? Bueller?" over and over, desperate to hear somebody, anybody say 'cut'.
16:28 < mhoye> Meanwhile in a trailer somewhere a resigned Jeffrey Jones sits with a half-empty bottle of rye, wearing a pre-tattered suit, a scorched bowtie and the black eye makeup grafted onto his cheeks three hours ago, waiting for the knock on the door that means he's going to get pulled through the thresher again.
16:29 < mhoye> Honestly, the making-of movie here could be far, far better than the movie itself.

Seriously. A documentary about the making of a middle-aged sequel to a much-loved teen movie has the potential to be some of the darkest comedy, the most grimly existential filmmaking the world has ever seen. “Ferris’ Wheel”, I’d call it, in the spirit of Jacob’s Ladder.

UPDATE: It’s just a Super Bowl ad. That’s about as saddening as possible.

“By the way, if anyone here is in marketing or advertising, kill yourself.”

– Bill Hicks.

Added the “losers”, “hate” and “fail” tags.

I don’t think I’m actually done with this, so just pretend it’s a late draft. I might try to tighten it up later, but here you go; I hope you’re interested. Yeah, this is still about Portal 2, so bear with me. It’s not like Gears Of War deserves to be dissected like this, you know?

I’ve been spending some time chasing this idea around in the bowels of the Aperture Science facility, taking copious notes as I wander through the middle bits of Portal 2 again. There’s some important context here that it may help to be familiar with, but just playing through Portal 1 and 2 should be plenty.

It’s probably because I’m sentimental, but to my mind an important thing about Quest- or FPS-RPGs that doesn’t get much attention, at least as far as video games are concerned, is that you actually are playing a role. Video games differ fundamentally from most narratives (and are closer to real life, in this sense) in that you are being allowed to shape a story and participate in a universe that you don’t fully own, and can’t fully command; the character whose role you play predates your presence in that space, and has a story that is in some sense theirs, reaching forward and back beyond your brief manipulation of their limbs and choices. Sometimes you need to take the time, wherever your character finds themselves – a dungeon, a running firefight, a ruined building or an open field – to do something that’s not relevant to your goals, or even to you personally, just to do some justice to the character you’re playing.

I found a lot of the “Rat Man’s Dens” on my first playthrough, being the sort of person who looks for the seams. Specifically, I found that corner of the facility where one of the radios, rather than playing the tinny Aperture-marimba, is playing The National’s “Exile Vilify”.

Did you find it? What did you do, then? It occurred to me as I sat there that this is the first piece of music we’ve really heard, in-game. But maybe, and maybe worse: there’s a decent chance that this slow lament about the burdens of alienation is actually the only song Chell has ever heard.

I wondered what that might do to a person, how suspicious they’d be to have found that thing in that place, and how they’d react. Is it even possible to guess how somebody might feel in that situation? I crouched down to stare at the radio, listening to it all the way through before going back to finish that test. It seemed appropriate. I doubt it had any effect on the game at all (but who can know, with Valve?) but I have a sense that my participation in the game was improved somehow by it, and it’s hard to argue with that metric.

Anyway, let’s get back on track here.

So apropos of nothing, or at least it was at the time, a few months ago I wrote about the implications of the cave in Plato’s well-known metaphor having its own agency. It’s odd that the idea would find some traction in a discussion about the plot of a video game but, I guess, where else?

The idea of immortality which appears in syncretistic religions of antiquity was introduced in late antiquity. The mysteries represented the myth of the abduction of Persephone from her mother Demeter by the king of the underworld Hades, in a cycle with three phases, the “descent” (loss), the “search” and the “ascent”, with main theme the “ascent” of Persephone and the reunion with her mother.

– Wikipedia on the Eleusinian Mysteries.

Here’s a question for you: how many protagonists are there in Portal 2? Chell, GLaDOS and Wheatley… three, right? And you’re resurrected in the midst of Aperture Science’s protracted decay, to be dropped into this forgotten, sealed-off subterranean wing of Aperture after GLaDOS and Wheatley’s first confrontation, to struggle back up the mine shaft and restore the status quo ante.

That’s the game, to a certain superficial approximation. And all of that has to be wrong; there are hundreds of little details in-game that put the lie to it. Portal 2 isn’t a simple or superficial game, not at all.

Though Demeter is often described simply as the goddess of the harvest, she presided also over the sanctity of marriage, the sacred law, and the cycle of life and death. She and her daughter Persephone were the central figures of the Eleusinian Mysteries that predated the Olympian pantheon.

– Wikipedia on Demeter

The first problem, as I mentioned earlier, is all these little things that are where they really shouldn’t be. At the very bottom of Test Shaft 09, once you’ve passed Abandonment Seal Zulu Bunsen and entered Aperture’s antechambers, you start to see the signs that these sealed-off and abandoned facilities aren’t nearly as sealed off or abandoned as you think. All the lights are still on, doors are still powered, and they’re still controlled by devices with clean, white lines and modern-era lens-blade Aperture logos on the side. Likewise the hazmat warning labels on the pipes and vats as you ascend from the depths: with modern warnings and modern logos, this isn’t the long-abandoned facility it seems to be at first glance.

There are other problems, like: in the last room before you ascend back through the containment door to modern Aperture, what activates that lift? You don’t. There are no switches, no panels; the door just closes and up you go. The same thing happens in the moments before you meet Wheatley again; there are stairs everywhere else, but here, for no architectural reason, a lift you don’t actuate yourself hoists you up to the entrance to the next chamber.

You can see where I’m going with this by now. There aren’t three protagonists here; there are four. Portal 2 doesn’t make sense unless you consider the Aperture Science facility itself as an agent in its own right.

And it gets weirder, because it seems likely that the Aperture facility is the manifestation of its creator, Cave Johnson.

When Wheatley slams you down the shaft that drops you into the bowels of Aperture, it’s worth asking: why is that shaft even there? There’s no structural reason for it, and when you get to the bottom of it, there’s nothing else down there with you. It has to be something else. Another question worth taking a good hard look at is, what are you actually doing while you’re down there?

In Greek mythology, Tartarus is both a deity and a place in the underworld. In ancient Orphic sources and in the mystery schools, Tartarus is also the unbounded first-existing entity from which the Light and the cosmos are born.

– Wikipedia, “Tartarus”

It’s pretty well-established that GLaDOS is the electronic (and likely the very much unwilling) reincarnation of Caroline, Cave Johnson’s personal assistant. It’s not much of a stretch to say that Chell is in all likelihood Caroline’s daughter, and likely by Cave. Indeed, partway through your ascent, you get a disturbing glimpse of Chell’s backstory when you come across a slew of science fair projects: the one with the hugely overgrown potato (whose shape bears a more-than-passing resemblance to that of GLaDOS, with its roots threading up into the ceiling) has two noteworthy details, one being the line that it involved a “special ingredient from daddy’s work”, and the other being that it’s signed “Chell”.

“For the record you are adopted and that’s terrible. Just work with me.”

– GlaDOS to Chell, Portal 2.

The chronology here is ambiguous, but Chell would have to have been between about six and ten years old to have made the potato battery project. Cave Johnson’s last recorded message in the Aperture Test Spheres said unambiguously that “If I die before you people can pour me into a computer, I want Caroline to run this place. Now she’ll argue, she’ll say she can’t – she’s modest like that. But you make her! Hell, put her in my computer, I don’t care.” If Bring Your Daughter To Work Day was when everything went wrong, it’s likely that Caroline, forcibly decanted into GLaDOS, had already been a victim of that process. GLaDOS stands for Genetic Lifeform and Disk Operating System; it’s not clear what being forced to be that genetic component entails, but the fact that GLaDOS physically resembles a bound, blindfolded and gagged woman is, I think, telling, and an important part of the story.

“Sorry boys, she’s married – to science!”

– Cave Johnson, introducing Caroline in his first recorded message.

The timing seems wrong – Chell is clearly a lot older than 10, likely in her mid-to-late 20s in-game, and it’s not clear when Cave Johnson died of the moon rock poisoning he suffered. “Daddy’s work” seems to imply that Chell’s father was still alive at the time, but it’s possible it means “from the place my Dad worked”, or created. Either way, it’s pretty clear given the chronology that Chell really was adopted, but not by any other parents; she was adopted by Aperture. And Aperture, with its vast, relentless complexity, its advanced technology including “brain mapping”, and its mad-genius CEO, is in a very real sense both a deity and a place.

One day they woke me up
So I could live forever
It’s such a shame the same will never happen to you.
You’ve got your short sad life left,
(That’s what I’m counting on.)
I used to want you dead but now I only want you gone.

– lyrics from Want You Gone, Portal 2’s concluding song, written by Jonathan Coulton

What you’re really doing as you ascend through the history of Aperture from the bottom of Test Shaft 09 is resurrecting Aperture itself; resurrecting Cave, and reconnecting him to Caroline again, forever. And even though Cave ordered Caroline forcibly decanted into GLaDOS, he may not have wanted the same for his daughter, and now that the reawakened Caroline knows who she really is and who you are, she may not actually want that either.

And that’s why you’re ultimately sent away, and why Portal 2 is a weirder, creepier game than it first appears; while you’ve been solving all of these puzzle-tests, you’ve also been resurrecting your doomed parents to their respective (terrible, captive) immortalities, in the end being sent away so that “the same will never happen to you”. You make the last ascent serenaded by the facility itself; they’re left alone together as you emerge from the facility to a blue sky and a field full of tall wheat. It’s sometime in early autumn – harvest season – and you’re off to see the world, with your scorched old Companion Cube as a last going-away present from your parents.