
A friend of mine points me to this incredible New York Times article in which publishers lay out the fact that they are fundamentally opposed to public libraries, detailing their struggles as they take up arms against these nefarious institutions promoting such injustices as culture, literacy and the greater public good.

Ms. Thomas of Hachette says: “We’ve talked with librarians about the various levers we could pull,” such as limiting the number of loans permitted or excluding recently published titles. She adds that “there’s no agreement, however, among librarians about what they would accept.”

It’s really a great article, full of these little turns of phrase that seem to come out of publishers’ mouths without them even realizing how evil they sound. “There’s no agreement among librarians to bend themselves, the public and the greater good over this barrel we’ve offered to sell them at a very reasonable rate”, they don’t quite say.

HarperCollins was brave to tamper with the sacrosanct idea that a library can do whatever it wishes with a book it obtains.

This sacrosanct idea is better known as the First-Sale Doctrine; those crafty librarians, always falling back on things like “established law” and “century-old Supreme Court decisions” to make their case. Crazytimes, right?

But that’s not the best bit:

David Young, Hachette’s chief executive, says: “Publishers can’t meet to discuss standards because of antitrust concerns. This has had a chilling effect on reaching consensus.”

Mr. Young lays it flat out: that laws prohibiting anticompetitive collusion and price-fixing are having a “chilling effect” on major publishers’ attempts to collude, fix prices and thwart competition.

I can’t imagine a functioning adult saying this with a straight face, but there it is. “Laws against doing evil things are having a chilling effect on the efforts of aspirant evildoers.” I’m sure it’s a problem for somebody, but as far as I’m concerned, mission accomplished, gold stars all ’round, well done laws and keep up the good work.

As has been noted many times, by many people, we’ve juiced up the entirely artificial copyright laws of the world to the point that if libraries weren’t already a centuries-old cultural institution, there’s no chance they’d ever be able to come into existence today. And here in this miraculous age of free-flowing information, that’s sad as hell.

According to Wolfram Alpha, there are 2.9 x 10^6 dietary calories in a cubic meter of cheese, 142829% of your recommended daily caloric intake.

Furthermore, there are 8.468×10^47 cubic meters in a cubic light year. From this, we can conclude that there are 2.455 x 10^54 dietary calories in a cubic light year of cheese.

According to NASA the sun produces 3.8 x 10^33 ergs/sec, or roughly 3.8 x 10^26 joules/sec. Over the course of a year that adds up to approximately 1.2 x 10^34 joules of energy.

One dietary calorie, or “kilocalorie”, equals about 4180 joules, which puts the cheese at roughly 1.03 x 10^58 joules. Doing the math, we conclude it will take about 8.6 x 10^23 years for our sun to generate the same amount of energy as a cubic light year of cheese.
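
If you’d rather check the arithmetic than take my word for it, here’s a quick back-of-the-envelope script. The input figures are the ones quoted above; the only things I’ve added are standard values for the length of a light year, the seconds in a year and the joules in a kilocalorie.

# Sanity check: a cubic light year of cheese versus the sun.
KCAL_PER_CUBIC_METER = 2.9e6      # dietary calories per cubic meter of cheese, per Wolfram Alpha
METERS_PER_LIGHT_YEAR = 9.461e15
JOULES_PER_KCAL = 4184.0
SOLAR_OUTPUT_WATTS = 3.8e26       # joules per second, per NASA
SECONDS_PER_YEAR = 3.156e7

cubic_meters = METERS_PER_LIGHT_YEAR ** 3               # ~8.47 x 10^47 m^3
cheese_kcal = KCAL_PER_CUBIC_METER * cubic_meters       # ~2.46 x 10^54 kcal
cheese_joules = cheese_kcal * JOULES_PER_KCAL           # ~1.03 x 10^58 J

solar_joules_per_year = SOLAR_OUTPUT_WATTS * SECONDS_PER_YEAR   # ~1.2 x 10^34 J per year
print("Years for the sun to match the cheese:",
      cheese_joules / solar_joules_per_year)            # ~8.6 x 10^23 years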

Be warned, however, that at 977 kilograms per cubic meter, or 8.27 × 10^50 kilograms per cubic light year, the Schwarzschild radius of a cubic light year of cheese would be 1.23 × 10^24 meters, significantly greater than the 9.46 x 10^15 meters in a light year. From this we can conclude that a cubic light year of cheese, should that somehow manifest itself, would immediately collapse into a black hole.
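
The black-hole claim checks out the same way, using the textbook Schwarzschild radius formula, r = 2GM/c^2:

# Would a cubic light year of cheese fit inside its own Schwarzschild radius? Not even close.
G = 6.674e-11                     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                       # speed of light, m/s
CHEESE_DENSITY = 977              # kg per cubic meter
METERS_PER_LIGHT_YEAR = 9.461e15

mass = CHEESE_DENSITY * METERS_PER_LIGHT_YEAR ** 3      # ~8.27 x 10^50 kg
schwarzschild_radius = 2 * G * mass / C ** 2            # ~1.23 x 10^24 m

print("Schwarzschild radius:", schwarzschild_radius, "m")
print("One light year:      ", METERS_PER_LIGHT_YEAR, "m")
# The radius comes out about a hundred million times bigger than the cheese itself.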

So while you would think a cubic light year of cheese would be the obvious choice over the sun, if you are presented with a choice between them, the numbers suggest you would be far better off choosing the sun.

These numbers assume cheese of approximately constant density. Swiss cheeses require much more sophisticated modelling.

(This article has been updated to reflect a comment from Jin, seen below, who notes that Wolfram returns dietary-calorie units, which is to say kilocalories, rather than simply calories. The original claim, that it would take the sun 1.7 x 10^17 years to generate the same amount of energy as is contained in a cubic light-year of cheese, was inaccurate, and has been corrected above. The author sincerely regrets any inconvenience this may have caused.)

I’ve read me some books recently, Ready Player One and two of the Last Chronicles Of Thomas Covenant, and they could not possibly be more different.

Ready Player One is a nerdculture bender of a book, about as hard to hate while you’re in the middle of it as it is to love in hindsight; it’s young adult literature for people who were born in the late seventies and haven’t really grown up yet. Of which I am apparently one, it has become clear, but you’re still left with the sense that you’re reading a Cory Doctorow book whose discerning virtue is that the lead isn’t a thinly-veiled Cory Doctorow. Which is a huge, huge improvement, make no mistake, but it’s still relentless, pandering fanservice.

I enjoyed it anyway, I think not because it was strictly good and certainly not because it’s without other flaws, but because it’s targeted with such mathematical precision at my child-of-the-eighties-whose-parents-could-afford-a-PC demographic that I felt obliged to at least appreciate the craft.

Even so, I’ve often said that some works don’t age well but this is the first time I’ve ever felt that way about something in less than a month.

The various Chronicles Of Thomas Covenant, veterans of that series will agree, are the exact opposite of self-congratulatory nerdpop. They differ from RPO in every imaginable respect, but maybe the most important distinction is that the primary characters absolutely, relentlessly hate themselves, loathing their own dispositions and actions at baroquely-detailed length at every pause in the narrative’s forward motion. It’s not even a little unusual for a character to spend half a page considering how terrible they are and how miserable they’ve made everyone else, shimmed into the space between somebody asking them a question and their answering it. But Donaldson’s built a solid career out of this signature combination of nuclear-winter morality and arcane linguistic affect, so, much like Ready Player One, enjoying it seems less important at times than respecting the craft. Having said that, the depth of the world and the breadth of the landscape are great; the world-building and supporting cast are fantastic, getting all the good lines and stealing all the best scenes. It’s smartly written and compelling enough to more than make up for the lead characters spending so much time wallowing in their own self-loathing.

But like every reference in Ready Player One, I was introduced to this series very young. There’s a saying that the Golden Age of Science Fiction is “twelve”; I have a sense that in both cases these stories aren’t really getting their hooks into me, just tying into the anchors anyone my age had bolted on decades ago. Does that matter, if I enjoy it regardless? I’m not sure, but I get the sense that it does, and it feels like cheating.

The Window

You’re no doubt familiar with the old horror-movie bit of the walking, lumbering monster being able to chase down a victim who’s running hard to get away from them. You know the drill: it doesn’t matter how hard, fast or far they’ve run, they could have the stamina of a marathoner and the speed of a sprinter: the moment they stop to catch their breath the monster is there, chainsaw, claws, mandibles or lurching undeadness to hand.

I’ve long thought that classic scares like that come from some common antecedent lodged deep in the collective unconscious, the common experiences that so many of us unsuspectingly share. But I hadn’t really thought about where that particular one might come from until I was trying to catch up with my daughter as she took off down the block, running flat out as fast as a two-year-old can go, while I walked after her at a stately pace and caught her without particular effort.

So if you’re wondering what the original of that particular horror trope is, there you are.

It’s me.

I’ve mentioned this sort of thing before, but nevertheless: this is a really terrible piece of writing.

I remain convinced that the best way to stop a bully is not to go mewling to the teacher, who will only call the victim’s mummy, or to your own mummy, who will only call the teacher. The best way is to take the bully out for a short pounding after school – and may I make it plain, please, that I don’t mean the victims should do this, but rather others. The onus for stopping bullies lies not with the people being bullied, but with those who see it happen.

There’s much to find reprehensible here, not the least of which is the “I have lots of gay friends” non-defense. And it’s wholly unsurprising to find the National Post giving somebody a pulpit to tell us why beating up children is good for the children, and for society. But the thing that struck me about it was how the writing calls to mind this 1957 picture of one of the Little Rock Nine, Elizabeth Eckford, and, more importantly, of Hazel Bryan in the background, dripping with hate, screaming at the future.

It’s all that came to mind when I was reading Blatchford’s article; it’s crystal clear that the author has never given or received the abuse she advocates. She’d never deign to get any of the blood she’s calling for on her actual hands. She’s a deniable, blameless part of the mob, shrieking for violence to be meted out by somebody else for no better reason than wanting to watch.

The worst thing about Linux, bar none, is the “I’ve been stupid” feeling you get when you only just now find the simple tool that’s been around forever, the one that solves some nagging problem you’ve had for just as long. I swear to God, I did not know what “cd -” did (it flips you back to whatever directory you were in last) until just now.

Mental note: look for those solutions much sooner than later. They’re in there.

Danger

I had a race condition eat my week. Stick around, I’ll tell you what that’s about.

Here’s some IRC, with some comments.

10:29 < mhoye> Oh, man. I think I've finally fixed this bug.
10:30 < mjschranz> mhoye: What was the bug?
10:30 < mhoye> Multi-user, multi-stack web services are the most horrible thing.
10:30 < mjschranz> mhoye: Sounds like a nightmare.
10:32 < mhoye> mjschranz: When we built this Firefox customizing thing, we built a pretty clever way to scan a directory tree and construct an .MSI file out of the result.
10:33 < mhoye> The software stack is a trail of tears. web -> php -> python -> wine -> mono (the .net emulator) -> emulated DOSSHELL commands  and then all the way back.

I promise you it’s necessary. I’ve stared at this stuff for a long time, and there’s no other way. I overlooked bash and MySQL in that list, too. Go, me.

10:34 < mhoye> And at one point, we decided to hardcode in some work directory names.
10:36 < mhoye> the MSI-building tools have this awful bug in them such that the absolute path of a Linux filename, as munged through that stack and turned into a dosshell filename, has to be less than 96 characters.
10:37 < mhoye> So we put everything in /tmp/m - Nice and short, and Firefox's tree is never so deep that we'll get close, so we should be OK.
10:37 < mhoye> But it's a multiuser system.
10:37 < mhoye> What happens if two people happen to try building stuff in /tmp/m at the same time?
10:38 < mhoye> I was staring at that code the other day, on the verge of giving accounts to clients when it dawned on me.

If there’s one thing I’m proud of here, it’s that I saw this in the code before a customer saw it on their screen.

Imagine you’re wrapping gifts for a department store. The process is simple: you take the gift, you get a box, you put it in the box. You wrap the box, you hand it to the client. Nothing to it.

But imagine if two people are using the space to do that, at the same time; you’re both racing to see who gets to which step first, to see who gets to use that space. And because these two people aren’t people at all, but processes on a computer, they don’t just nod, wait and take turns of their own initiative; they just follow the process. And whoever “wins” the race, getting there first, gets their workspace flattened as the other process moves their own work into it.

You take a gift. The other guy takes a gift. Both of you find a box that fits your gift, and one of you puts it on the table first. And then the second guy puts theirs on the table.

What happens then? Does the second box get put inside the first one, if it’s bigger, or stacked on top of it, if it’s smaller? Does it just get brushed aside?

And then it gets worse, because then the gifts get dropped into the mix. Maybe a too-large gift smushes the smaller box, maybe both get put in the same box, maybe the small one gets put in the larger box before being crushed under the larger gift. Maybe one of them got wrapped already, if one of the processes thinks it’s that far ahead, and then gets a new, unwrapped gift dropped on top of it.

Shortly, whatever mess is left on that table gets handed by one of the processes to one of the clients. You can never be sure which one, or what state that package is in, but you’re virtually guaranteed that it’s not something the client wanted. And better yet, the other process can then turn around and maybe wrap up a bunch of empty space where the gift used to be, and hand that empty wad of wrapping paper off to their customer. Assuming some third client hasn’t come along to use that space at the same time. Or maybe forty more clients.

That class of problem is what’s called a “race condition”, and race conditions are some of the most subtle, pernicious and difficult problems in any kind of process management, software or otherwise.
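
If you want to watch this class of failure happen in miniature, here’s a toy sketch of it in Python. It is not our actual build code; it’s just two pretend builds sharing one hardcoded scratch directory, each one flattening and repopulating it, and each one shipping whatever happens to be in there when it finishes.

# Two pretend builds sharing one hardcoded work directory. Run it a few times.
import os, random, shutil, threading, time

WORKDIR = "/tmp/m"   # the shared, hardcoded directory

def build(user, payload):
    # Flatten and repopulate the shared directory...
    shutil.rmtree(WORKDIR, ignore_errors=True)
    os.makedirs(WORKDIR, exist_ok=True)
    with open(os.path.join(WORKDIR, "contents.txt"), "w") as f:
        f.write(payload)
    time.sleep(random.random())      # "building": plenty of time to get stomped on
    # ...then ship whatever is sitting in it now.
    try:
        with open(os.path.join(WORKDIR, "contents.txt")) as f:
            print(user, "ships a package containing:", repr(f.read()))
    except FileNotFoundError:
        print(user, "ships an empty box.")

a = threading.Thread(target=build, args=("alice", "alice's build"))
b = threading.Thread(target=build, args=("bob", "bob's build"))
a.start(); b.start(); a.join(); b.join()

Sometimes both builds ship what they meant to; sometimes one of them ships the other’s payload, or an empty box. You can’t know which, and that’s the whole problem.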

10:39 < mhoye> the _best case_ scenario there is that it all just goes to hell.
10:39 < mhoye> Everything breaks for no obvious reason.
10:39 < mhoye> That's the best case.
10:39 < mhoye> The worst case is that one of my clients is given _somebody else's_ .MSI.
10:43 < mhoye> The real problem turned out to be "finding everywhere we've hardcoded that directory name, and figuring out how to pass that variable around".
10:44 < mhoye> And it was hidden in some places that, in retrospect, seem obviously nuts.

I’m sure all of this would have been easier if I was smarter.

One of the worst was finding that we’d put “-om” in the arguments we pass to a decompression tool, which actually means “output to directory m”, not “using optional feature om”, instead of taking a few extra minutes to define a variable somewhere and build the command string. Stupid, stupid. We did it months ago, too; I looked right at it when we did, with both eyes, I know better than to do that, and I did it anyway. My eyes skittered over that line for hours without latching onto it, either; embarrassing.
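
For what it’s worth, the shape of the fix is the boring one you’d expect, something roughly like this. It’s a sketch rather than our actual code; the function name and the 7-Zip-style “-o<dir>” flag are stand-ins for whatever the real pipeline and decompression tool use.

# A unique work directory per build, passed around explicitly instead of a
# hardcoded "/tmp/m". Sketch only; the extraction call assumes a 7-Zip-style
# -o<dir> output flag.
import shutil, subprocess, tempfile

def build_installer(archive_path):
    # Short prefix, because the munged DOS-style paths downstream have to stay
    # under 96 characters; mkdtemp guarantees nobody else is using the directory.
    workdir = tempfile.mkdtemp(prefix="m", dir="/tmp")   # e.g. /tmp/mk3q9z2
    try:
        # The output directory comes from the variable, not a hardcoded "-om".
        subprocess.run(["7z", "x", archive_path, "-o" + workdir], check=True)
        # ...the rest of the pipeline takes workdir as an argument from here on...
        return workdir
    except Exception:
        shutil.rmtree(workdir, ignore_errors=True)
        raise

mkdtemp is doing the real work there; everything after that is just plumbing the path through instead of assuming it.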

10:47 < mhoye> Anyway, rookie mistake solved.
10:47 < mhoye> But the lesson here is that knowing whether or not 2+ people will be using a program at the same time is a profoundly important design decision.
10:48 <@humph> so true
10:50 < jbuck> mhoye: 0, 1 or infinity users :)
10:51 < mhoye> And nobody gets paid for zero, and it's rare to get paid for one, yeah.
10:58 < mhoye> Man.
10:59 < mhoye> Git commit and git push have never felt this good.

One more reminder that “works for me” isn’t worth all that much. I’m going to chalk this one up to experience, and man, I’m glad this week is nearly done.