
Keep This Area Clear

Man, how awful is it to see people broken by the realization that they are no longer young. Why are you being cantankerous, newly-old person? It’s totally OK not to be 17 or 23, things are still amazing! Kids are having fun! You may not really understand it, but just roll with it! The stuff you liked when you were 17 isn’t diminished by your creeping up on 40!

This has been making the rounds, a lazy, disappointing article from Wired about the things we supposedly “learned about hacking” from the 1995 almost-classic, Hackers. It’s a pretty unoriginal softball of an article, going for a few easy smirks by cherrypicking some characters’ sillier idiosyncrasies while making the author sound like his birthday landed on him like a cartoon piano.

We need a word for this whole genre of writing, where the author tries far too hard to convince you of his respectable-grownup-hood by burning down his youth. It’s hard to believe that in fifteen years the cycle won’t repeat itself, with this article being the one on the pyre; you can almost smell the smoke already, the odor of burning Brut and secret regrets.

The saddest part of the article, really, is how much it ignores. Which is to say: just about everything else. There’s plenty of meat to chew on there, so I don’t really understand why; presumably it has something to do with deadlines or clickthroughs or word-counts or column inches or something, whatever magic words the writers at Wired burble as they pantomime their editor’s demands and sob into their dwindling Zima stockpile.

I’ve got quite a soft spot in my heart and possibly also my brain for this movie, in part because it is flat-out amazing how many things Hackers got exactly right:

  • Most of the work involves sitting in immobile concentration, staring at a screen for hours trying to understand what’s going on? Check.
  • It’s usually an inside job from a disgruntled employee? Check.
  • A bunch of kids who don’t really understand how severe the consequences of what they’re up to can be, in it for kicks? Check.
  • Grepping otherwise-garbage swapfiles for security-sensitive information? Almost 20 years later most people still don’t get why that one’s a check, but my goodness: check.
  • Social-engineering for that one piece of information you can’t get otherwise, it works like a charm? Check.
  • Using your computer to watch a TV show you wouldn’t otherwise be able to? Golly, that sounds familiar.
  • Dumpster-diving for source printouts? I suspect that for most of my audience “line printers” fit in the same mental bucket as “coelacanth”, and printing anything at all, much less code, seems kind of silly and weird by now, so you’ll just have to take my word for it when I say: very much so, check.
  • A computer virus that can affect industrial control systems, causing a critical malfunction? I wonder where I’ve heard that recently.
  • Abusive prosecutorial overreach, right from the opening scene? You’d better believe, check.

So if you haven’t seen it, Hackers is a remarkable artefact of its time. It’s hardly perfect; the dialog is uneven, and the invented slang aged about as well as invented slang always does. Moore’s Law has made anything with a number on the side look kind of quaint, and there’s plenty of that horrible neon-cars-on-neon-highways stuff that directors fall back on when they need to show you what the inside of a computer is doing. But really: look at that list. Look at it.

For all its flaws, sure, Hackers may not be something you’d hold aloft as a classic. But it’s good fun and it gets an awful lot more right than wrong, and that’s not nothing.

I’ll level with you: I’m not very good at reading code.

I had an interview the other day that featured the dreaded read-this-code segment that’s inevitable in modernity, and reading somebody else’s Python without context, with a regex or two thrown in for kicks… I know there are people who can do that really well, but man, I’m not one of them.

To try and atone for how that went, I’ve written a thing I’ve been meaning to get done for a while, a kind of high-level analysis tool for Git repositories that will be able to give you some suggestions based on historical commit information. It’s called gitcoach, and it’s over on github if you’re interested.

The idea is that it takes a look at a project’s whole commit history to see which files tend to get modified at the same time, and then looks at what you’re working on now; if you’re working on some file Foo, gitcoach can tell you that hey, historically anyone who’s had to change Foo has also changed Bar 92% of the time, and Baz 80% of the time. So, no guarantees, but I suggest you look at those too.
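If you want a feel for how that works without installing anything, here’s a rough shell sketch of the same idea – to be clear, this is not gitcoach’s actual code, and the script name is made up – that walks the commits touching a given file and counts what else changed alongside it:

#!/bin/sh
# cochange.sh (hypothetical name): rough sketch of the co-change idea.
# Usage: ./cochange.sh path/to/Foo
TARGET="$1"
TOTAL=$(git rev-list HEAD -- "$TARGET" | wc -l)
git rev-list HEAD -- "$TARGET" |
while read rev; do
    # list every file changed by each commit that also touched $TARGET
    git show --name-only --pretty=format: "$rev"
done | grep -v "^$" | grep -vx "$TARGET" | sort | uniq -c | sort -rn |
awk -v total="$TOTAL" '{ printf "%s  %d of %d commits (%.0f%%)\n", $2, $1, total, 100*$1/total }' |
head

gitcoach itself does more than this – it watches what you’re actually working on right now – but that counting is the heart of it.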

There’s more you can do with that data, perhaps obviously – the nice thing about the general idea is that whenever I mention it to somebody, they think of some other thing you can do with that data that I hadn’t even considered.

So that’s something.

It’s not a finished product – there are some known bugs and missing features listed in the README, and some others I’m sure I don’t see yet. But there it is, and hopefully it will be useful for people trying to find their way around a big or new project.

Sorry about the regex question, dude.

Let’s say you wanted to design a laptop, preinstalled with Linux and aimed at a linux-user audience. This is mostly my own notes about what I’d like, but I wouldn’t mind some feedback.

  • Pixel Qi screen, 14″ if possible, for power and daylight readability.
  • Arm SOC (Tegra?)
  • User-replaceable battery and RAM.
  • Ideally, off-the-shelf batteries. Are cellphone batteries now good enough that you could line up four of them to power a laptop? Could be, could be…
  • Casing that was meant to be disassembled, insofar as possible. Not junk, but not triwing-five-point-torx screws, either.
  • Built-in software-defined-radio-usable chips, two of them, and significant antenna.
  • Bluetooth 4, usb3. Wifi, obviously.
  • HDMI out. It’d be nice to know if a chip existed that could support HDMI-out and the Pixel Qi screens, that sounds like the best of both worlds.
  • Nonjunk touchpad.

What am I missing? Anything else?

I was idly looking over the shooting script for Men In Black the other day. It’s different from the movie in a lot of little ways that add up; as filmed it came out a fun, largely harmless sci-fi movie of no particular note, but the original script was quietly darker and more introspective than you’d expect. As an example, take the scene where Edwards (Will Smith) has been given until sun-up to decide if he’s in; he asks Kay (Tommy Lee Jones) “Is it worth it?”

On screen, Kay replies “It is. If you’re strong enough” as he walks away. But on the page, Kay’s explanation of the cost of signing up is a lot more personal.

EDWARDS – So what’s the catch?

KAY – What you’ll gain in perspective, you’ll lose in ways you’re too young to comprehend. You give up everything. Sever every human contact. No one will know you exist. Ever.

EDWARDS – Nobody?

KAY – You’re not even allowed a favorite shirt. There. That’s the speech I never heard. That’s the choice I never got.

EDWARDS – Hold up. You track me down, put me through those stupid-ass tests, now you’re trying to talk me out of it. I don’t get it.

KAY – You got ’til sun-up.

EDWARDS – Is it worth it?

KAY – You find out, you let me know.

Kev called me out yesterday, and he was not wrong.

Haters Gonna Tweet

I’m back to carrying around a Linux laptop these days, and all those old mixed feelings are still there. Power management is still really dodgy, the UI is a mess, lots of stuff that should Just Work by now either just doesn’t, or somehow stopped working for a while after a good run. Plug the wrong USB dingus into it, and if you close the lid it will try to cook itself; using it for day to day stuff isn’t wildly better than I remember it being a few years ago; I’m back to using it for reasons that I refer to as “principles”, stuff about information freedom and freedom-to vs. freedom-from, developing on the platform you’re deploying to, that sort of thing. They’re arguments that I can make pretty convincingly, even to myself sometimes, but there are days (like this one) that all that rhetoric seems like a thin veneer over some self-imposed variety of Stockholm Syndrome.

I have a paragraph here about how “It is better, the saying goes, to light a single candle than to curse the darkness”, which then goes on to something about being honest about your motivations, and maybe you’re lighting the candle to cover up the smell, not to push back the dark. It’s not really coming together for me, but that’s the broad strokes of it.

You get the idea.

But here’s a thing: my brother sent me a PDF he needed to quote some passages from, the usual horrible “PDF-full-of-scanned-JPEGs” garbage you find everywhere. He was losing patience with it, and all the OCR software he could find was either insanely expensive or useless junk or both.

But I know that pdftohtml will give me a numbered list of all those images, and after a few seconds of research with apt-cache, I found Tesseract-OCR, installed it and tested it against a small sample of pages to see if the output looked sane. It did; it doesn’t output to anything but a file, but that’s fine. So a quick for i in `seq` loop later, my laptop is quietly grinding out a text file a human can copy and paste from.
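For the record, that loop doesn’t need to be anything fancier than something like this – the file names and page count here are made up, they’ll be whatever the extraction step produced for your particular PDF:

#!/bin/sh
# Sketch of the OCR loop described above; "page-NNN.jpg" and the 450
# are hypothetical, substitute whatever pdftohtml actually spat out.
for i in $(seq -w 1 450); do
    # tesseract writes its text output to page-$i.txt
    tesseract "page-$i.jpg" "page-$i"
done
cat page-*.txt > book.txt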

The good parts of life with Linux are like that. Rich scripting languages, an incredibly broad and deep set of on-demand tools that have been hammered into shape over years or decades, job control that lets you just put it in the background (or on a server somewhere else) and let it run. Being able to chain tools together so that when you need something novel done you can just get it done, and not spend hours scavenging around the ’net for that one piece of random crapware that does 70% of the job; it’s so great.

Linux on a laptop has a set of problems that have existed for years; I know I’m bringing a lot of this on myself. If I were using a desktop, a lot of these hardware hangups might just disappear, by virtue of the fact that you never unplug a desktop or put it to sleep. And I know which way the trends are going here, too – the Free Software tools people care about seem to be finding their way over to OSX, with varying degrees of speed and elegance, but the rare design sensibilities that find their way back, seemingly via a cargo cult’s long march, don’t seem to be helping much. Bold efforts (like Gnome Shell) that set sail less than perfectly polished run rapidly aground on the shoals of the Linux community’s profound fear of change. And as for hardware, well. Um.

But: apt-get. Shell scripting. Pipes, job control. All the old, unglamorous tools, so many of them rock solid and reliable; they are incredibly great. Being able to just have the tools you need, to test and experiment and process and have the computer doing stuff for you instead of having to muck around manually basically all the time, it is so good. Being able to see the right-out-there experimental stuff, like software radio, gestating; amazing. MacPorts and (God help you) Homebrew are a thin gruel in comparison, and Windows has nothing to compare it to.

I feel like I’m typing with mittens on when they’re not around, like I’ve looked into my toolbox and found one oversize novelty Tonka screwdriver and a wrench made of Nerf. These are just the basic fundamental particles of getting anything done! How do people live without this stuff?

These days my everyday computing carry is ideologically polarized. My Macbook is mostly a desktop I can move to the couch, and I roll with a Thinkpad and an iPad. The small number of things I need to Just Work just work, and there’s room to (and yes, frequently, need to) tinker with the rest. I’m actually thinking about my next laptop being a non-Macbook running Linux, as crazy as that sounds. And if you’ve ever tried figuring out whether a laptop runs Linux correctly, well. You’ve read The Killing Joke, right?

So I don’t even know what direction that pendulum is swinging, now. I guess we’ll see again in a few years. I don’t know if it’s worth it; if I figure it out, I’ll let you know.

I asked the lazyweb: What’s the preferred SQL diff tool? I’d like to take two SQL dumps and get back an SQL file of the difference.

Sheeri Cabral delivers the answer: if you do your DB dump with the --skip-extended-insert option, you can use regular old diff to get you most of the way there. That doesn’t give you an SQL file you can use directly, but it gets you enough of the way there that it’ll do.
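In other words, something like this – the database and file names are placeholders, obviously:

#!/bin/sh
# --skip-extended-insert writes one INSERT per row, so plain old diff
# lines the dumps up row by row instead of one giant INSERT per table.
mysqldump --skip-extended-insert mydb > before.sql
# ... make your changes, then dump again ...
mysqldump --skip-extended-insert mydb > after.sql
diff -u before.sql after.sql > mydb-changes.diff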

Back in 2003 Raymond Chen, noted Microsoftie and venerable author of the excellent Old New Thing blog, wrote a bit about the propensity programmers had for, and problems caused by, reverse-engineering Microsoft’s APIs and hooking into them in unapproved ways:

“For example, BOZOSLIVEHERE was originally the window procedure for the edit control, with the rather nondescript name of EditWndProc. Then some people who wanted to use the edit control window procedure decide that GetWindowLong(GWL_WNDPROC) was too much typing, so they linked to EditWndProc directly. Then when Windows 2.0 (I think) removed the need to export window procedures, we removed them all, only to find that programs stopped working. So we had to put them back, but they got goofy names as a way of scolding the programs that were doing these invalid things.”

He’s a pretty good writer, and this stuff makes for a good story, but this “too much typing” line is… uncharacteristically disingenuous of him; the other side of that story was told, to put it mildly, a little differently.

The Microsoft of the day was the Microsoft that came to be known as the evil empire, and for good reason; the combination of a dominant market position, rapid growth across a growing number of markets and no compunction at all about using what their consulting and support arms had learned about your company to leverage their growth into your market segment was legitimate grounds for a healthy dose of fear.

If you wanted to sell software you used their compilers and their APIs to talk to their OS and you consulted their support when you had problems. So if they suddenly developed an interest in your market niche they had a pretty good idea what the shape of your business looked like already. And their ability to leverage that information was very real, so much so that Microsoft’s announcement that they had plans to eventually make a similar product was sometimes enough to run competitors out of business.

This era is where the term “FUD” comes from, also not for no reason.

Because Microsoft could, and would, run the full-court press on your market segment if they decided it was worth their while. Veterans of the technical wars of the day can vividly remember their surprise at walking through decompiled assembler and discovering that the reason their program’s performance was in the toilet was that the official, approved-for-general-consumption Win32 call did nothing more or less than run a delay loop before passing unchanged arguments into a private API. Not for any technical reason, but as a defensive posture; just to guarantee that you couldn’t build a product as well as Microsoft could, on the off chance that they woke up one morning and decided they wanted your niche.

So it really wasn’t about how long it took to type “GetWindowLong(GWL_WNDPROC)”; it was often the fact that, if you had to call that or something like it thirty-two thousand times and didn’t run that hack, your customer’s 386SX would spend twenty unresponsive minutes off in the weeds instead of fifteen seconds. Chen’s stories about having to reverse-engineer and accommodate poor programmer behavior are epic, and technically brilliant stories to be sure, but you should remember to read them in this light – these weren’t stupid programmers crawling up an unprotected stack for no reason. The Microsoft of the era just wasn’t a trustworthy collaborator. And for all the incredible, very-nearly-miraculous, brilliant work they’ve done maintaining backwards compatibility for applications doing horrible things, they brought an awful lot of that burden on themselves.

It took a protracted antitrust investigation, the long tenacity of free software and the rise of the Web (with Mozilla keeping that torch lit through some long, dark years), Apple and later the primacy of mobile to really push Microsoft to the margins of relevancy where they are today. They’re still huge, they’re not all that evil anymore and they legitimately make some great products, but nobody really cares. They’re not making much of a mark on the things people do care about these days, mostly the social and mobile spaces. People aren’t afraid of them anymore because what matters changed, and developers and customers largely moved on.

That was a long time coming, too. But it’s starting to look like somebody’s getting ready to pick up that ball and run with it. A challenger appears!

This is just one example, but it’s really been part of a trend recently, and a good one to point to: take a look at this web-based Angry Birds demo, if you can. You might not be able to – it doesn’t work in Firefox – but the thing is, everything in there runs just fine in Firefox. Google has just decided that it won’t; not for any technical reason – they check for a webkit-only CSS shim, and it works fine in Safari – but just to keep it from working in competing browsers. Classier still, through the magic of view-source you can see that indignity bundled up in a <div id="roadblock"> tag, a name I’d like to think gave somebody a moment’s pause, but I doubt it.

Larry Page said, back in the day, that Google wouldn’t put their own results ahead of other people’s because that would be bad for users, but that statement is apparently no longer operative. Likewise this 2009 statement from Jonathan Rosenberg, Senior VP of Product Management, about open technology and open information:

Open technology includes open source, meaning we release and actively support code that helps grow the Internet, and open standards, meaning we adhere to accepted standards and, if none exist, work to create standards that improve the entire Internet (and not just benefit Google). Open information means that when we have information about users we use it to provide something that is valuable to them, we are transparent about what information we have about them, and we give them ultimate control over their information.

I’m ready to believe there’s still a lot of people at Google who really believe in this, and I’m sure that inside Google HQ they still have that kool-aid on tap. But those people are clearly not the ones at the helm anymore, and that’s going to have some broad repercussions – people who are using Gmail pseudonymously, for example, are well-advised to start planning a defensive migration, because that day’s coming.

But God knows where you’d migrate to. The lunatic thing is that if you want the relative privacy of pseudonymous communication the way, back in the bad old days, you might have wanted basic computing functionality – that is, without kowtowing to an arbitrary, vaguely menacing megacorporation with arbitrary, vaguely menacing policies about your data – we might be getting back to the point where you need to rack your own box and learn how to roll it all yourself.

Dear Googlers: We’ve done this. It sucked. It was awful, a decade of near-total technical stagnation. It was WinCE 5.0 and Office 2003 and OSes with eight-year lifecycles and fucking Flash being the only way to deploy a new UI and everything interesting and promising and new pushed to the margins and excluded so that one company’s crown jewels stayed safe. And we might do it again, and it could be a tragedy or a farce or probably a bit of both. Trying to be more Apple than Apple and more Facebook than Facebook just means you’re trying to be less Google every single day.

It’s amazing, it is flat out astonishing, how much of the future depends on Google being the company that you, once upon a time, believed it could be. And you can still get there. To borrow a phrase, I’m not saying it’s too late for you, but the longer you wait, the closer you get to being Too Late.

But you need to do good. Saying you’re not evil isn’t good enough.

I’ve decided that if I find out a job applicant has internet-bragged that they could implement some major (Kickstarter, Ebay, Etsy, Facebook, anything…) website’s functionality in a week with Ruby – and it’s always Ruby, lately – I’m going to give them an interview right away, just so I can ask them why they haven’t.

I’d never give them a job. I just want to watch them squirm when I ask the question.

I made this presentation to Seneca’s Free Software and Open Source Symposium last year; it is dreadfully embarrassing, revealing mainly that I’m a terrible speaker who tells weak jokes, goes off into the weeds too often, rambles and says “um” far too much. This is just the voice track over my slides, which I’ll put up later this evening.

I’m sure the general outline of the presentation I’d like to have given is in there somewhere, but here you go.

“The difference between something that can go wrong and something that can’t possibly go wrong is that when something that can’t possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.”

Douglas Adams

I decided a while ago that virtualizing my whole devenv was getting kind of annoying, so I’ve spent a few weeks now living in Gnome 3, the latest version of the most widely-used Linux desktop user interface and a radical departure from previous gnomey efforts. It’s not without its flaws and imma let you finish, but my initial impression was that it’s really, really good, a huge breath of fresh air in terms of elegance, simplicity and modernity, particularly when you compare it to anything else available for Linux.

But, man, the things that it does get wrong it gets really, really wrong.

A friend of mine noted that Gnome 3 would be perfect for that intersection of people who want Linux on their desktops but who don’t actually want to customize anything, except that those people don’t actually exist. And since I’m not one of them it’s not an unconditional love, is all I’m saying.

The good things are:

  • Power management works right. I can close the lid on my laptop, and the OS doesn’t immediately lose its mind. I know, right? Maybe it’s because I’m doing this on a Thinkpad, but nobody’s more pleasantly surprised than me.
  • Window management is a lot better and less manual than it used to be. The Win7-style snap-to-fullscreen and snap-to-halfscreen is great; I wish I could do just a little bit more with it, like push-to-quadrant, but I’m OK with not having that. Wildly better than Unity, Ubuntu’s weird pastiche of antiquated UI ideas that feels like the user interface equivalent of finding out your grandmother has saved all of your old t-shirts from grade school and stitched them together into the tux she expects you to wear to prom. That right there’s a tux! But I think perhaps no.
  • I’m not sure I’m sold on their windows/applications hot-corner idea. One or the other would be a better choice, I think, but since you can fake that by putting all the applications you care about in the sidebar, it’s pretty good. Once you’ve done that, though, being able to throw the mouse up into the corner to get an expose-style view of all your windows and the sidebar is pretty great, comparable in terms of efficient navigational feel to OS X Lion and materially better than Win7, which really just feels horribly legacy lately.

A few things that aren’t:

  • The default UI widgets are unambiguously ugly. The default font isn’t great, and if you don’t like either of those things, tough luck. You can’t change them. I hope you like a drunken stumble around a grayscale palette. Got a particularly high-res screen? The Gnome 3 devs don’t. Suck it.
  • Bluetooth is straight-up broken, and not “glue-the-parts-back-together” broken, but actual “mop-and-bucket-and-maybe-hazmat-suit” broken. It just doesn’t work at all. It’s inexplicably disabled by default, and there’s no way to turn it on in the UI – there’s a switch that doesn’t work because you need to restart the service, and you need to do that from the command line. The only reason I can think of that it’s disabled by default is that even after you’ve turned it all on, you discover that the UI is just about 100% brain damage.
  • Approximately zero thought has gone into the preferences window, not just in terms of how it’s laid out, but how things are categorized, how things look or how they work. It’s kind of ugly, and kind of dumb.
  • Relatedly, in regards to “Universal Access”, it’s very, very obvious that nobody with an actual disability or any accessibility expertise has had any say in this. Whatever it does well for fully able-bodied people, Gnome 3 is the creepy, bigoted great-uncle of accessible tech, telling jokes he doesn’t realize aren’t funny, just offensive, disappointing and sad. That’s not to say it’s not wholly typical, of course; failures of accessibility are one of free software’s most shameful ongoing failures. But it’s sad to see it take another decisive tumble downhill.
  • Reconnecting to wireless is tedious, taking about ten times longer from lid-open to connected than OSX, and I can’t get WPA-encrypted connections to work at all. I don’t know why that is, but NetworkManager has never been a particularly lovable piece of software. So it’s hard to blame the Gnome 3 people for that, but there it is in Gnome 3, so.
  • Multimonitor support is a straight-up disaster. It’s not possible that any Gnome 3 developer owns a multimonitor machine if it’s this bad.

I don’t fundamentally object to the idea of a computer as a just-works appliance, but if you’re going to go down that road, the onus on the developer is that stuff has to just work, and a lot of stuff around the edges of Gnome 3 Just Doesn’t. As great as it is in some respects, Gnome 3 has the classic Linux smell of “Works For Me On My Machine”. You get the impression you could tell what kind of computers the primary developers use by what works well, what works badly and what doesn’t work at all.

First off, my colleague Donna wrote up a bit about the work we’ve been doing for the last few months. It’s been a pleasure to work with her, and I don’t really think of her as a crony but nobody tell her I said so.

The second thing is a way to get all the linuxes. That’s right, all of them; specifically a way to get a variety of them running in a single headless virtual machine on your OS of choice. You start with an Ubuntu .ISO and VirtualBox.

Install Ubuntu on a suitably capacious VM, make sure sshd is running and starts by default, pause it, close and quit VirtualBox. Then do two things; first, set yourself up with this script:

#!/bin/sh
# Set up the NAT port forwarding (SSH and HTTP) first; these extradata
# settings are only read when the VM starts, so start it last.
VBoxManage setextradata Prime "VBoxInternal/Devices/e1000/0/LUN#0/Config/guestssh/Protocol" TCP
VBoxManage setextradata Prime "VBoxInternal/Devices/e1000/0/LUN#0/Config/guestssh/GuestPort" 22
VBoxManage setextradata Prime "VBoxInternal/Devices/e1000/0/LUN#0/Config/guestssh/HostPort" 2222
VBoxManage setextradata Prime "VBoxInternal/Devices/e1000/0/LUN#0/Config/guesthttp/Protocol" TCP
VBoxManage setextradata Prime "VBoxInternal/Devices/e1000/0/LUN#0/Config/guesthttp/GuestPort" 80
VBoxManage setextradata Prime "VBoxInternal/Devices/e1000/0/LUN#0/Config/guesthttp/HostPort" 8080
VBoxManage startvm Prime --type headless

(My VM’s name is “Prime” in this example, to clarify. Yours may not be.)

Then read this article by Ted Dziuba about running several versions of Linux, simultaneously and non-virtualized, on the same machine. It’s pretty cool, and that should set you up with All The Linuxes, should you happen to want all the linuxes.

From that you can SSH to localhost:2222 for Ubuntu and schroot between whatever other linuxes you desire. X-forwarding will help you here, and I wonder if you can add Android to that list? Hmm. Hmmmmm.
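Assuming the port-forwarding script above, getting into the VM (with X-forwarding, even) is just this – substitute your own username, naturally:

# host port 2222 is forwarded to port 22 on the VM by the script above;
# -X forwards X11 so graphical programs display on your host machine.
ssh -X -p 2222 you@localhost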

Next up, if you’re making changes to Firefox and don’t/won’t/can’t get at their Tryserver test harness, I just found out (duh, of course) that all their tests are in their source tree anyway. Add these lines to the end of your Makefile, and you can run the whole test harness locally with one command.

test-me:
    echo 'Running automated tests in 10 seconds. This can take a long time - hit control-C to end.' && sleep 10
    $(MAKE) -f $(topsrcdir)/obj-ff-dbg/Makefile crashtest
    $(MAKE) -f $(topsrcdir)/obj-ff-dbg/Makefile jstestbrowser
    $(MAKE) -f $(topsrcdir)/obj-ff-dbg/Makefile reftest
    $(MAKE) -f $(topsrcdir)/obj-ff-dbg/Makefile mochitest-plain
    $(MAKE) -f $(topsrcdir)/obj-ff-dbg/Makefile xpcshell-tests

Configure, make, make test-me, then wait. This is a run-overnight kind of thing – it will stomp on your machine pretty hard – but at least it will tell you if you broke anything. I was briefly tempted to call that “trouble”, or “come-at-me-bro”, rather than “test-me”, but I think I wisely elected not to.

Finally, I broke down and installed Fedora on my little netbook, and to my surprise it’s awfully pretty. I miss apt-get, but the new Gnome UI is actually great, wildly better and more discoverable than Win7. It’s actually a respectable little computer now, all things considered. Except, of course, my wireless doesn’t work, and if I put an SD card in, it won’t suspend anymore.

“Sysadmin” is a portmanteau of “administration” and “Sisyphus”, apparently.