blarg?

September 20, 2015

The Bourne Aesthetic

“The difference between something that can go wrong and something that can’t possibly go wrong is that when something that can’t possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.”

–Douglas Adams

I’ve been trying to get this from draft to published for almost six months now. I might edit it later but for now, what the hell. It’s about James Bond, Jason Bourne, old laptops, economies of scale, design innovation, pragmatism at the margins and an endless supply of breadsticks.

You’re in, right?

Bond was a character that people in his era could identify with:

Think about how that works in the post-war era. The office-dwelling accountant/lawyer/ad man/salesman has an expense account. This covers some lunches at counters with clients, or maybe a few nice dinners. He flirts with the secretaries and receptionists and sometimes sleeps with them. He travels on business, perhaps from his suburb into Chicago, or from Chicago to Cleveland, or San Francisco to LA. His office issues him a dictaphone (he can’t type) or perhaps a rolling display case for his wares. He has a work car, maybe an Oldsmobile 88 if he’s lucky, or a Ford Falcon if he’s not. He’s working his way up to the top, but isn’t quite ready for a management slot. He wears a suit, tie and hat every day to the office. If he’s doing well he buys this downtown at a specialty men’s store. If he’s merely average, he picks this up at Macy’s, or Sears if he’s really just a regular joe. If he gets sick his employer has a nice PPO insurance plan for him.

Now look at Bond. He has an expense account, which covers extravagant dinners and breakfasts at the finest four-star hotels and restaurants. He travels on business, to exotic places like Istanbul, Tokyo and Paris. He takes advantage of the sexual revolution (while continuing to serve his imperialist/nationalist masters) by sleeping with random women in foreign locations. He gets issued cool stuff by the office – instead of a big dictaphone that he keeps on his desk, Bond has a tiny dictaphone that he carries around with him in his pocket! He has a work car – but it’s an Aston Martin with machine guns! He’s a star, with a license to kill, but not management. Management would be boring anyways; they stay in London while Bond gets to go abroad and sleep with beautiful women. Bond always wears a suit, but they’re custom tailored of the finest materials. If he gets hurt, he has some Royal Navy doctors to fix him right up.

In today’s world, the organization man who looked up to James Bond as a kind of avatar of his hopes and dreams no longer exists.

Who is our generation’s James Bond? Jason Bourne. He can’t trust his employer, who demanded ultimate loyalty and gave nothing in return. In fact, his employer is outsourcing his work to a bunch of foreign contractors who presumably work for less and ask fewer questions. He’s given up his defined benefit pension (Bourne had a military one) for an individual retirement account (safe deposit box with gold/leeching off the gf in a country with a depressed currency). In fact his employer is going to use him up until he’s useless. He can’t trust anyone, other than a few friends he’s made on the way while backpacking around. Medical care? Well that’s DIY with stolen stuff, or he gets his friends to hook him up. What kinds of cars does he have? Well no more company car for sure, he’s on his own on that, probably some kind of import job. What about work tools? Bourne is on his own there too. Sure, work initially issued him a weapon, but after that he’s got to scrounge up whatever discount stuff he can find, even when it’s an antique. He has to do more with less. And finally, Bourne survives as a result of his high-priced, specialized education. He can do things few people can do – fight multiple opponents, hotwire a car, tell which guy in a restaurant can handle himself, speak multiple languages and duck a surveillance tail. Oh, and like the modern, (sub)urban professional, Bourne had to mortgage his entire future to get that education. They took everything he had, and promised that if he gave himself up to the System, in return the System would take care of him.

It turned out to be a lie.

We’re all Jason Bourne now.

posted by wuwei at 1:27 AM on July 7, 2010

I think about design a lot these days, and I realize that’s about as fatuous an opener as you’re likely to read this week so I’m going to ask you to bear with me.

If you’re already rolling out your “resigned disappointment” face: believe me, I totally understand. I suspect we’ve both dealt with That Guy Who Calls Himself A Designer at some point, that particular strain of self-aggrandizing flake who’s parlayed a youth full of disdain for people who just don’t understand them into a career full of evidence they don’t understand anyone else. My current job’s many bright spots are definitely brighter for his absence, and I wish the same for you. But if it helps you get past this oddly-shaped lump of a lede, feel free to imagine me setting a pair of Ray-Bans down next to an ornamental scarf of some kind, sipping a coffee with organic soy ingredients and a meaningless but vaguely European name, writing “Helvetica?” in a Moleskine notebook and staring pensively into the middle distance. Does my carefully manicured stubble convey the precise measure of my insouciance? Perhaps it does; perhaps I’m gazing at some everyday object nearby, pausing to sigh before employing a small gesture to convey that no, no, it’s really nothing. Insouciance is a French word, by the way. Like café. You should look it up. I know you’ve never been to Europe, I can tell.

You see? You can really let your imagination run wild here. Take the time you need to work through it. Once you’ve shaken that image off – one of my colleagues delightfully calls those guys “dribble designers” – let’s get rolling.

I think about design a lot these days, and I realize that’s about as fatuous an opener as you’re likely to read this week so I’m going to ask you to bear with me.

Very slightly more specifically I’ve been thinking about Apple’s latest Macbook, some recent retrospeculation from Lenovo, “timeless” design, spy movies and the fact that the Olive Garden at one point had a culinary institute. I promise this all makes sense in my head. If you get all the way through this and it makes sense to you too then something on the inside of your head resembles something on the inside of mine, and you’ll have to come to your own terms with that. Namasté, though. For real.

There’s an idea called “gray man” in the security business that I find interesting. They teach people to dress unobtrusively. Chinos instead of combat pants, and if you really need the extra pockets, a better design conceals them. They assume, actually, that the bad guys will shoot all the guys wearing combat pants first, just to be sure. I don’t have that as a concern, but there’s something appealingly “low-drag” about gray man theory: reduced friction with one’s environment.

– William Gibson, being interviewed at Rawr Denim

At first glance the idea that an Olive Garden Culinary Institute should exist at all squats on the line between bewildering and ridiculous. They use maybe six ingredients, and those ingredients need to be sourced at industrial scale and reliably assembled by a 22-year-old with most of a high-school education and all of a vicious hangover. How much of a culinary institute can that possibly take? In fact, at some remove the Olive Garden looks less like a restaurant chain than a supply chain that produces endless breadsticks; there doesn’t seem to be a ton of innovation here. Sure, supply chains are hard. But pouring prefab pomodoro over premade pasta, probably not.

Even so, for a few years the Tuscan Culinary Institute was a real thing: one of the many farming estates in Tuscany that have been resurrected to the service of regional gastrotourism, booked by the company for a few weeks a year. Successful managers of the Garden’s ersatz-Italian assembly lines could enjoy Tuscany on a corporate reward junket, and at first glance amused disdain for the whole idea would seem to be on point.

There’s another way to look at the Tuscan Culinary Institute, though, that makes it seem valuable and maybe even inspired.

One trite but underappreciated part of the modern mid-tier supply-chain-and-franchise engine is how widely accessible serviceable and even good (if not great or world-beating) stuff has become. Coffee snobs will sneer at Starbucks, but the truck-stop tar you could get before their ascendance was dramatically worse. If you’ve already tried both restaurants in a town too remote to be worth their while, a decent bowl of pasta, a bottle of inoffensive red and a steady supply of garlic bread starts to look like a pretty good deal.

This is one of the rare bright lights of the otherwise dismal grind of the capitalist exercise, this democratization of “good enough”. The real role of the Tuscan Culinary Institute was to give chefs and managers a look at an authentic, three-star Tuscan dining experience and then ask them: with what we have to hand at the tail end of this supply chain, the pasta, the pomodoro, the breadsticks and wine, how can we give our customers 75% of that experience for 15% of the cost?

It would be easy to characterize this as some sort of corporate-capitalist co-option of a hacker’s pragmatism – a lot of people have – but I don’t think that’s the right thing, or at least not the whole picture. This is a kind of design, and like any design exercise – like any tangible expression of what design is – we’re really talking about the expression and codification of values.

I don’t think it’s an accident that all the computers I bought between about 1998 and 2008 are either still in service or will still turn on if I flip the switch, but everything I’ve bought since lasts two or three years before falling over. There’s nothing magic about old tech, to be sure: in fact, the understanding that stuff breaks is baked right into their design. That’s why they’re still running: because they can be fixed. And thanks to the unfettered joys of standard interfaces some of them are better today, with faster drives and better screens, than any computer I could have bought then.

The Macbook is the antithesis of this, of course. That’s what happened in 2008; the Macbook Pro started shipping with a non-removable battery.

If you haven’t played with one of Apple’s flagship Macbooks, they are incredible pieces of engineering. They weigh approximately nothing. Every part of them seems like some fundamental advance in engineering and materials science. The seams are perfect; everything that can be removed, everything you can carve off a laptop and still have a laptop left, is gone.

As a result, it’s completely atomic, almost totally unrepairable. If any part of it breaks you’re hosed.

“Most people make the mistake of thinking design is what it looks like. People think it’s this veneer – that the designers are handed this box and told, ‘Make it look good!’ That’s not what we think design is. It’s not just what it looks like and feels like. Design is how it works.” – Steve Jobs

This is true, kind of; it depends on what you believe your scope of responsibility is as a designer. The question of “how a device works” is a step removed from the question of “how does a person engage with this device”; our aforementioned designer-caricature aside, most of us get that. But far more important than that is the question of how the device helps that person engage the world. And that’s where this awful contradiction comes in, because whatever that device might be, the person will never be some static object, and the world is seven billion people swimming in a boiling froth of water, oil, guns, steel, race, sex, language, wisdom, secrets, hate, love, pain and TCP/IP.

Our time is finite, and entropy is relentless: knowing that, how long should somebody be responsible for their designs? Are you responsible for what becomes of what you’ve built, over the long term? Because if you have a better way to play the long game here than “be a huge pile of rocks” you should chisel it into something. Every other thing of any complexity, anything with two moving parts to rub together that’s still usable or exists at all today has these two qualities:

  1. It can be fixed, and
  2. When it breaks, somebody cares enough about it to fix it.

And that’s where minimalism that denies the complexity of the world, that lies to itself about entropy, starts feeling like willful blindness; design that’s a thin coat of paint over that device’s relationship with the world.

More to the point, this is why the soi-disant-designer snob we were (justly and correctly) ragging on at the beginning of this seemingly-interminable-but-it-finally-feels-like-we’re-getting-somewhere blog post comes across as such a douchebag. It’s not “minimalist” if you buy a new one every two years; it’s conspicuous consumption with chamfered edges. Strip away that veneer, that coat of paint, and there are the real values designer-guy and his venti decaf soy wankaccino hold dear.

Every day I feel a tiny bit more like I can’t really rely on something I can’t repair. Not just for environmentalism’s sake, not only for the peace of mind that standard screwdrivers and available source offer, but because tools designed by people who understand that something might fall over are so much more likely to have built a way to stand it back up. This is why I got unreasonably excited by Lenovo’s retro-Thinkpad surveys, despite their recent experiments in throwing user security overboard wearing factory-installed cement boots. The prospect of a laptop with modern components that you can actually maintain, much less upgrade, has become a weird niche crank-hobbyist novelty somehow.

But if your long game is longer than your workweek or your support contract, this is what a total-cost-accounting of “reduced friction with your environment” looks like. It looks like not relying on the OEM, like DIY and scrounged parts and above all knowing that you’re not paralyzed if the rules change. It’s reduced friction with an uncertain future.

I have an enormous admiration for the work Apple does, I really do. But I spend a lot of time thinking about design now, not in terms of shapes and materials but in terms of the values and principles it embodies, and it’s painfully obvious when those values are either deeply compromised or (more typically) just not visible at all. I’ve often said that I wish I could buy hardware fractionally as good from anyone else for any amount of money, but that’s not really true. As my own priorities make participating in Apple’s vision more and more uncomfortable, what I really want is for some other manufacturer to show that kind of commitment to their own values and build hardware that expresses them. Even if I could get to (say) 75% of those values, if one of them was maintainability – if it could be fixed a bit at a time – I bet over the long term, it would come out to (say) 15% of the cost.

Late footnote: This post at War Is Boring is on point, talking about the effects of design at the operational and logistical levels.

October 29, 2014

Social Engineering

I gave this talk at FSOSS last week, in which I try to reclaim the term “Social Engineering”, so that it stops meaning “get the receptionist to give you their password” and starts meaning “Measuring community growth and turning that into processes and practices that work.”

I thought it went well, though listening to it I can see I’ve got a couple of verbal tics to work on. Gotta stop using ‘um’ and ‘right’ as punctuation.

February 18, 2013

That’s Too Much Machine For You

Filed under: awesome,documentation,future,interfaces,irc,linux,science,toys — mhoye @ 11:10 am

Keep This Area Clear

Man, how awful is it to see people broken by the realization that they are no longer young. Why are you being cantankerous, newly-old person? It’s totally OK not to be 17 or 23, things are still amazing! Kids are having fun! You may not really understand it, but just roll with it! The stuff you liked when you were 17 isn’t diminished by your creeping up on 40!

This has been making the rounds, a lazy, disappointing article from Wired about the things we supposedly “learned about hacking” from the 1995 almost-classic, Hackers. It’s a pretty unoriginal softball of an article, going for a few easy smirks by cherrypicking some characters’ sillier idiosyncrasies while making the author sound like his birthday landed on him like a cartoon piano.

We need a word for this whole genre of writing, where the author tries far too hard to convince you of his respectable-grownup-hood by burning down his youth. It’s hard to believe that in fifteen years the cycle won’t repeat itself, with this article being the one on the pyre; you can almost smell the smoke already, the odor of burning Brut and secret regrets.

The saddest part of the article, really, is how much it ignores. Which is to say: just about everything else. There’s plenty of meat to chew on there, so I don’t really understand why; presumably it has something to do with deadlines or clickthroughs or word-counts or column inches or something, whatever magic words the writers at Wired burble as they pantomime their editor’s demands and sob into their dwindling Zima stockpile.

I’ve got quite a soft spot in my heart and possibly also my brain for this movie, in part because it is flat-out amazing how many things Hackers got exactly right:

  • Most of the work involves sitting in immobile concentration, staring at a screen for hours trying to understand what’s going on? Check.
  • It’s usually an inside job from a disgruntled employee? Check.
  • A bunch of kids who don’t really understand how severe the consequences of what they’re up to can be, in it for kicks? Check.
  • Grepping otherwise-garbage swapfiles for security-sensitive information? Almost 20 years later most people still don’t get why that one’s a check, but my goodness: check.
  • Social-engineering for that one piece of information you can’t get otherwise, it works like a charm? Check.
  • Using your computer to watch a TV show you wouldn’t otherwise be able to? Golly, that sounds familiar.
  • Dumpster-diving for source printouts? I suspect that for most of my audience “line printers” fit in the same mental bucket as “coelacanth”, and printing anything at all, much less code, seems kind of silly and weird by now, so you’ll just have to take my word for it when I say: very much so, check.
  • A computer virus that can affect industrial control systems, causing a critical malfunction? I wonder where I’ve heard that recently.
  • Abusive prosecutorial overreach, right from the opening scene? You’d better believe, check.

So if you haven’t seen it, Hackers is a remarkable artefact of its time. It’s hardly perfect; the dialog is uneven, and the invented slang aged as well as invented slang always does. Moore’s Law has made anything with a number on the side look kind of quaint, and there’s plenty of that horrible neon-cars-on-neon-highways imagery that directors seem to fall back on when they need to show you what the inside of a computer is doing. But really: Look at that list. Look at it.

For all its flaws, sure, Hackers may not be something you’d hold aloft as a classic. But it’s good fun and it gets an awful lot more right than wrong, and that’s not nothing.

December 14, 2012

Reading Glasses

Filed under: digital,documentation,interfaces,linux,toys,weird,work — mhoye @ 12:22 am

I’ll level with you: I’m not very good at reading code.

I had an interview the other day that featured the dreaded read-this-code segment that’s inevitable in modernity, and reading somebody else’s Python without context, with a regex or two thrown in for kicks… I know there are people who can do that really well, but man, I’m not one of them.

To try and atone for how that went, I’ve written a thing I’ve been meaning to get done for a while, a kind of high-level analysis tool for Git repositories that will be able to give you some suggestions based on historical commit information. It’s called gitcoach, and it’s over on github if you’re interested.

The idea is that it takes a look at a project’s whole commit history to see what files tend to get modified at the same time, and then looks at what you’re working on now; if you’re working on some file Foo, gitcoach can tell you that hey, historically anyone who’s had to change Foo has also changed Bar 92% of the time, and Baz 80% of the time. So, no guarantees, but I suggest you look at those too.
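If you’re curious, the core trick is small enough to sketch out. This is not gitcoach’s actual code, just a minimal Python illustration of the co-occurrence idea; the file name “Foo” and the 80% cutoff are made up for the example:

    import subprocess
    from collections import defaultdict
    from itertools import combinations

    # One record per commit: a marker line with the hash, then the names
    # of the files that commit touched.
    log = subprocess.check_output(
        ["git", "log", "--name-only", "--pretty=format:@@%H"], text=True)

    commits, files = [], []
    for line in log.splitlines():
        if line.startswith("@@"):           # a new commit begins
            if files:
                commits.append(files)
            files = []
        elif line.strip():
            files.append(line.strip())
    if files:
        commits.append(files)

    touched = defaultdict(int)    # commits touching each file
    together = defaultdict(int)   # commits touching each pair of files

    for fs in commits:
        names = sorted(set(fs))
        for f in names:
            touched[f] += 1
        for pair in combinations(names, 2):
            together[pair] += 1

    # "Anyone who's had to change Foo has also changed Bar 92% of the time."
    target = "Foo"
    for (a, b), n in together.items():
        if target in (a, b):
            other = b if a == target else a
            rate = 100.0 * n / touched[target]
            if rate >= 80:                  # arbitrary cutoff
                print("%s changes alongside %s %d%% of the time"
                      % (other, target, rate))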

There’s more you can do with that data, perhaps obviously – the nice thing about the general idea is that whenever I mention it to somebody, they think of some other thing you can do with that data that I hadn’t even considered.

So that’s something.

It’s not a finished product – there are some known bugs and missing features listed in the README, and some others I’m sure that I don’t see yet. But there it is, and hopefully it will be useful for people trying to find their way around a big or new project.

Sorry about the regex question, dude.

October 2, 2012

Hypothetical Laptop notes

Filed under: business,digital,interfaces,linux,want — mhoye @ 11:57 am

Let’s say you wanted to design a laptop, preinstalled with Linux and aimed at a Linux-user audience. These are mostly my own notes about what I’d like, but I wouldn’t mind some feedback.

  • Pixel Qi screen, 14″ if possible, for power and daylight readability.
  • ARM SoC (Tegra?)
  • User-replaceable battery and RAM.
  • Ideally, off-the-shelf batteries. Are cellphone batteries now good enough that you could line up four of them to power a laptop? Could be, could be…
  • Casing that was meant to be disassembled, insofar as possible. Not junk, but no tri-wing or five-point Torx screws, either.
  • Built-in software-defined-radio-usable chips, two of them, and significant antenna.
  • Bluetooth 4, USB 3. Wifi, obviously.
  • HDMI out. It’d be nice to know if a chip existed that could support HDMI-out and the Pixel Qi screens; that sounds like the best of both worlds.
  • Nonjunk touchpad.

What am I missing? Anything else?

August 25, 2012

Toolchain

Filed under: digital,documentation,fail,future,interfaces,linux,toys,vendetta — mhoye @ 9:15 pm

I was idly looking over the shooting script for Men In Black the other day. Different from the movie, in a lot of little ways that add up; as filmed it came out a fun, largely harmless sci-fi movie of no particular note, but the original script was quietly darker and more introspective than you’d expect. As an example, take the scene where Edwards (Will Smith) has been given until sun-up to decide if he’s in; he asks Kay (Tommy Lee Jones) “Is it worth it?”

On screen, Kay replies “It is. If you’re strong enough” as he walks away. But on the page Kay’s explanation of the cost of signing up is a lot more personal.

EDWARDS – So what’s the catch?

KAY – What you’ll gain in perspective, you’ll lose in ways you’re too young to comprehend. You give up everything. Sever every human contact. No one will know you exist. Ever.

EDWARDS – Nobody?

KAY – You’re not even allowed a favorite shirt. There. That’s the speech I never heard. That’s the choice I never got.

EDWARDS – Hold up. You track me down, put me through those stupid-ass tests, now you’re trying to talk me out of it. I don’t get it.

KAY – You got ’til sun-up.

EDWARDS – Is it worth it?

KAY – You find out, you let me know.

Kev called me out yesterday, and he was not wrong.

Haters Gonna Tweet

I’m back to carrying around a Linux laptop these days, and all those old mixed feelings are still there. Power management is still really dodgy, the UI is a mess, lots of stuff that should Just Work by now either just doesn’t, or somehow stopped working for a while after a good run. Plug the wrong USB dingus into it, and if you close the lid it will try to cook itself; using it for day to day stuff isn’t wildly better than I remember it being a few years ago; I’m back to using it for reasons that I refer to as “principles”, stuff about information freedom and freedom-to vs. freedom-from, developing on the platform you’re deploying to, that sort of thing. They’re arguments that I can make pretty convincingly, even to myself sometimes, but there are days (like this one) that all that rhetoric seems like a thin veneer over some self-imposed variety of Stockholm Syndrome.

I have a paragraph here about how “It is better, the saying goes, to light a single candle than to curse the darkness”, and then it goes on to something about being honest about your motivations, and how maybe you’re lighting the candle to cover up the smell, not push back the dark. It’s not really coming together for me, but that’s the broad strokes of it.

You get the idea.

But here’s a thing: my brother sent me a PDF he needed to quote some passages from, the usual horrible “PDF-full-of-scanned-JPEGs” garbage you find everywhere. He was losing patience with it, and all the OCR software he could find was either insanely expensive or useless junk or both.

But I know that pdftohtml will give me a numbered list of all those images, and after a few seconds of research with apt-cache, I found Tesseract-OCR, installed it and tested it against a small sample of pages to see if the output looked sane. It did; it doesn’t output to anything but a file, but that’s fine. So a quick for i in `seq`; do later, my laptop is quietly grinding out a text file a human can copy and paste from.
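For the record, the whole pipeline fits in a few lines. Here’s a rough Python sketch of what my laptop was grinding through – the file names are illustrative, and it assumes pdftohtml and tesseract are on your path:

    import glob
    import subprocess

    # Pull the scanned page images out of the PDF; pdftohtml drops them
    # next to its HTML output, numbered by page.
    subprocess.check_call(["pdftohtml", "scanned.pdf", "out"])

    # OCR each image; tesseract writes its output to <base>.txt.
    pages = sorted(glob.glob("out*.png") + glob.glob("out*.jpg"))
    with open("book.txt", "w") as book:
        for image in pages:
            base = image.rsplit(".", 1)[0]
            subprocess.check_call(["tesseract", image, base])
            with open(base + ".txt") as page:
                book.write(page.read())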

The good parts of life with Linux are like that. Rich scripting languages; an incredibly broad, deep catalog of on-demand tools that have been hammered into shape over years or decades; job control that lets you just put a job in the background (or on a server somewhere else) and let it run. Being able to chain tools together so that when you need something novel done you can just get it done, and not spend hours scavenging around the ’net for that one piece of random crapware that does 70% of the job – it’s so great.

Linux on a laptop has a set of problems that have existed for years; I know I’m bringing a lot of this on myself. If I was using a desktop, a lot of these hardware hangups might just disappear, just by virtue of the fact that you never unplug a desktop, or put it to sleep. And I know what way the trends are going, here, too – the Free Software tools people care about seem to be finding their way over to OSX, with varying degrees of speed and elegance, but the rare design sensibilities that find their way back, seemingly via a cargo cult’s long march, don’t seem to be helping much. Bold efforts (like Gnome Shell) that sail less than perfectly polished run rapidly aground on the shoals of the Linux community’s profound fear of change. And as for hardware, well. Um.

But: Apt-get. Shell scripting. Pipe, job control. All the old, unglorious tools, so many of them rock solid and reliable; they are incredibly great. Being able to just have the tools you need, to test and experiment and process and have the computer doing stuff for you instead of having to muck around manually basically all the time, it is so good. Being able to see the right-out-there experimental stuff, like software radio, gestating; amazing. Macports and (God help you) Homebrew are a thin gruel in comparison, and Windows has nothing to compare it to.

I feel like I’m typing with mittens on when they’re not around, like I’ve looked into my toolbox and found one oversize novelty Tonka screwdriver and wrench made of Nerf. These are just the basic fundamental particles of getting anything done! How do people live without this stuff?

These days my everyday computing carry is ideologically polarized. My Macbook is mostly a desktop I can move to the couch, and I roll with a Thinkpad and an iPad. The small number of things I need to Just Work just work, and there’s room to (and yes, frequently, need to) tinker with the rest. I’m actually thinking about my next laptop running Linux on a non-Macbook, as crazy as that sounds. And if you’ve ever tried figuring out whether a laptop runs Linux correctly, well. You’ve read The Killing Joke, right?

So I don’t even know what direction that pendulum is swinging, now. I guess we’ll see again in a few years. I don’t know if it’s worth it; if I figure it out, I’ll let you know.

May 7, 2012

Lazyweb: SQL Diff

Filed under: digital,documentation,linux,work — mhoye @ 2:48 pm

I asked the lazyweb: What’s the preferred SQL diff tool? I’d like to take two SQL dumps and get back an SQL file of the difference.

Sheeri Cabral delivers the answer: if you do your DB dump with the --skip-extended-insert option, you can use regular old diff to get you most of the way there. That doesn’t give you an SQL file you can use directly, but it gets you enough of the way there that it’ll do.
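In other words, something like this – a sketch in Python for concreteness, with database names that are obviously illustrative:

    import subprocess

    # Dump each database with one INSERT per row, so that line-oriented
    # diff has something sensible to chew on.
    def dump(db, outfile):
        with open(outfile, "w") as f:
            subprocess.check_call(
                ["mysqldump", "--skip-extended-insert", db], stdout=f)

    dump("db_before", "before.sql")
    dump("db_after", "after.sql")

    # The output isn't valid SQL, but it's close enough to see what changed.
    subprocess.call(["diff", "-u", "before.sql", "after.sql"])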

January 26, 2012

A Short Course On The Tragedy In Act One

Filed under: digital,documentation,doom,fail,future,hate,interfaces,linux,vendetta,want — mhoye @ 12:35 pm

Back in 2003 Raymond Chen, noted Microsoftie and venerable author of the excellent Old New Thing blog, wrote a bit about the propensity programmers had for, and problems caused by, reverse-engineering Microsoft’s APIs and hooking into them in unapproved ways:

“For example, BOZOSLIVEHERE was originally the window procedure for the edit control, with the rather nondescript name of EditWndProc. Then some people who wanted to use the edit control window procedure decided that GetWindowLong(GWL_WNDPROC) was too much typing, so they linked to EditWndProc directly. Then when Windows 2.0 (I think) removed the need to export window procedures, we removed them all, only to find that programs stopped working. So we had to put them back, but they got goofy names as a way of scolding the programs that were doing these invalid things.”

He’s a pretty good writer, and this stuff makes for a good story, but this “too much typing” line is… uncharacteristically disingenuous of him; the other side of that story was told, to put it mildly, a little differently.

The Microsoft of the day was the Microsoft that came to be known as the evil empire, and for good reason; the combination of a dominant market position, rapid growth across a growing number of markets and no compunction at all about using what their consulting and support arms had learned about your company to leverage their growth into your market segment was legitimate grounds for a healthy dose of fear.

If you wanted to sell software you used their compilers and their APIs to talk to their OS and you consulted their support when you had problems. So if they suddenly developed an interest in your market niche they had a pretty good idea what the shape of your business looked like already. And their ability to leverage that information was very real, so much so that Microsoft’s announcement that they had plans to eventually make a similar product was sometimes enough to run competitors out of business.

This era is where the term “FUD” comes from, also not for no reason.

Because Microsoft could, and would, run the full-court press on your market segment if they decided it was worth their while. Veterans of the technical wars of the day can vividly remember their surprise, walking through decompiled assembler to discover that the reason their program’s performance was in the toilet was that going through the official, approved-for-general-consumption Win32 call meant nothing more or less than calling a delay loop before passing unchanged arguments into a private API. Not for any technical reason, but as a defensive posture; just to guarantee that you couldn’t build a product as well as Microsoft could, on the off chance that they woke up one morning and decided they wanted your niche.

So it really wasn’t about how long it took to type “GetWindowLong(GWL_WNDPROC)”; it was often the fact that, if you had to call that or something like it thirty-two thousand times and didn’t run that hack, your customer’s 386SX would spend twenty unresponsive minutes off in the weeds instead of fifteen seconds. Chen’s stories about having to reverse-engineer and accommodate poor programmer behavior are epic and technically brilliant, to be sure, but you should remember to read them in this light – these weren’t stupid programmers crawling up an unprotected stack for no reason. The Microsoft of the era just wasn’t a trustworthy collaborator. And for all the incredible, very-nearly-miraculous, brilliant work they’ve done maintaining backwards compatibility for applications doing horrible things, they brought an awful lot of that burden on themselves.

It took a protracted antitrust investigation, the long tenacity of free software and rise of the Web (with Mozilla keeping that torch lit through some long, dark years), Apple and later the primacy of mobile to really push Microsoft to the margins of relevancy where they are today. They’re still huge, they’re not all that evil anymore and they legitimately make some great products, but nobody really cares. They’re not making much of a mark on the things people do care about these days, mostly the social and mobile spaces. People aren’t afraid of them anymore because what matters changed, and developers and customers largely moved on.

That was a long time coming, too. But it’s starting to look like somebody’s getting ready to pick up that ball and run with it. A challenger appears!

This is just one example, but it’s really been part of a trend recently, and a good one to point to: take a look at this web-based Angry Birds demo, if you can. You might not be able to – it doesn’t work in Firefox – but the thing is, everything in there would run just fine in Firefox. Google has just decided that it won’t; not for any technical reason – they check some webkit-only CSS shim, and it works fine in Safari – but just to keep it from working in competing browsers. Classier still, through the magic of view-source you can see that indignity bundled up in a <div id=”roadblock”> tag, a name I’d like to think gave somebody a moment’s pause, but I doubt it.

Larry Page said, back in the day, that Google wouldn’t put their own results ahead of other people’s because that would be bad for users, but that statement is apparently no longer operative. Likewise this 2009 statement from Jonathan Rosenberg, Senior VP, Product Management about open technology and open information:

Open technology includes open source, meaning we release and actively support code that helps grow the Internet, and open standards, meaning we adhere to accepted standards and, if none exist, work to create standards that improve the entire Internet (and not just benefit Google). Open information means that when we have information about users we use it to provide something that is valuable to them, we are transparent about what information we have about them, and we give them ultimate control over their information.

I’m ready to believe there’s still a lot of people at Google who really believe in this, and I’m sure that inside Google HQ they still have that kool-aid on tap. But those people are clearly not the ones at the helm anymore, and that’s going to have some broad repercussions – people who are using Gmail pseudonymously, for example, are well-advised to start planning a defensive migration, because that day’s coming.

But God knows where you’d migrate to. The lunatic thing is that if you want the relative privacy of pseudonymous communication the way, back in the bad old days, you might have wanted basic computing functionality – that is, without kowtowing to an arbitrary, vaguely menacing megacorporation with arbitrary, vaguely menacing policies about your data – we might be getting back to the point where you need to rack your own box and learn how to roll it all yourself.

Dear Googlers: We’ve done this. It sucked. It was awful, a decade of near-total technical stagnation. It was WinCE 5.0 and Office 2003 and OSes with eight-year lifecycles and fucking Flash being the only way to deploy a new UI and everything interesting and promising and new pushed to the margins and excluded so that one company’s crown jewels stayed safe. And we might do it again, and it could be a tragedy or a farce or probably a bit of both. Trying to be more Apple than Apple and more Facebook than Facebook just means you’re trying to be less Google every single day.

It’s amazing, it is flat out astonishing, how much of the future depends on Google being the company that you, once upon a time, believed it could be. And you can still get there. To borrow a phrase, I’m not saying it’s too late for you, but the longer you wait, the closer you get to being Too Late.

But you need to do good. Saying you’re not evil isn’t good enough.

January 20, 2012

On Hiring

Filed under: academia,doom,fail,interfaces,linux,losers,vendetta — mhoye @ 1:25 pm

I’ve decided that if I find out a job applicant has internet-bragged that they could implement some major (Kickstarter, Ebay, Etsy, Facebook, anything…) website’s functionality in a week with Ruby – and it’s always Ruby, lately – I’m going to give them an interview right away, just so I can ask them why they haven’t.

I’d never give them a job. I just want to watch them squirm when I ask the question.

January 19, 2012

Political Theory, Asymmetric Warfare & Batman Movies

Filed under: academia,documentation,interfaces,life,linux,work — mhoye @ 4:55 pm

I made this presentation to Seneca’s Free Software and Open Source Symposium last year; it is dreadfully embarrassing, revealing mainly that I’m a terrible speaker who tells weak jokes, goes off into the weeds too often, rambles, and says “um” far too much. This is just the voice track over my slides, which I’ll put up later this evening.

I’m sure the general outline of the presentation I’d like to have given is in there somewhere, but here you go.
