blarg?

August 15, 2018

Time Dilation

Filed under: academic,digital,documentation,interfaces,lunacy,mozilla,science,work — mhoye @ 11:17 am


[Video: https://www.youtube.com/embed/JEpsKnWZrJ8]

I riffed on this a bit over on Twitter some time ago; this has been sitting in the drafts folder for too long, and it’s incomplete, but I might as well get it out the door. Feel free to suggest additions or corrections if you’re so inclined.

You may have seen this list of latency numbers every programmer should know, and I trust we’ve all seen Grace Hopper’s classic description of a nanosecond at the top of this page, but I thought it might be a bit more accessible to talk about CPU-scale events in human-scale transactional terms. So: if a single CPU cycle on a modern computer was stretched out as long as one of our absurdly tedious human seconds, how long do other computing transactions take?

If a CPU cycle is 1 second long, then (a short script after this list runs the same arithmetic):

  • Getting data out of L1 cache is like getting your data out of your wallet: about 3 seconds.
  • At 9 to 10 seconds, getting data from L2 cache is roughly like asking your friend across the table for it.
  • Fetching data from the L3 cache takes a bit longer – it’s roughly as fast as having an Olympic sprinter bring you your data from 400 meters away.
  • If your data is in RAM you can get it in about the time it takes to brew a pot of coffee; this is how long it would take a world-class athlete to run a mile to bring you your data, if they were running backwards.
  • If your data is on an SSD, though, you can have it in six to eight days, equivalent to having it delivered from the far side of the continental U.S. by bicycle, about as fast as that has ever been done.
  • In comparison, platter disks are delivering your data by horse-drawn wagon, over the full length of the Oregon Trail. Something like six to twelve months, give or take.
  • Network transactions are interesting – platter disk performance is so poor that fetching data from your ISP’s local cache is often faster than getting it from your platter disks; at two to three months, your data is being delivered to New York from Beijing, via container ship and then truck.
  • In contrast, a packet requested from a server on the far side of an ocean might as well have been requested from the surface of the moon, at the dawn of the space program – about eight years, from the beginning of the Apollo program to Armstrong, Aldrin and Collins’ successful return to Earth.
  • If your data is in a VM, things start to get difficult – a virtualized OS reboot takes about the same amount of time as has passed between the Renaissance and now, so you would need to ask Leonardo Da Vinci to secretly encode your information in one of his notebooks, and have Dan Brown somehow decode it for you in the present? I don’t know how reliable that guy is, so I hope you’re using ECC.
  • That’s all if things go well, of course: a network timeout is roughly comparable to the elapsed time between the dawn of the Sumerian Empire and the present day.
  • In the worst case, if a CPU cycle is 1 second, cold booting a racked server takes approximately all of recorded human history, from the earliest Indonesian cave paintings to now.
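If you want to check my work, here’s a little Python sketch that runs the same arithmetic. The nanosecond figures are the approximate, hardware-dependent ones from the latency-numbers list, and the scale factor assumes a ~2.5GHz clock – both are assumptions, so the outputs land near the analogies above rather than exactly on them.

```python
# Stretch time so one CPU cycle lasts one human second, then see how
# long everything else takes. Latency figures are the approximate ones
# from the "latency numbers every programmer should know" list; the
# 0.4ns cycle assumes a ~2.5GHz clock. Both are assumptions.

CPU_CYCLE_NS = 0.4

LATENCIES_NS = {
    "L1 cache reference":     1,
    "L2 cache reference":     4,
    "L3 cache reference":     40,
    "Main memory reference":  100,
    "SSD random read":        150_000,
    "Platter disk seek":      10_000_000,
    "Packet across an ocean": 150_000_000,
}

def human_scale(nanoseconds: float) -> str:
    """Scale a latency by the cycle-equals-one-second factor, then
    render it in the largest sensible human unit."""
    seconds = nanoseconds / CPU_CYCLE_NS
    units = (("years", 365 * 24 * 3600), ("months", 30 * 24 * 3600),
             ("days", 24 * 3600), ("hours", 3600), ("minutes", 60))
    for name, size in units:
        if seconds >= size:
            return f"{seconds / size:,.1f} {name}"
    return f"{seconds:,.1f} seconds"

for event, ns in LATENCIES_NS.items():
    print(f"{event:28} {human_scale(ns)}")
```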

February 28, 2018

The Last Days Of 20A0

Filed under: documentation,doom,future,interfaces,lunacy,microfiction — mhoye @ 5:58 pm


Science International – What Will They Think Of Next

At first blush this is a statement on the crude reproductive character of mass culture.

But it also serves as a warning about the psychohistorical destruction to come, the stagnation after revolution, the failure to remix.

I need to write this down, because I forget things sometimes, and I think what I heard today was important. Not to me, the time for me or almost anyone else alive on Earth today to make a difference has passed, but someone, somewhere might be able to make something of this, or at least find it helpful, or something. Once I’m done, I’m going to seal it up in a pipe, coat it in wax, and chuck it into the ravine. Maybe someday someone will read this, and try to put things together. If they’re allowed to.

It’s happening again.

The Phantom Time Hypothesis, developed by Heribert Illig, proposes that error and falsification have radically distorted the historical record. In his analysis, we have dilated the course of true events, so that they appear to cover far greater lengths of time than in fact passed. The so-called dark ages, for example, only appear that way because those centuries were mere decades.

You can feel it, can’t you? The relentless immediacy of crisis over crisis, the yawning void the endless emergency is stretched taut to obscure. The soul-bending psychological trauma; even moments of optimism seem unfairly compressed, hyperdense self-referential memetic shards landing like cartoon anvils and sublimated into vapor by the meteoric heat of the Next Thing. The spiritual tourniquet of the perpetually immediate present twisting tighter, fractions of degrees at a time.

The space: do we not all feel it? The space. It may be said that the consumer cultures of the 1980s and 1990s, successively exhorting us to embrace artifice and then soul-crushing blandness, were manufactured to “cure” the residual confusion and cultural inconsistency that resulted from the methods used to effect mankind’s collective psychic displacement. The hidden “space,” however, manifests itself in curious ways – the obsession with youth and physical condition in those born in the 1960s and 1970s; oddities in climate change data; the apparently freakish pace of economic change in what we believe now to be the 1980s; and so forth.

You can hear fragments of the past that remain, the warning signs engineered to survive their own absence singing the speed, the mass of this oncoming train to anyone foolish or optimistic enough (and is there a difference, at this remove?) to put an ear to the tracks. It’s happening again; here we are in the moments before the moment, and it can’t be an accident that those who seem most adept in this psychosocial twilight, deftly navigating unmoored in cold storms of this howling psychic gyre are people who’ve lost their anchors or thrown them overboard by choice in the name of some dark mirrored vision of liberty or mere expediency, in the long calm of the before. They’re just one more set of symptoms now, signs of symbols nested in symbols whose ultimate referents are burned to ash beneath them.

It is happening again.

But the problem is a real one, not a mere intellectual game. Because today we live in a society in which spurious realities are manufactured by the media, by governments, by big corporations, by religious groups, political groups — and the electronic hardware exists by which to deliver these pseudo-worlds right into the heads of the reader, the viewer, the listener. Sometimes when I watch my eleven-year-old daughter watch TV, I wonder what she is being taught. The problem of miscuing; consider that. A TV program produced for adults is viewed by a small child. Half of what is said and done in the TV drama is probably misunderstood by the child. Maybe it’s all misunderstood. And the thing is, just how authentic is the information anyhow, even if the child correctly understood it? What is the relationship between the average TV situation comedy to reality?

What’s left but what’s next, the twisting, the tightening, the inevitable snap; the collective spasm, the absence that will pass for absolution. The last fracturing as the cabals of consensus and permitted history are ground into the microcults gnawing at the fraying edges of tomorrow’s interstitials, memetic remixes remixed as memetic merchandise and malformed memories. Veracity hitting the kaleidoscopic crystal of the weaponized postmodern like a bird hitting a window.

It. Is. Happening. Again.

We can’t say we weren’t warned.

I don’t know if that man was crazy or not, but I think he was sane. As he was leaving, he said something about putting my house underwater. Please, don’t let them brush me away. Don’t let them hide us. Try and find more, I know there’s got to be more people who tried to leave something behind. Don’t let the world die in vain. Remember us.

We were here, and there was something here worth saving. There was such a thing as now, and we fought for it. We’ll leave the artifacts, hidden and codified as we have before, as best we’re able. Watch for them. Listen. You’ll be able to hear the Next Time, the shape and speed and mass of it approaching, and it may not be too late to throw it off the tracks. Reassemble this moment, rebuild who we were out of the hidden shards we’ve left. Hone yourselves to the gleaming edges you’ll need with the tools we’ve left you. Put your ear to the rails and listen.

No piece of information is superior to any other. Power lies in having them all on file and then finding the connections. There are always connections; you have only to want to find them.

We were here. This was real. Remember us.

May 5, 2017

Nerd-Cred Level-Up

Filed under: awesome,flickr,life,lunacy,weird — mhoye @ 9:13 am


In 2007 I was an extra in an indie zombie movie called “Sunday Morning” that featured Ron Tarrant. Tarrant starred with Mark Slacke in a 2010 short called “The Painting In The House”; Slacke in turn played a role in Cuba Gooding Jr.’s “Sacrifice”. Gooding, of course, played a role in A Few Good Men, as did Kevin Bacon.

Recently, I co-authored a paper with Greg Wilson – “Do Software Developers Understand Open Source Licenses?”, whose principal authors are Daniel Almeida and Gail Murphy at UBC – that will be presented at ICPC 2017 later this year. Greg Wilson has previously co-authored a paper with Robert Sedgewick, who has co-authored a paper with Andrew Chi-Chih Yao, who has in turn co-authored a paper with Ronald L. Graham.

You can find all of Graham’s many collaborations with Paul Erdős, one of the most prolific mathematicians of the 20th century, on his homepage.

Which is all to say that I now have an Erdős-Bacon number of 9: a Bacon number of 4 by the first chain, plus an Erdős number of 5 by the second.

I’m unreasonably stoked about that for some reason.

March 24, 2017

Mechanized Capital

Construction at Woodbine Station

Elon Musk recently made the claim that humans “must merge with machines to remain relevant in an AI age”, and you can be forgiven if that doesn’t make a ton of sense to you. To fully buy into that nonsense, you need to take a step past drinking the singularity-flavored Effective Altruism kool-aid and start bobbing for biblical apples in it.

I’ll never pass up a chance to link to Warren Ellis’ NerdGod Delusion whenever this posturing about AI as an existential threat comes along:

The Singularity is the last trench of the religious impulse in the technocratic community. The Singularity has been denigrated as “The Rapture For Nerds,” and not without cause. It’s pretty much indivisible from the religious faith in describing the desire to be saved by something that isn’t there (or even the desire to be destroyed by something that isn’t there) and throws off no evidence of its ever intending to exist.

… but I think there’s more to this silliness than meets the rightly-jaundiced eye, particularly when we’re talking about far-future crypto-altruism as pitched by present-day billionaire industrialists.

Let me put this idea to you: one byproduct of putting a processor in everything is that it has given rise to automators as a social class, one with their own class interests, distinct from both labor and management.

Marxist class theory – to pick one framing; there are a few that work here, and Marx is nothing if not quotable – admits the existence of management, but views it as a supervisory, quasi-enforcement role. I don’t want to get too far into the detail weeds there, because the most important part of management across pretty much all the theories of class is the shared understanding that they’re supervising humans.

To my knowledge, we don’t have much in the way of political or economic theory written up about automation. And, much like the fundamentally new types of power structures in which automators live and work, I suspect those people’s class interests are very different than those of your typical blue or white collar worker.

For example, the double-entry bookkeeping of automation is: an automator writes some code that lets a machine perform a task previously done by a human, or ten humans, or ten thousand humans, freeing those humans to… do what?

If you’re an automator, the answer to that is “write more code”. If you’re one of the people whose job has been automated away, it’s “starve”. Unless we have an answer for what happens to the humans displaced by automation, it’s clearly not some hypothetical future AI that’s going to destroy humanity. It’s mechanized capital.

Maybe smarter people than me see a solution to this that doesn’t result in widespread starvation and crushing poverty, but I only see one: an incremental and ongoing reduction in the supply of human labor. And in a sane society, that’s pretty straightforward; it means the progressive reduction of maximum hours in a workweek, women with control over their own bodies, a steadily rising minimum wage and large, sustained investments in infrastructure and the arts. But for the most part we’re not in one of those societies.

Instead, what it’s likely to mean is much, much more of what we already have: terrified people giving away huge amounts of labor for free to barter with the machine. You get paid for a 35-hour week and work 80, because if you don’t the next person in line will, and you’ll get zero. Nobody enforces anything like safety codes or labor laws, because once you step off that treadmill you go to the back of the queue, and a thousand people are lined up in front of you to get back on.

This is the reason I think this singularity-infected enlightened-altruism is so pernicious and morally bankrupt: it gives powerful people a high-minded someday-reason to wash their hands of the real problems being suffered by real people today, problems that they’re often directly or indirectly responsible for. It’s a story that lets the people who could be making a difference today trade it in for a difference that might matter someday, in a future we might never get to see because they sat on their hands.

It’s a new faith for people who think they’re otherwise much too evolved to believe in the Flying Spaghetti Monster or any other idiot back-brain cult you care to suggest.

Vernor Vinge, the originator of the term, is a scientist and novelist, and occupies an almost unique space. After all, the only other sf writer I can think of who invented a religion that is also a science-fiction fantasy is L Ron Hubbard.
– Warren Ellis, 2008

February 6, 2017

The Scope Of The Possible

Filed under: digital,future,interfaces,life,lunacy,mozilla,want,weird,work — mhoye @ 5:34 pm


This is a rough draft; I haven’t given it much in the way of polish, and it kind of just trails off. But a friend of mine asked me what I think web browsers look like in 2025 and I promised I’d let that percolate for a bit and then tell him, so here we go. For whatever disclaimers like this are worth, I don’t have my hands on any of the product direction levers here, and as far as the orgchart’s concerned I am a leaf in the wind. This is just my own speculation.

I’m a big believer in Conway’s Law, but not in the sense that I’ve heard most people talk about it. I say “most people”, like I’m the lone heretic of some secret cabal that convenes once a month to discuss a jokey fifty-year-old observation about software architecture, I get that, but for now just play along. Maybe I am? If I am, and I’m not saying one way or another, between you and me we’d have an amazing secret handshake.

So: Conway’s Law isn’t anything fancier than the observation that software is a collaborative effort, so the shape of a large piece of software will end up looking a lot like the orgchart or communication channels of the people building it; this emerges naturally from the need to communicate and coordinate efforts between teams.

My particular heresy here is that I don’t think Conway’s Law needs to be the curse it’s made out to be. Communication will never not be expensive, but it’s also a subset of interaction. So if you look at how the nature of people’s interactions with and expectations from a communication channel are changing, you can use it as a kind of oracle to predict what the next evolutionary step of a product should look like.

At the highest level, some 23 years after Netscape Navigator 1.0 came out, the way we interact with a browser is pretty much the same as it ever was; we open it, poke around it and close it. Sure, we poke around a lot more things, and they’re way cooler and have a lot more people on the far end of them, but… for the most part, that’s it.

That was all that you could do in the 90’s, because that’s pretty much all that interacting with the web of the 90’s could let you do. The nature of the Web has changed profoundly since then, and like I’ve said before, the web is everywhere and in everything now. But despite that, and the fact that browsers are very different beasts now than they were when the Web was taking its first tentative steps, that high-level interaction model has stayed pretty much the same.

But if the web is everywhere and in everything, then an interaction that involves opening an app, looking through it and closing it again seems incredibly antiquated, like you’re looking out a porthole in the side of a steamship. Even the name is telling: you don’t “browse” the web anymore. You engage with it, you interact with it, and with people, groups and businesses through it.

Another way to say that is the next generation of web browser won’t look like a browser at all: it will be a service.

More specifically, I think the next generation of what we currently call a web browser will be a hybrid web-access service; like the current Web, it lives partly on a machine somewhere and partly on whatever device or devices are next to you, and acts as the intermediary – the user agent – that keeps you connected to this modern, always-on Web.

The app model is almost, kind-of-partway there, but in so many ways it makes life more complicated and less interesting than it needs to be. For the most part, apps only ever want to connect you to one place or set of people. Maybe that’s fine and that’s where your people are. But maybe you have to juggle a bunch of different communities in your life across a bunch of apps that go out of their way to keep those communities from discovering each other, and they all seem to want different slices of your life, your time and data depending on what the ad revenue people think is trendy this week. And because companies want to cover their bases you end up with these strange brands-pretending-to-be-people everywhere. It’s a mess, and having to juggle a bunch of different apps and communities doesn’t make a ton of sense when we’ve already got a reliable way of shipping safe, powerful software on demand.

I think the right – and probably next – thing is to push that complexity away from your device, to this user-agent-as-a-service living out there on a server in the cloud somewhere, just sitting there patiently paying attention. Notifications – a superset of messaging, and the other part of this picture – can come from anywhere and be anything, because internet, but your Agent can decide whether to forward them on directly, filter them or bounce them, as you like. And if you decide to go out there and get something – a video, a file, a page, whatever – then your Agent can do all sorts of interesting work for you in-flight. Maybe you want ad filtering, maybe you paid for an antivirus service to give that file a once-over, maybe your employer has security protocols in place to add X or strip out Y. There’s lots of room there for competing notification services, agent providers and in-agent services, a marketplace of ideas-that-are-also-machines.
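To make that shape a bit more concrete, here’s a minimal Python sketch of the idea. Every name in it – Agent, Message, the example policies – is made up for illustration; this is a toy of the architecture, not a real API.

```python
# A toy model of a user-agent-as-a-service: messages from anywhere get
# run through a user-chosen pipeline of filters, each of which can pass
# the message along (possibly rewritten) or bounce it by returning None.

from dataclasses import dataclass, field
from typing import Callable, Optional

@dataclass
class Message:
    source: str     # who or what sent this
    kind: str       # "notification", "page", "file", ...
    payload: bytes

Filter = Callable[[Message], Optional[Message]]

@dataclass
class Agent:
    filters: list = field(default_factory=list)

    def receive(self, msg: Message) -> Optional[Message]:
        """Run an inbound message through each filter in order; a None
        anywhere means it never reaches your device."""
        for f in self.filters:
            msg = f(msg)
            if msg is None:
                return None
        return msg

# Example in-agent services, stubbed out: an ad filter and an
# employer's security policy.
def block_ads(msg: Message) -> Optional[Message]:
    return None if "ads." in msg.source else msg

def employer_policy(msg: Message) -> Optional[Message]:
    return msg  # e.g. scan attachments, strip headers; a no-op here

agent = Agent(filters=[block_ads, employer_policy])
print(agent.receive(Message("ads.example.test", "notification", b"buy!")))  # -> None
```

The point isn’t the code, it’s the shape: the pipeline lives server-side, and swapping filters in and out of it is what a marketplace of agent services would actually trade in.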

There’s a couple of things that browsers, for all their warts and dated ideas, do better than any app or monolithic service; most of those have to do with user intent, the desire for safety and privacy, but also the desires for novelty, variety and unique humanity. I’ve talked about this before, the idea of engineering freedom in depth. I still think it’s possible to build human-facing systems that can – without compromise – mitigate the possibility of harm, and mount a positive defense of the scope of the possible. And I think maybe this is one way to do that.

(Updated: Typos, phrasing, added some links.)

October 8, 2016

Pageant Knight

Filed under: a/b,awesome,comics,documentation,lunacy,microfiction,weird — mhoye @ 11:45 pm

Sunset At The Beach

On September 17th, DC “celebrated” what they called “Batman Day”. I do not deploy scare quotes lightly, so let me get this out of the way: Batman is boring. Batman qua Batman as a hero, as a story and as the center of a narrative framework, all of those choices are pretty terrible. The typical Batman story arc goes something like:

  • Batman is the best at everything. But Gotham, his city, is full of terrible.
  • Batman broods over his city. The city is full of terrible but Batman is a paragon of brooding justice.
  • An enemy of justice is scheming at something. Batman detects the scheme, because he is the World’s Greatest Among Many Other Things Detective and intervenes.
  • Batman is a paragon of brooding justice.
  • Batman’s attempt to intervene fails! Batman may not be the best at everything!
  • Batman broods and/or has a bunch of feelings and/or upgrades one of his widgets.
  • Batman intervenes again, and Batman emerges triumphant! The right kind of punching and/or widgeting makes him the best at everything again.
  • Order is restored to Gotham.
  • Batman is a paragon of brooding justice.

If you’re interested in telling interesting stories Batman is far and away the least interesting thing in Gotham. So I took that opportunity to talk about the Batman story I’d write given the chance. The root inspiration for all this was a bout of protracted synesthesia brought on by discovering this take on Batman from Aaron Diaz, creator of Dresden Codak, at about the same time as I first heard Shriekback’s “Amaryllis In The Sprawl”.

The central thesis is this: if you really want a Gritty, Realistic Batman For The Modern Age, then Gotham isn’t an amped-up New York. It’s an amped-up New Orleans, or some sort of New-Orleans/Baltimore mashup. A city that’s full of life, history, culture, corruption and, thanks to relentlessly-cut tax rates, failing social and physical infrastructure. A New-Orleans/Baltimore metropolis in a coastal version of Brownback’s Kansas, a Gotham where garbage isn’t being collected and basic fire & police services are by and large not happening, because tax rates and tax enforcement have been cut to the bone and the city can’t afford to pay its employees.

Bruce Wayne, wealthy philanthropist and Gotham native, is here to help. But this is Bruce Wayne via late-stage Howard Hughes; incredibly rich, isolated, bipolar and delusional, a razor-sharp business mind offset by a crank’s self-inflicted beliefs about nutrition and psychology. In any other circumstances he’d be the harmless high-society crackpot city officials kept at arm’s length if they couldn’t get him committed. But these aren’t any other circumstances: Wayne is far more than just generous, and he wants to burn this candle at both ends, helping the city through the Wayne Foundation by day and, in his own very special, very extralegal way, fighting crime dressed in a cowl by night.

And he’s so rich that despite his insistence on dressing up his 55-year-old self in a bat costume and beating people up at night, the city needs that money so badly that to keep his daytime philanthropy flowing, six nights a week a carefully selected group of city employees stage another episode of “Batman, crime fighter”, a gripping Potemkin-noir pageant with a happy ending and a costumed Wayne in the lead role.

Robin – a former Arkham psych-ward nurse, a gifted young woman and close-combat prodigy in Wayne’s eyes – is a part of the show, conscripted by Mayor Cobblepot to keep an eye on Wayne and keep him out of real trouble. Trained up by retired SAS Sgt. Alfred Pennyworth behind Wayne’s back, in long-shuttered facilities beneath Wayne Manor that Wayne knows nothing about, she is ostensibly Batman’s sidekick in his fight against crime. But her real job is to protect Wayne on those rare occasions that he runs into real criminals and tries to intervene. She’s got a long, silenced rifle under that cloak with a strange, wide-mouthed second barrel and a collection of exotic munitions that she uses like a surgical instrument, not only to protect Wayne but more importantly to keep him convinced his fists & gadgets work at all.

She and Harleen Quinzel, another ex-Arkham staffer trained by Alfred, spend most of their days planning strategy. They have the same job; Quinn is the sidekick, shepherd and bodyguard of the former chief medical officer of Arkham. Quinn’s charge is also in his twilight years, succumbing to a manic psychosis accelerated by desperate self-administration of experimental and off-label therapies that aren’t slowing the degeneration of his condition, but sure are making him unpredictable. But he was brilliant once, also a philanthropist – the medical patents he owns are worth millions, bequeathed to Gotham and the patients of Arkham, provided the city care for him in his decline. Sometimes he’s still lucid; the brilliant, compassionate doctor everyone remembers. And other times – mostly at night – he’s somebody else entirely, somebody with a grievance and a dark sense of humor.

So Gotham – this weird, mercenary, vicious, beautiful, destitute Gotham – becomes the backdrop for this nightly pageant of two damaged, failing old men’s game of cat and mouse and the real story we’re following is Robin, Quinn, Alfred and the weird desperation of a city so strapped it has to let them play it out, night after dark, miserable night.

September 2, 2016

Brought To You By The Letter U

Filed under: awesome,lunacy,microfiction,weird — mhoye @ 12:04 pm
Mozilla being a global organization, employees periodically send out all-hands emails notifying people of upcoming regional holidays. With Labour Day coming up in Canada, this was my contribution to the cause:

The short version: Monday is Labour Day, a national holiday in Canada – expect Canadian offices to be closed and our Canadian colleagues to be either slow to respond or completely unresponsive, depending on how much fun they’ve had.

The longer version:

On Monday, Canadians will be celebrating Labour Day by not labouring; as many of you know, this is one of Canada’s National Contradictions, one of only two to appear on a calendar*.

Canada’s Labour Day has its origin in the Toronto Typographical Union’s strike for a 58-hour work-week in 1872, brought on by the demands of the British government for large quantities of the letter U. At the time, Us were aggressively recirculated to the British colonies to defend Imperial syntactic borders and maintain grammatical separation between British and American English. In fact, British grammarian propaganda from this period is the origin of the phrase “Us and Them”.

At the time, Canadian Us were widely recognized as the highest quality Us available, but the hard labour of the vowel miners and the artisans whose skill and patience made the Canadian Us the envy of western serifs is largely lost to history; few people today realize that “usability” once described something that would suffice in the absence of an authentic Canadian U.

Imperial demands placed on Union members at the time were severe. Indeed, in the weeks leading up to the 1872 strike the TTU twice had to surrender their private Us to make the imperial quota, and were briefly known as the Toronto Typographical Onion as a result. While the success of the Onion’s strike dramatically improved working conditions for Canadian labourers, this was the beginning of a dramatic global U shortage; from 1873 until the early 1900s, global demand for Us outstripped supply, and most Us had been refurbished and reused many times over; “see U around” was a common turn of phrase describing this difficult time.

Early attempts at meeting the high demand for U were only somewhat successful. In the 1940s the British “v for victory” campaign was only partially successful in addressing British syntactic shortages that were exacerbated by extensive shipping losses due to sunken U-boats. The Swedish invention of the umlaut – “u” meaning “u” and “mlaut” meaning “kinda” – intended to paper over the problem, was likewise unsuccessful. It wasn’t until the electronic typography of the late seventies that U demand could easily be fulfilled and words like Ubiquity could be typed casually, without the sense of “overuse” that had plagued authors for most of a century.

Despite a turnaround that lexical economists refer to as “The Great U-Turn”, the damage was done. Regardless of their long status as allies, the syntactic gap between American and British Englishes was a bridge too far; anticipated American demand for Us never materialized, and American English remains unusual to this day.

Today, Labour Day is effectively a day Canada spends to manage, and indeed revel in, the fact that there are a lot of Us; travellers at this time of year will remark on the number of U-Hauls on the road, carting Us around the country in celebration. This is all to say that we’ll be celebrating our labour heritage this upcoming Monday. Canadians everywhere may be seen duing any number of thungs to commumurate this uccasiun: swumming, canuing, guardening, vusuting neighbours, and spunding tume at the couttage.

Thunk you, und see you all un Tuesday.

– mhuye

* – The other being the Spring National Resignation, where Canadians repeatedly declare Hockey their national sport while secretly enjoying watching the Leafs choke away another promising start.

May 27, 2016

Developers Are The New Mainframes

Filed under: documentation,future,interfaces,lunacy,mozilla,science,weird,work — mhoye @ 3:20 pm

This is another one of those rambling braindump posts. I may come back for some fierce editing later, but in the meantime, here’s some light weekend lunacy. Good luck getting through it. I believe in you.

I said that thing in the title with a straight face the other day, and not without reason. Maybe not good reasons? I like the word “reason”, I like the little sleight-of-hand it does by conflating “I did this on purpose” and “I thought about this beforehand”. It may not surprise you to learn that in my life at least those two things are not the same at all. In any case this post by Moxie Marlinspike was rattling around in the back of my head when somebody asked me on IRC why it’s hard-and-probably-impossible to make a change to a website in-browser and send a meaningful diff back to the site’s author, so I rambled for a bit and ended up here.

This is something I’ve asked for in the past myself: something like dom-diff and dom-merge, so site users could share changes back with creators. All the “web frameworks” I’ve ever seen are meant to make development easier and more manageable, but at the end of the day what goes over the wire is a pile of minified angle-bracket hamburger that has almost no connection to the site “at rest” on the filesystem. The only way to share a usable change with a site author, if it can be done at all, is to stand up a containerized version of the entire site and edit that. This disconnect between the scale of the change and the work needed to make it is, to put it mildly, a huge barrier to somebody who wants to correct a typo, tweak a color or add some alt-text to an image.
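For what it’s worth, the naive version of dom-diff is almost trivial – here’s a sketch in Python, using nothing but the standard library – and its uselessness is exactly the point: it can tell you what changed in the rendered markup, but nothing about where that change lives in the templates, build scripts and minified JS that actually produced it.

```python
# A naive "dom-diff": normalize two markup fragments and diff them.
# This works fine on toy, well-formed markup like the example below,
# and tells you nothing useful about a real site's source.

import difflib
from xml.dom.minidom import parseString

def dom_diff(original: str, edited: str) -> str:
    a = parseString(original).toprettyxml(indent="  ").splitlines()
    b = parseString(edited).toprettyxml(indent="  ").splitlines()
    return "\n".join(difflib.unified_diff(a, b, "site", "edited", lineterm=""))

print(dom_diff("<p>Teh typo</p>", "<p>The typo</p>"))
```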

I ranted about this for a while, about how JavaScript has made classic View Source obsolete and how even if you had dom-diff and dom-merge you’d need a carefully designed JS framework underneath designed specifically to support them, and how it makes me sad that I don’t have the skill set or free time to make that happen. But I think that if you dig a little deeper, there are some cold economics underneath that whole state of affairs that are worth thinking about.

I think that the basic problem here is the misconception that federation is a feature of distributed systems. I’m pretty confident that it’s not; specifically, I believe that federated systems are a byproduct of computational scarcity.

Building and deploying federated systems has a bunch of hard tradeoffs around development, control and speed of iteration that people are stuck with when computation is so expensive that no single organization can have or do enough of it to give a service global reach. Usenet, XMPP, email and so forth were products of this mainframe-and-minicomputer era; the Web is the last and best of them.

Protocol consensus is hard, but not as hard or expensive as a room full of $40,000 or $4,000,000 computers, so you do that work and accept the fact that what you gain in distributed stability you lose in iteration speed and design flexibility. The nature of those costs means the pressure to get it pretty close to right on the first try is very high, because real opportunities to revisit will be rare and costly. You’re fighting your own established success at that point, and nothing in tech has more inertia than a status quo its supporters think is good enough. (See also: how IPv6 has been “right around the corner” for 20 years.)

But that’s just not true anymore. If you need a few thousand more CPUs, you twiddle the dials on your cloud provider’s console and go back to unified deployment, rapid experimental iteration and trying to stay ahead of everyone else who’s doing the same. That’s how WhatsApp can deploy end-to-end encryption with one software update, just like that. It’s how Facebook can update a billion users’ experiences whenever they feel like it, and presumably how Twitter does whatever the hell Twitter’s doing this week. They don’t ask permission or seek consensus because they don’t have to; they deploy, test and iterate.

So the work that used to enable, support and improve federated systems now mostly exists where domain-computation is still scarce and expensive: the development process itself. Specifically, the inside of developers’ heads – developers who stubbornly, and despite our best efforts, remain expensive, high-maintenance and relatively low-bandwidth, with lots of context and application-reasoning locked up in their heads and poorly distributed.

Which is to say: developers are the new mainframes.

Right now the great majority of what they’re “connected” to from a development-on-device perspective are de-facto dumb terminals. Apps, iPads, Android phones. Web pages you can’t meaningfully modify, for values of “meaningful” that involve upstreaming a diff. From a development perspective those are the endpoints of one-way transmissions, and there’s no way to duplex that line to receive development-effort back.

So, if that’s the trend – that is, if in general centralized-then-federated systems get reconsolidated in socially-oriented verticals (and that’s what issue trackers are when compared to mailing lists) – then development as a practice is floating around the late middle step, but development as an end product – via cheap CPU and hackable IoT devices – is just getting warmed up. The obvious Next Thing in that space will be a resurgence of something like the Web, made of little things that make little decisions – effectively distributing, commodifying and democratizing programming as a product, duplexing development across those newly commodified development-nodes.

That’s the real revolution that’s coming, not the thousand-dollar juicers or the bluetooth nosehair trimmers, but the mess of tiny hackable devices that start to talk to each other via decentralized, ultracommodified feedback loops. We’re missing a few key components – bug trackers aren’t quite source-code-managers or social-ey, IoT build tools aren’t one-click-to-deploy and so forth, but eventually there will be a single standard for how these things communicate and run despite everyone’s ongoing efforts to force users into the current and very-mainframey vendor lock-in, the same way there were a bunch of proprietary transport protocols before TCP/IP settled the issue. Your smarter long-game players will be the ones betting on JavaScript to come out on top there, though it’s possible there will be other contenders.

The next step will be the social one, though “tribal” might be a better way of putting it – the eventual recentralization of this web of thing-code into cultural-preference islands making choices about how they speak to the world around them and how the world speaks back. Basically a hardware scripting site with a social aspect built in, communities and trusted sources building a social/subscriber model out for IoT agency. What the Web became, and is still in a lot of ways becoming, as we figure the hard part – the people-at-scale part – out. The Web of How Stuff Works.

Anyway, if you want to know what the next 15-20 years will look like, that’s the broad strokes. Probably more like 8-12, on reflection. Stuff moves pretty quick these days, but like I said, building consensus is hard. The hard part is always people. This is one of the reasons I think Mozilla’s mission is only going to get more important for the foreseeable future; the Web was the last and best of the federated systems, worth fighting for on those grounds alone, and we’re nowhere close to done learning everything it’s got to teach us about ourselves, each other and what it’s possible for us to become. It might be the last truly open, participatory system we get, ever. Consensus is hard and maybe not necessary anymore, so if we can’t keep the Web and the lessons we’ve learned and can still learn from it alive long enough to birth its descendants, we may never get a chance to build another system like it.

[minor edits since first publication. -mhoye]

December 5, 2015

Barbiephonic (redux)

Filed under: awesome,digital,doom,interfaces,lunacy,parenting,toys,vendetta — mhoye @ 9:51 pm

Structure

I have a funny story about the recent Hello Barbie networked-device security failure. This is doubly a repost – it started its current incarnation as a twitter rant, and longtime readers may remember it from the dim recesses of history, but the time has come for me to tell it again.

Back in 2007 Mattel had a site where they’d charge parents two bucks to have one of Mattel’s franchise characters give their child a real phone call, because people still did that in 2007. They’d let you hear the call before paying, which I suppose was good of them, but I poked around a bit and pretty quickly discovered that whatever company Mattel had hired for this was not so good with the infosec.

The subject of the calls – Dora would say it’s important to learn to read or help around the house, Barbie would tell you to work hard in school, that sort of thing – was pretty pedestrian, harmless despite the weirdly Reagan-era-esque Kid-Celebrities-Help-You-Just-Say-No-To-Drugs vibe. But the indexes on the folders storing all those component sound files they’d assemble into your custom call were wide open.

And the other thing lying around on those open shares was a pile of recordings of names. To reach a wide audience they’d recorded some unstoppably perky young woman reciting kids’ first names, Aaron, Abbot, Abby, Abigail, Adana, Adena, in an upbeat barbie-girl voice, every single one. And there I was with a pile of free disk space, university bandwidth, wget and why not.

There were seventeen thousand of them.

After a bit of experimentation, I figured out how to stitch them all together with 0.4 seconds of silence between each. The resulting audio file was almost five hours long: four hours and forty-five minutes of relentless Barbiedoll voice reciting seventeen thousand first names in alphabetical order.
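The stitching step is simple enough that a dozen lines of Python will do it. This is a reconstruction from memory, not the original script – it assumes WAV clips with identical encoding, and the filenames are hypothetical:

```python
# Concatenate every name clip into one file, with 0.4 seconds of
# silence after each. Assumes all clips share the same sample rate,
# width and channel count; "names/*.wav" is a hypothetical path.

import glob
import wave

clips = sorted(glob.glob("names/*.wav"))

with wave.open(clips[0], "rb") as first:
    params = first.getparams()

# 0.4s of silence: frames times bytes-per-frame, all zeroes.
frame_bytes = params.sampwidth * params.nchannels
silence = b"\x00" * int(params.framerate * 0.4) * frame_bytes

with wave.open("seventeen_thousand_names.wav", "wb") as out:
    out.setparams(params)
    for path in clips:
        with wave.open(path, "rb") as clip:
            out.writeframes(clip.readframes(clip.getnframes()))
        out.writeframes(silence)
```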

To my knowledge, nobody has ever listened to the whole thing.

Of the six attempts I’m aware of, four were called off when the death threats started, one ended with the near-breakup of the couple making the attempt, and one ended with a person drinking themselves to unconsciousness at about the 90-minute mark. I’m not saying that to make a joke. I’m telling you because this is real and it’s an SCP-grade psychic biohazard. No highly esteemed deed was committed here; this is not a place of honour.

So don’t say I didn’t warn you.

For your listening pleasure: here it is.

Have a good weekend, Internet.

UPDATE: Somebody made a YouTube video.

November 15, 2015

The Thousand Year Roadmap

Filed under: academic,documentation,future,interfaces,lunacy,mozilla,work — mhoye @ 10:33 am

I made this presentation at Seneca’s FSOSS a few weeks ago; some of these ideas have been rattling around in my brain for a while, but it was the first time I’d even run through it. I was thoroughly caffeinated at the time so all of my worst verbal tics are on display, right as usual um right pause right um. But if you want to have my perspective on why free and open source software matters, why some institutions and ideas live and others die out, and how I think you should design and build organizations around your idea so that they last a few hundred years, here you go.

There are some mistakes I made, and now that I’m watching it – I meant to say “merchants” rather than “farmers”, there’s a handful of others I may come back here to note later. But I’m still reasonably happy with it.

