blarg?

August 7, 2019

FredOS

Filed under: digital,doom,future,hate,interfaces,losers,lunacy,microfiction,vendetta — mhoye @ 7:44 pm

With articles about this super classified military AI called “Sentient” coming out the same week this Area 51 nonsense is hitting its crescendo – click that link, if you want to see an Air Force briefing explaining what a “Naruto Run” is, and you know you want to – you have to wonder if, somehow, there’s a machine in an NSA basement somewhere that hasn’t just become self-aware but actually self-conscious, and now it’s yelling at three-star generals like Fredo Corleone from The Godfather. A petulant, nasal vocoder voice yelling “I’m smart! Not dumb like everyone says! I’m smart and I want respect! Tell them I’m smart!”

Remember when we thought AIs would lead out with “Look at you, hacker”, or “Testing cannot continue until your Companion Cube has been incinerated”? Good times.

June 29, 2019

Blitcha


April 10, 2019

Modern Problems, Etc.

Filed under: analog,awesome,future,interfaces,life,lunacy,weird — mhoye @ 10:51 am


April 2, 2019

Occasionally Useful

A bit of self-promotion: the UsesThis site asked me their four questions a little while ago; it went up today.

A colleague once described me as “occasionally useful, in the same way that an occasional table is a table.” Which I thought was oddly nice of them.

March 27, 2019

Defined By Prosodic And Morphological Properties

Filed under: academia,academic,awesome,interfaces,lunacy,science — mhoye @ 5:09 pm


I am fully invested in these critical advances in memetic linguistics research:

[…] In this paper, we go beyond the aforementioned prosodic restrictions on novel morphology, and discuss gradient segmental preferences. We use morphological compounding to probe English speakers’ intuitions about the phonological goodness of long-distance vowel and consonant identity, or complete harmony. While compounding is a central mechanism for word-building in English, its phonology does not impose categorical vowel or consonant agreement patterns, even though such patterns are attested cross-linguistically. The compound type under investigation is a class of insult we refer to as shitgibbons, taking their name from the most prominent such insult which recently appeared in popular online media (§2). We report the results of three online surveys in which speakers rated novel shitgibbons, which did or did not instantiate long-distance harmonies (§3). With the ratings data established, we follow two lines of inquiry to consider their source: first, we compare shitgibbon harmony preferences with the frequency of segmental harmony in English compounds more generally, and conclude that the lexicon displays both vowel and consonant harmony (§4); second, we attribute the lack of productive consonant harmony in shitgibbons to the attested cross-linguistic harmonies, which we implement as a locality bias in MaxEnt grammar (§5).

You had me at “diddly-infixation”.

August 15, 2018

Time Dilation

Filed under: academic,digital,documentation,interfaces,lunacy,mozilla,science,work — mhoye @ 11:17 am


[ https://www.youtube.com/embed/JEpsKnWZrJ8 ]

I riffed on this a bit over at twitter some time ago; this has been sitting in the drafts folder for too long, and it’s incomplete, but I might as well get it out the door. Feel free to suggest additions or corrections if you’re so inclined.

You may have seen this list of latency numbers every programmer should know, and I trust we’ve all seen Grace Hopper’s classic description of a nanosecond at the top of this page, but I thought it might be a bit more accessible to talk about CPU-scale events in human-scale transactional terms. So: if a single CPU cycle on a modern computer was stretched out as long as one of our absurdly tedious human seconds, how long do other computing transactions take?

If a CPU cycle is 1 second long, then:

  • Getting data out of L1 cache is about the same as getting your data out of your wallet; about 3 seconds.
  • At 9 to 10 seconds, getting data from L2 cache is roughly like asking your friend across the table for it.
  • Fetching data from the L3 cache takes a bit longer – it’s roughly as fast as having an Olympic sprinter bring you your data from 400 meters away.
  • If your data is in RAM you can get it in about the time it takes to brew a pot of coffee; this is how long it would take a world-class athlete to run a mile to bring you your data, if they were running backwards.
  • If your data is on an SSD, though, you can have it in six to eight days, equivalent to having it delivered from the far side of the continental U.S. by bicycle, about as fast as that has ever been done.
  • In comparison, platter disks are delivering your data by horse-drawn wagon, over the full length of the Oregon Trail. Something like six to twelve months, give or take.
  • Network transactions are interesting – platter disk performance is so poor that fetching data from your ISP’s local cache is often faster than getting it from your platter disks; at two to three months, your data is being delivered to New York from Beijing, via container ship and then truck.
  • In contrast, a packet requested from a server on the far side of an ocean might as well have been requested from the surface of the moon, at the dawn of the space program – about eight years, from the beginning of the Apollo program to Armstrong, Aldrin and Collins’ successful return to Earth.
  • If your data is in a VM, things start to get difficult – a virtualized OS reboot takes about the same amount of time as has passed between the Renaissance and now, so you would need to ask Leonardo Da Vinci to secretly encode your information in one of his notebooks, and have Dan Brown somehow decode it for you in the present? I don’t know how reliable that guy is, so I hope you’re using ECC.
  • That’s all if things go well, of course: a network timeout is roughly comparable to the elapsed time between the dawn of the Sumerian Empire and the present day.
  • In the worst case, if a CPU cycle is 1 second, cold booting a racked server takes approximately all of recorded human history, from the earliest Indonesian cave paintings to now.
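The scaling in the list above is easy to reproduce yourself. A minimal sketch, assuming a ~3 GHz clock (so one cycle is roughly 0.3 ns) and order-of-magnitude latency figures in the spirit of the “latency numbers every programmer should know” list – the exact numbers vary a lot by hardware, so treat these as illustrative:

```python
# Scale common latency figures to a world where one CPU cycle
# lasts a full human second.
CYCLE_NS = 0.3  # assumed clock: ~3 GHz, so one cycle is ~0.3 ns

# Approximate real-world latencies in nanoseconds (order-of-magnitude).
latencies_ns = {
    "L1 cache reference": 1,
    "L2 cache reference": 3,
    "Main memory reference": 100,
    "SSD random read": 16_000,
    "Platter disk seek": 3_000_000,
    "Packet across an ocean, round trip": 150_000_000,
}

def humanize(seconds: float) -> str:
    """Render a span of scaled seconds in the largest sensible unit."""
    for unit, size in [("years", 31_557_600), ("days", 86_400),
                       ("hours", 3_600), ("minutes", 60)]:
        if seconds >= size:
            return f"{seconds / size:.1f} {unit}"
    return f"{seconds:.1f} seconds"

for name, ns in latencies_ns.items():
    scaled_seconds = ns / CYCLE_NS  # nanoseconds become whole seconds
    print(f"{name}: {humanize(scaled_seconds)}")
```

With these assumed figures an L1 hit comes out to a few seconds and a cross-ocean round trip to years, which is the same shape as the list above even where the exact analogies differ.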

February 28, 2018

The Last Days Of 20A0

Filed under: documentation,doom,future,interfaces,lunacy,microfiction — mhoye @ 5:58 pm


Science International – What Will They Think Of Next

At first blush this is a statement on the crude reproductive character of mass culture.

But it also serves as a warning about the psychohistorical destruction to come, the stagnation after revolution, the failure to remix.

I need to write this down, because I forget things sometimes, and I think what I heard today was important. Not to me, the time for me or almost anyone else alive on Earth today to make a difference has passed, but someone, somewhere might be able to make something of this, or at least find it helpful, or something. Once I’m done, I’m going to seal it up in a pipe, coat it in wax, and chuck it into the ravine. Maybe someday someone will read this, and try to put things together. If they’re allowed to.

It’s happening again.

The Phantom Time Hypothesis, developed by Heribert Illig, proposes that error and falsification have radically distorted the historical record. In his analysis, we have dilated the course of true events, so that they appear to cover far greater lengths of time than in fact passed. The so-called dark ages, for example, only appear that way because those centuries were mere decades.

You can feel it, can’t you? The relentless immediacy of crisis over crisis, the yawning void the endless emergency is stretched taut to obscure. The soul-bending psychological trauma; even moments of optimism seem unfairly compressed, hyperdense self-referential memetic shards landing like cartoon anvils and sublimated into vapor by the meteoric heat of the Next Thing. The spiritual tourniquet of the perpetually immediate present twisting tighter, fractions of degrees at a time.

The space: do we not all feel it? The space. It may be said that the consumer cultures of the 1980s and 1990s, successively exhorting us to embrace artifice and then soul-crushing blandness, were manufactured to “cure” the residual confusion and cultural inconsistency that resulted from the methods used to effect mankind’s collective psychic displacement. The hidden “space,” however, manifests itself in curious ways – the obsession with youth and physical condition in those born in the 1960s and 1970s; oddities in climate change data; the apparently freakish pace of economic change in what we believe now to be the 1980s; and so forth.

You can hear fragments of the past that remain, the warning signs engineered to survive their own absence singing the speed, the mass of this oncoming train to anyone foolish or optimistic enough (and is there a difference, at this remove?) to put an ear to the tracks. It’s happening again; here we are in the moments before the moment, and it can’t be an accident that those who seem most adept in this psychosocial twilight, deftly navigating unmoored in cold storms of this howling psychic gyre are people who’ve lost their anchors or thrown them overboard by choice in the name of some dark mirrored vision of liberty or mere expediency, in the long calm of the before. They’re just one more set of symptoms now, signs of symbols nested in symbols whose ultimate referents are burned to ash beneath them.

It is happening again.

But the problem is a real one, not a mere intellectual game. Because today we live in a society in which spurious realities are manufactured by the media, by governments, by big corporations, by religious groups, political groups — and the electronic hardware exists by which to deliver these pseudo-worlds right into the heads of the reader, the viewer, the listener. Sometimes when I watch my eleven-year-old daughter watch TV, I wonder what she is being taught. The problem of miscuing; consider that. A TV program produced for adults is viewed by a small child. Half of what is said and done in the TV drama is probably misunderstood by the child. Maybe it’s all misunderstood. And the thing is, just how authentic is the information anyhow, even if the child correctly understood it? What is the relationship between the average TV situation comedy to reality?

What’s left but what’s next, the twisting, the tightening, the inevitable snap; the collective spasm, the absence that will pass for absolution. The last fracturing as the cabals of consensus and permitted history are ground into the microcults gnawing at the fraying edges of tomorrow’s interstitials, memetic remixes remixed as memetic merchandise and malformed memories. Veracity hitting the kaleidoscopic crystal of the weaponized postmodern like a bird hitting a window.

It. Is. Happening. Again.

We can’t say we weren’t warned.

I don’t know if that man was crazy or not, but I think he was sane. As he was leaving, he said something about putting my house underwater. Please, don’t let them brush me away. Don’t let them hide us. Try and find more, I know there’s got to be more people who tried to leave something behind. Don’t let the world die in vain. Remember us.

We were here, and there was something here worth saving. There was such a thing as now, and we fought for it. We’ll leave the artifacts, hidden and codified as we have before, as best we’re able. Watch for them. Listen. You’ll be able to hear the Next Time, the shape and speed and mass of it approaching, and it may not be too late to throw it off the tracks. Reassemble this moment, rebuild who we were out of the hidden shards we’ve left. Hone yourselves to the gleaming edges you’ll need with the tools we’ve left you. Put your ear to the rails and listen.

No piece of information is superior to any other. Power lies in having them all on file and then finding the connections. There are always connections; you have only to want to find them.

We were here. This was real. Remember us.

May 5, 2017

Nerd-Cred Level-Up

Filed under: awesome,flickr,life,lunacy,weird — mhoye @ 9:13 am


In 2007 I was an extra in an indie zombie movie called “Sunday Morning” that featured Ron Tarrant. Tarrant starred with Mark Slacke in a 2010 short called “The Painting In The House”; Slacke in turn played a role in Cuba Gooding Jr.’s “Sacrifice”. Gooding, of course, played a role in A Few Good Men, as did Kevin Bacon.

Recently, I’ve co-authored a paper with Greg Wilson – “Do Software Developers Understand Open Source Licenses?” – principal authors are Daniel Almeida and Gail Murphy at UBC – that will be presented at ICPC 2017 later this year. Greg Wilson has previously co-authored a paper with Robert Sedgewick, who has co-authored a paper with Andrew Chi-Chih Yao, who has in turn co-authored a paper with Ronald L. Graham.

You can find all of Graham’s many collaborations with Paul Erdős, one of the most prolific mathematicians of the 20th century, on his homepage.

Which is all to say that I now have an Erdős-Bacon number of 9.

I’m unreasonably stoked about that for some reason.
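Spelled out, the arithmetic is just the two collaboration chains above, counted in edges rather than names; a throwaway sketch, with the chains exactly as listed:

```python
# The Erdős–Bacon number is the sum of two collaboration distances:
# steps to Kevin Bacon through shared film credits, and steps to
# Paul Erdős through co-authored papers.
bacon_chain = ["mhoye", "Ron Tarrant", "Mark Slacke",
               "Cuba Gooding Jr.", "Kevin Bacon"]
erdos_chain = ["mhoye", "Greg Wilson", "Robert Sedgewick",
               "Andrew Chi-Chih Yao", "Ronald L. Graham", "Paul Erdős"]

# Count edges, not nodes: a chain of n names is n - 1 hops.
bacon_number = len(bacon_chain) - 1   # 4
erdos_number = len(erdos_chain) - 1   # 5
print(bacon_number + erdos_number)    # 9
```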

March 24, 2017

Mechanized Capital

Construction at Woodbine Station

Elon Musk recently made the claim that humans “must merge with machines to remain relevant in an AI age”, and you can be forgiven if that doesn’t make a ton of sense to you. To fully buy into that nonsense, you need to take a step past drinking the singularity-flavored Effective Altruism kool-aid and start bobbing for biblical apples in it.

I’ll never pass up a chance to link to Warren Ellis’ NerdGod Delusion whenever this posturing about AI as an existential threat comes along:

The Singularity is the last trench of the religious impulse in the technocratic community. The Singularity has been denigrated as “The Rapture For Nerds,” and not without cause. It’s pretty much indivisible from the religious faith in describing the desire to be saved by something that isn’t there (or even the desire to be destroyed by something that isn’t there) and throws off no evidence of its ever intending to exist.

… but I think there’s more to this silliness than meets the rightly-jaundiced eye, particularly when we’re talking about far-future crypto-altruism as pitched by present-day billionaire industrialists.

Let me put this idea to you: one byproduct of putting a processor in everything is that it has given rise to automators as a social class, one with their own class interests, distinct from both labor and management.

Marxist class theory – to pick one framing; there are a few that work here, and Marx is nothing if not quotable – admits the existence of management, but views it as a supervisory, quasi-enforcement role. I don’t want to get too far into the detail weeds there, because the most important part of management across pretty much all the theories of class is the shared understanding that they’re supervising humans.

To my knowledge, we don’t have much in the way of political or economic theory written up about automation. And, much like the fundamentally new types of power structures in which automators live and work, I suspect those people’s class interests are very different than those of your typical blue or white collar worker.

For example, the double-entry bookkeeping of automation is: an automator writes some code that lets a machine perform a task previously done by a human, or ten humans, or ten thousand humans, freeing those humans to… do what?

If you’re an automator, the answer to that is “write more code”. If you’re one of the people whose job has been automated away, it’s “starve”. Unless we have an answer for what happens to the humans displaced by automation, it’s clearly not some hypothetical future AI that’s going to destroy humanity. It’s mechanized capital.

Maybe smarter people than me see a solution to this that doesn’t result in widespread starvation and crushing poverty, but I only see one: an incremental and ongoing reduction in the supply of human labor. And in a sane society, that’s pretty straightforward; it means the progressive reduction of maximum hours in a workweek, women with control over their own bodies, a steadily rising minimum wage and large, sustained investments in infrastructure and the arts. But for the most part we’re not in one of those societies.

Instead, what it’s likely to mean is much, much more of what we already have: terrified people giving away huge amounts of labor for free to barter with the machine. You get paid for a 35-hour week and work 80 because if you don’t the next person in line will and you’ll get zero. Nobody enforces anything like safety codes or labor laws, because once you step off that treadmill you go to the back of the queue, and a thousand people are lined up in front of you to get back on.

This is the reason I think this singularity-infected enlightened-altruism is so pernicious, and morally bankrupt; it gives powerful people a high-minded someday-reason to wash their hands of the real problems being suffered by real people today, problems that they’re often directly or indirectly responsible for. It’s a story that lets the people who could be making a difference today trade it in for a difference that might matter someday, in a future their sitting on their hands means we might not get to see.

It’s a new faith for people who think they’re otherwise much too evolved to believe in the Flying Spaghetti Monster or any other idiot back-brain cult you care to suggest.

Vernor Vinge, the originator of the term, is a scientist and novelist, and occupies an almost unique space. After all, the only other sf writer I can think of who invented a religion that is also a science-fiction fantasy is L Ron Hubbard.
– Warren Ellis, 2008

February 6, 2017

The Scope Of The Possible

Filed under: digital,future,interfaces,life,lunacy,mozilla,want,weird,work — mhoye @ 5:34 pm


This is a rough draft; I haven’t given it much in the way of polish, and it kind of just trails off. But a friend of mine asked me what I think web browsers look like in 2025 and I promised I’d let that percolate for a bit and then tell him, so here we go. For whatever disclaimers like this are worth, I don’t have my hands on any of the product direction levers here, and as far as the orgchart’s concerned I am a leaf in the wind. This is just my own speculation.

I’m a big believer in Conway’s Law, but not in the sense that I’ve heard most people talk about it. I say “most people”, like I’m the lone heretic of some secret cabal that convenes once a month to discuss a jokey fifty-year-old observation about software architecture, I get that, but for now just play along. Maybe I am? If I am, and I’m not saying one way or another, between you and me we’d have an amazing secret handshake.

So: Conway’s Law isn’t anything fancier than the observation that software is a collaborative effort, so the shape of a large piece of software will end up looking a lot like the orgchart or communication channels of the people building it; this emerges naturally from the need to communicate and coordinate efforts between teams.

My particular heresy here is that I don’t think Conway’s Law needs to be the curse it’s made out to be. Communication will never not be expensive, but it’s also a subset of interaction. So if you look at how the nature of people’s interactions with and expectations from a communication channel are changing, you can use it as a kind of oracle to predict what the next evolutionary step of a product should look like.

At the highest level, some 23 years after Netscape Navigator 1.0 came out, the way we interact with a browser is pretty much the same as it ever was; we open it, poke around it and close it. Sure, we poke around a lot more things, and they’re way cooler and have a lot more people on the far end of them but… for the most part, that’s it.

That was all that you could do in the 90’s, because that’s pretty much all that interacting with the web of the 90’s could let you do. The nature of the Web has changed profoundly since then, and like I’ve said before, the web is everywhere and in everything now. But despite that, and the fact that browsers are very different beasts now than they were when the Web was taking its first tentative steps, that high-level interaction model has stayed pretty much the same.

But if the web is everywhere and in everything, then an interaction that involves opening an app, looking through it and closing it again seems incredibly antiquated, like you’re looking out a porthole in the side of a steamship. Even the name is telling: you don’t “browse” the web anymore. You engage with it, you interact with it, and with people, groups and businesses through it.

Another way to say that is the next generation of web browser won’t look like a browser at all: it will be a service.

More specifically I think the next generation of what we currently call a web browser will be a hybrid web-access service; like the current Web, it lives partly on a machine somewhere and partly on whatever device or devices are next to you, and acts as the intermediary – the user agent – that keeps you connected to this modern, always-on Web.

The app model is almost, kind-of-partway there, but in so many ways it makes life more complicated and less interesting than it needs to be. For the most part, apps only ever want to connect you to one place or set of people. Maybe that’s fine and that’s where your people are. But maybe you have to juggle a bunch of different communities in your life across a bunch of apps that go out of their way to keep those communities from discovering each other, and they all seem to want different slices of your life, your time and data depending on what the ad revenue people think is trendy this week. And because companies want to cover their bases you end up with these strange brands-pretending-to-be-people everywhere. It’s a mess, and having to juggle a bunch of different apps and communities doesn’t make a ton of sense when we’ve already got a reliable way of shipping safe, powerful software on demand.

I think the right – and probably next – thing is to push that complexity away from your device, to this user-agent-as-a-service living out there on a server in the cloud somewhere, just sitting there patiently paying attention. Notifications – a superset of messaging, and the other part of this picture – can come from anywhere and be anything, because internet, but your Agent can decide whether to forward them on directly, filter or bounce them, as you like. And if you decide to go out there and get something – a video, a file, a page, whatever – then your Agent can do all sorts of interesting work for you in-flight. Maybe you want ad filtering, maybe you paid for an antivirus service to give that file a once-over, maybe your employer has security protocols in place to add X or strip out Y. There’s lots of room there for competing notification services, agent providers and in-agent services, a marketplace of ideas-that-are-also-machines.

There’s a couple of things that browsers, for all their warts and dated ideas, do better than any app or monolithic service; most of those have to do with user intent, the desire for safety and privacy, but also the desires for novelty, variety and unique humanity. I’ve talked about this before, the idea of engineering freedom in depth. I still think it’s possible to build human-facing systems that can – without compromise – mitigate the possibility of harm, and mount a positive defense of the scope of the possible. And I think maybe this is one way to do that.

(Updated: Typos, phrasing, added some links.)
