Never Work In Theory, Spring 2023

February 27, 2023

Indulge me for a minute; I’d like to tell you about a conference I’m helping organize, and why. But first, I want to tell you a story about measuring things, and the tools we use to do that.

Specifically, I want to talk about thermometers.

Even though a rough understanding of the basic principles behind the tool we now call a thermometer is at least two thousand years old, for centuries the whole idea that you could measure temperature at all was fantastical, even absurd; how could you possibly measure an experience as subjective and ethereal as temperature?

The Dalence thermometer, one of the rare scientific apparatuses notable for closely resembling an angry man sneaking a bong rip.

Even though you could demonstrate the basic principles involved in ancient Greece with nothing more than glass tubes and a fire, the question itself was nonsense, like asking how much a poem weighs, or how much water you could pour out of a sunset.

It was more than 1600 years between the earliest known glass-tube demonstrations and Santorio Santorio’s decision to put a ruler to the side of one of those glass tubes; it was most of a century after that before Carlo Renaldini went ahead and tried Christiaan Huygens’ idea of using the freezing and boiling points of water as the anchor points of a linear scale. (Sir Isaac Newton followed that up with a proposal that the increments of that gradient be “12”, a decision I’m glad we didn’t stick with. Anders Celsius’ idea was better.)

The first tools we’d recognize as “modern thermometers” – using mercury, one of those unfortunately-reasonable-at-the-time decisions that have had distressing long-term consequences – were invented by Fahrenheit in 1714. More tragically, he also proposed the scale that bears his name, but: the tool worked, and if there’s one thing in tech that we all know and fear, it’s that there’s nothing quite as permanent as something temporary that works.

By 1900, Henry Bolton – author of “The Evolution of the Thermometer, 1592-1743” – had described this long evolution as “encumbered with erroneous statements that have been reiterated with such dogmatism that they have received the false stamp of authority”, a phrase that a lot of us in tech, I suspect, find painfully familiar.

Today, of course, outside of the most extreme margins – things get pretty dicey down in the quantum froth around absolute zero, and when your energy densities are way up past the plasmas – these questions are behind us. Thermometers are real, temperatures can be very precisely measured, and that has enabled a universe of new possibilities across physics and chemistry, from metallurgy to medicine to precision manufacturing, too many things to mention.

The practice of computation, as a field, is less than a century old. We sometimes measure the things we can measure, usually the things that are easiest to measure, but at the intersection of humans and computers – the most important part of the exercise – this field is still deeply and dogmatically superstitious. The false stamps of authority are everywhere.

I mean, look at this. Look at it. Tell me that isn’t kabbalist occultism, delivered via PowerPoint.

This is where we are, but we can do better.

On Tuesday, April 25, and Wednesday, April 26, It Will Never Work in Theory is running our third live event: a set of lightning talks from leading software engineering researchers on immediate, actionable results from their work.

I want to introduce you to the people building the thermometers of modern software engineering.

Some of last year’s highlights included the introduction of novel techniques like Causal Fairness Testing, supercharging DB test suites with SQLancer, and two approaches for debugging neural nets – and none of these are hypothetical, someday-in-the-future ideas. These are tools you can start using now. That’s the goal.

And it should be a lot of fun. I hope to see you there.

Never Work In Theory:

The event page: