ongoing by Tim Bray

ongoing fragmented essay by Tim Bray

Spring Flowers, 2021 11 Apr 2021, 7:00 pm

48 hours ago I got my first Covid-19 vaccine dose, and today I took the camera for a stroll, hunting spring flowers. What a long strange trip it’s been.

Timothy Bray was vaccinated April 9th, 2021

Should I be concerned that the drugstore guy didn’t bother to sign? By the way, CHADOX1-S RECOMBINANT is better known as AstraZeneca.

Vaccinated how?

They’re currently working their way through really old people and other targeted groups like teachers and some industrial workers with the Pfizer and Moderna. There seem to be a fair number of AZ doses arriving, and they’re not recommended for people under 55. So those of us in the 55-65 bracket can sign up at pharmacy branches; I did a couple of weeks back and got an SMS Friday morning.

I felt really baffed out and sore the day after, and just a bit sore today; nothing a bit of Ibuprofen can’t handle. Apparently in Canada we’re on multi-month delay between shots; so it’s not clear when I can go see my Mom, who got her first Pfizer dose on March 19th.

April 2021

It’s been a cold blustery spring but that doesn’t seem to bother the botanicals, especially the fruit trees, some of whom have already peaked.

Flowering fruit tree in Vancouver’s Riley Park neighborhood

Spot the clothesline.

Nobody I know closely has been struck down by Covid, but people I love are suffering from one ailment or another as I write because that’s how life is. I live in a country with seasons, which means we are all subject to morale-boosting sensory stimuli at this time of year. Grab hold of them! We can all use all the help we can get.

Tiny flowers, April in Vancouver

From here on in, the pictures are courtesy of the strong-willed 40-year-old Pentax 100/F2.8.

What comes next, as the vaccinated proportion of the population grows monotonically (but asymptotically) and then the wave of second doses washes up behind the first’s?

I guess I’m talking about people in the privileged parts of the world, since it looks like the spread of vaccinations will be measurably, irrefutably, deeply racist and joined at the hip with the world’s egregiously awful class structure.

I just want to go to a rock concert.

These rhodos are ready to burst.

Rhododendron blossoms about to open

I’m kind of jaded about daffodils but this one was so pretty against the backdrop that I couldn’t resist.

Daffodil with something pink behind

I hope everyone reading this has something to look forward to so that getting out of bed Monday morning is more than just a chore. Failing that, if you’re in the Northern Hemisphere anyhow, there are incoming flowers. Tell them I said hello.

Mixed flowers in April in Vancouver

The Sacred “Back” Button 10 Apr 2021, 7:00 pm

Younger readers will find it hard to conceive of a time in which every application screen didn’t have a way to “Go Back”. This universal affordance was there, a new thing, in the first Web browser that anyone saw, and pretty soon after that, more or less everything had it. It’s a crucial part of the user experience and, unfortunately, a lot of popular software is doing it imperfectly. Let’s demand perfection.

Why it matters

Nobody anywhere is smart enough to build an application that won’t, in some situations, confuse its users. The Back option removes fear and makes people more willing to explore features, because they know they can always back out. It was one of the reasons why the nascent browsers were so much better than the Visual Basic, X11, and character-based interface dinosaurs that then stomped the earth.

Thus I was delighted, at the advent of Android, that the early phones had physical “back” buttons.

Early Android “G1” phone

The Android “G1 Developer Phone”, from 2008. Reproduced from my Android Diary series, which I flatter myself was influential at the time.

I got so excited I wrote a whole blog piece about it.

Nowadays Android phones don’t have the button, but do offer a universal “Back” gesture and, as an Android developer, you don’t have to do anything special to get sane, user-friendly behavior. I notice that when I use iOS apps, they always provide a back arrow somewhere up in the top left corner; don’t know if that costs developers extra work.


The most important reason I’m listing these problems is to offer a general message: When you’re designing your UX, think hard about the Back affordance! I have seen intelligent people driven to tears when they get stuck somewhere and can’t back out.

People using your software generally have a well-developed expectation of what Back should do at any point in time, and any time you don’t meet that expectation you’ve committed a grievous sin, one you should remedy right now.

Problem: The Android Back Stack

Since we started with mobile, let’s talk Android. The “Activities” that make up Android apps naturally form a Back Stack as you follow links from one to the other. It turns out that Android makes it possible to compose and manipulate your stack. One example would be Twitter, which occasionally mails me about people I follow having tweeted something that it thinks might interest me, when I haven’t been on for a while. It’s successful enough that I haven’t stopped it.

When I click on the link, it leaps straight to the tweet in question. But when I hit Back, I don’t return to my email because Twitter has interposed multiple layers of itself on the stack. So it takes several hops to get back to where I followed the link from.

This is whiny, attention-starved behavior. I’m not saying that Back should always 100% revert to the state immediately before the forward step; I’ve seen stack manipulation be useful. But this isn’t, it’s just pathetic.

Problem: SPA foolery

When, in my browser, I click on something and end up looking at something, and then I’m tired of looking at it and go Back, I should go back to where I started. Yes, I know there’s a convention, when an image pops up, that it’ll go away if you hit ESC. And there’s nothing wrong with that. But Back should work too. The Twitter Web interface does the right thing here when I open up a picture. ESC works, but so does Back.

Feedly doesn’t get this right; if you’re in a feed and click a post, it pops to the front and hitting ESC is the only way to make it go away; Back takes you to a previous feed! (Grrrr.) Also zap2it, where I go for TV listings, has the same behavior; Back takes you right out of the listing.

(Zap2it demonstrates another problem. I have my local listings in a mobile browser bookmark on one of my Android screens, which opens up Chrome with the listings. Except the tab is somehow magically different: if I flip away from Chrome and then return to it, the tab is gone. Hmm.)

Problem: Chrome new-tab links

When I’m in Chrome and follow a link that opens a new tab, Back just doesn’t work. If I close the tab, for example with Command-W on Mac, it hops back to the source link. In what universe is this considered a good UX? To make things worse, there are many things that make Chrome forget the connection between the tab in question and its source link, for example taking a look at a third tab. Blecch.

Fortunately, Safari is saner. Well, mostly…

Problem: Safari new-tab links

I’m in Safari and I follow a link from my Gmail tab to wherever and it ends up in a new tab. Then, when I hit Back, there I am back in Gmail, as a side-effect closing the freshly-opened tab. Which is sane, rational, unsurprising behavior. (It’s exactly the same effect you get by closing the tab on Chrome, which is why Chrome should use Back to achieve that effect.)

Except, did I mention that the browser forgets? Safari’s back-linkage memory is encoded in letters of dust inscribed on last month’s cobwebs. More or less any interaction with the browser and it’s No More “Back” For You, Kid.

Especially irritating are pages that intercept scroll-down requests with a bunch of deranged JavaScript fuckery (that I darkly suspect is aimed at optimizing ad exposures) and that my browser interprets as substantive enough to mean “No More Back For You”. I swear I worry about farting too loudly because that might give Safari an excuse to forget, out of sheer prissiness.

Please let ’em back out

Screwing with the path backward through your product is inhumane, stupid, and unnecessary. Don’t be the one that gets in people’s way when they just want to step back.

Long Links 1 Apr 2021, 7:00 pm

Welcome to the monthly “Long Links” post for March 2021, in which I take advantage of my lightly-employed status to curate a list of pointers to good long-form stuff that I have time to savor but you probably don’t, but which you might enjoy one or two of. This month there’s lots of video, a heavier focus on music, and some talk about my former employer.

What with everything else happening in the world, people outside of Australia may not have noticed that it had a nasty sex scandal recently. My sympathy to the victims, and my admiration to Australian of the Year Grace Tame, whose full National Press Club address is searing and heart-wrenching. I don’t know much else about what Ms Tame has done, but I’d award her the honor for the speech alone, which a lot of people need to listen to. Probably including you; I know I did.

"The ocean takes care of that for us", by Fiona Beaty, thinks elegantly and eloquently about the relationship between oceanfront humans and the ocean. Indigenous nations saw the ocean as a self-sustaining larder, and it could be that again. Assuming we can learn to act like adults in our relationship to the planet we live on.

I’m a music lover and an audio geek, and like most such people, have long lamented the brutal dynamic-range compression applied to popular music with the goal of making sure that it’s never not as loud as the other songs it’s being sequenced with on the car radio. The New Standard That Killed the Loudness War points out that the music-streaming landscape, a financial wasteland for musicians mind you, is at least friendlier to accuracy in audio. I especially love it when a live band gets into a vamp and the singer says “take it down, now”, and they drift down then surge back. No reason pop recordings shouldn’t use that technique; it’s basic to classical music. Now they can.

What Key is Hey Joe In? (YouTube). By watching this I learned that Hendrix didn’t actually write Hey Joe. It’s not 100% clear who actually did write it, and it’s also unclear what key it’s in. Adam Neely has an unreasonable amount of fun exploring this, and there isn’t a simple answer. The most useful thing you can say is that it’s designed to sound good on a conventionally-tuned guitar. If you’re not literate in music theory you’ll miss some of the finer points, but you might still enjoy this.

One of the pointers I followed out of this video was to a massive blog piece by Ethan Hein entitled Blues tonality, which I’m going to say covers the subject exhaustively but also entertainingly, with lots of cool embedded videos to reinforce his musical points. Some of which aren’t music that any sane person would think of as Blues.

And still more music! My streaming service offered up a number by Emancipator and I found myself thinking “Damn, that’s beautiful, who is it?” Behind the name “Emancipator” is Douglas Appling, a producer and DJ who decided to be a musician too, and am I ever glad he did. The music is mostly pretty smooth and you might be forgiven for thinking “nice chill lightweight stuff” but I think there’s a lot of there there. The DJ influence is pretty plain, but to me, this sounds more like classical music than anything else; carefully composed and sequenced with a lot of attention to highlighting the timbres of the instruments. Mr Appling has put together a band to take the music on the road and I think it’d be a fun show. Here’s a YouTube: Emancipator Ensemble, live in 2018.

I hate to end the musical segment here on a downer, but remember how I mentioned that the streaming landscape is a place where musicians go to starve? Islands in the Stream dives at length into the troubled and massively dysfunctional relationship between music and the music business. This picture has been rendered still darker, of course, by Covid, which has taken musicians off the road, the last place where they can sometimes make a decent buck for offering decent music. At some point a truly enlightened government will introduce a minimum wage for musicians, which means that the price you pay to stream will probably have to go up; Sorry not sorry.

Let’s move over to politics. Like most people, I read FiveThirtyEight for the poll-wrangling and stats, but sometimes they unleash a smart writer in an interesting direction. The smart writer in this case is Perry Bacon, Jr; in The Ideas That Are Reshaping The Democratic Party And America, he itemizes the current progressive consensus in clear and even-handed language. This being 538, there are of course numbers and graphs, and a profusion of links to source data, making this what I would call a scholarly work. Most important phrase, I think: “many of these views are evidence-based — rooted in a lot of data, history and research.” The piece is the first of a two-part series. Next up is Why Attacking ‘Cancel Culture’ And ‘Woke’ People Is Becoming The GOP’s New Political Strategy. In case you hadn’t noticed, the American right, so far this year, has largely abandoned discussing actual policy issues and has retreated into an extended howl of outrage about how “woke” people are trampling free speech via “cancel culture”. Since this is coming from a faction that enjoys being led by Donald Trump, it’s too much to expect integrity or intellectual rigor in their arguments. But from an analytical point of view, who cares? What matters is whether or not the stratagem will work. The evidence on that is, well, mixed.

These days, a lot of politics coverage seems to involve my former employer Amazon. How Amazon’s Anti-Union Consultants Are Trying to Crush the Labor Movement is not trying to convince you of anything, it is simply a tour through America’s anti-unionization establishment and the tools Amazon has been deploying nationwide and in Alabama. They’re spending really a lot of money. What on Earth Is Amazon Doing? is a well-written survey of the company’s late-March social-media offensive, kicking sand in legislators’ faces and pooh-poohing the peeing-in-bottles stories. Amazon is a well-run company but nobody would call this a well-run PR exercise. Is this a well-thought-out eight-dimensional chess move, or did leadership just briefly lose its shit?

The most important Amazon-related piece, I thought, was A Shopper’s Heaven by Charlie Jarvis in Real Life Magazine, which I’ve not previously encountered. It’s building on the same territory that I did in Just Too Efficient (by a wide margin the most radical thing I’ve ever published) — at some point, the relentless pursuit of convenience and efficiency becomes toxic, and we are way way past that point.

OK, enough about Amazon. But let’s beat up on the tech business some more, just for fun this time, with How to Become an Intellectual in Silicon Valley, an exquisitely pissy deconstruction of Bay Aryan thought leaders. Yes, it is indeed mean-spirited, but seriously, those people brought it on themselves.

I recommend John Scalzi’s Teaching “The Classics”, which wonders out loud why high-school students still have Hawthorne and Fitzgerald inflicted on them. There is one faction who feels that those Books By Dead White Guys are essential in crafting a well-rounded human, and others who argue that it’s time to walk away from those monuments to overwriting built on foundations most well-educated people now find morally repugnant. Scalzi finds fresh and entertaining things to say on the subject.

Let’s try to end on a high note. There was a news story about Wikipedia noticing that their content is mined and used by multiple for-profit concerns, using access methods (I’m not going to dignify them with the term “APIs”) that are not designed for purpose. Following on this, they had the idea of building decent APIs to make it convenient, reliable, and efficient to harvest Wikipedia data, and charging for their use, thus generating a revenue stream for long-term support of Wikipedia’s work. This is both promising and perilous — fuck with the Wikipedia editorial community’s loathing for most online business models at your peril. Anyhow, Wikimedia Enterprise/Essay is the best insider’s look at the idea that I’ve run across. [Disclosure: I’ve had a couple of conversations with these people because I’d really like to help.]

And finally, a tribute to one of my personal favorite online nonprofits: The internet is splitting apart. The Internet Archive wants to save it all forever. Just read it.

31 Mar 2021, 7:00 pm

I’m in fast-follow mode here, with more Topfew reportage. Previous chapters (reverse chrono order) here, here, and here. Fortunately I’m not going to need 3500 words this time, but you probably need to have read the most recent chapter for this to make sense. Tl;dr: It’s a whole lot faster now, mostly due to work from Simon Fell. My feeling now is that the code is up against the limits and I’d be surprised if any implementation were noticeably faster. Not saying it won’t happen, just that I’d be surprised. With a retake on the Amdahl’s-law graphics that will please concurrency geeks.

What we did

I got the first PR from Simon remarkably soon after posting Topfew and Amdahl. All the commits are here, but to summarize: Simon knows the Go API landscape better than I do and also spotted lots of opportunities I’d missed to avoid allocating or extending memory. I spotted one place where we could eliminate a few million map[] updates.

Side-trip: map[] details

(This is a little bit complicated but will entertain Gophers.)

The code keeps the counts in a map[string]*uint64. Because the value is a pointer to the count, you really only need to update the map when you find a new key; otherwise you just look up the value and say something like

countP, exists := counts[key]
if exists {
	*countP++
} else {
	// new key: update the map with a pointer to a fresh count
	one := uint64(1)
	counts[key] = &one
}
It’d be reasonable to wonder whether it wouldn’t be simpler to just say:

counts[key] = countP // usually a no-op

Except that Topfew keeps its data in []byte to avoid creating millions of short-lived strings. But unfortunately, Go doesn’t let you key a map with []byte, so when you reference the map you say things like counts[string(keybytes)]. That turns out to be efficient because of this code, which may at first glance appear an egregious hack but is actually a fine piece of pragmatic engineering: recognizing when a map is being keyed by a stringified byte slice, the compiled code dodges creating the string.

But of course if you’re updating the map, it has to create a string so it can retain something immutable to do hash collision resolution on.

For all those reasons, the if exists code above runs faster than updating the map every time, even when almost all those updates logically amount to no-ops.
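Putting the pieces of the side-trip together, here’s a minimal sketch of the pointer-valued-counter technique (my own illustration, not Topfew’s actual code): one lookup, and only a miss pays for the string allocation and map update.

```go
package main

import "fmt"

// count bumps the occurrence count for keybytes. On a hit, the
// counts[string(keybytes)] lookup avoids allocating a string (the Go
// compiler recognizes this form); only on a miss, when the map must
// retain an immutable copy of the key, is the string really created.
func count(counts map[string]*uint64, keybytes []byte) {
	countP, exists := counts[string(keybytes)]
	if exists {
		*countP++
	} else {
		// new key: allocate a counter and update the map
		one := uint64(1)
		counts[string(keybytes)] = &one
	}
}

func main() {
	counts := map[string]*uint64{}
	for _, k := range []string{"x", "x", "y"} {
		count(counts, []byte(k))
	}
	fmt.Println(*counts["x"], *counts["y"]) // prints "2 1"
}
```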

Back to concurrency

Bearing in mind that Topfew works by splitting up the input file into segments and farming the occurrence-counting work out to per-segment threads, here’s what we got:

  1. A big reduction in segment-worker CPU by creating fewer strings and pre-allocating slices.

  2. Consequent on this, a big reduction in garbage creation, which matters because garbage collection is necessarily single-threaded to some degree and thus on the “critical path” in Amdahl’s-law terms.

  3. Modest speedups in the (single-threaded, thus important) occurrence-counting code.

But Simon was just getting warmed up. Topfew used to interleave filtering and field manipulation in the segment workers with a batched mutexed call into the single-threaded counter. He changed that so the counting and ranking is done in the segment workers and when they’re all finished they send their output through a channel to an admirably simple-minded merge routine.
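In outline, the revised architecture looks something like this (a simplified sketch with strings standing in for log records, not the real Topfew code): each worker counts its own segment into a private map, then ships the whole map down a channel to a simple-minded merge loop, so there’s no mutex on the hot path.

```go
package main

import (
	"fmt"
	"sync"
)

// mergeCounts fans the counting work out to one goroutine per segment;
// each builds a private count map and sends it down a channel. A single
// merge loop then folds the per-segment maps together.
func mergeCounts(segments [][]string) map[string]uint64 {
	ch := make(chan map[string]uint64)
	var wg sync.WaitGroup
	for _, seg := range segments {
		wg.Add(1)
		go func(keys []string) {
			defer wg.Done()
			local := map[string]uint64{}
			for _, k := range keys {
				local[k]++
			}
			ch <- local
		}(seg)
	}
	// close the channel once every worker has reported in
	go func() { wg.Wait(); close(ch) }()

	total := map[string]uint64{}
	for local := range ch {
		for k, n := range local {
			total[k] += n
		}
	}
	return total
}

func main() {
	total := mergeCounts([][]string{{"a", "b"}, {"a", "c"}})
	fmt.Println(total["a"], total["b"], total["c"]) // prints "2 1 1"
}
```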

Anyhow, here’s a re-take of the Go-vs-Rust typical-times readout from last time:

Go (03/27): 11.01s user 2.18s system  668% cpu 1.973 total
      Rust: 10.85s user 1.42s system 1143% cpu 1.073 total
Go (03/31):  7.39s user 1.54s system 1245% cpu 0.717 total

(Simon’s now a committer on the project.)

Amdahl’s Law in pictures

This is my favorite part.

First, the graph that shows how much parallelism we can get by dividing the file into more and more segments. Which turns out to be pretty good in the case of this particular task, until the parallel work starts jamming up behind whatever proportion is single-threaded.

You are rarely going to see a concurrency graph that’s this linear.

CPU usage as a function of the number of cores requested

Then, the graph that shows how much execution speeds up as we deal the work out to more and more segments.

Elapsed time as a function of the number of cores requested

Which turns out to be a lot up front, then a bit more, then none at all, per Amdahl’s Law.

Go thoughts

I like Go for a bunch of reasons, but the most important is that it’s a small simple language that’s easy to write and easy to read. So it bothers me a bit that to squeeze out these pedal-to-the-metal results, Simon and I had to fight against the language a bit and use occasionally non-intuitive techniques.

On the one hand, it’d be great if the language squeezed the best performance out of slightly more naive implementations. (Which by the way Java is pretty good at, after all these years.) On the other hand, it’s not that often that you really need to get the pedal this close to the metal. But command-line utilities applied to Big Data… well, they need to be fast.

Topfew and Amdahl 27 Mar 2021, 7:00 pm

On and off this past year, I’ve been fooling around with a program called Topfew (GitHub link), blogging about it in Topfew fun and More Topfew Fun. I’ve just finished adding a few nifty features and making it much faster; I’m here today first to say what’s new, and then to think out loud about concurrent data processing, Go vs Rust, and Amdahl’s Law, of which I have a really nice graphical representation. Apologies because this is kind of long, but I suspect that most people who are interested in either are interested in both.


What Topfew does is replace the sort | uniq -c | sort -rn | head pipeline that you use to do things like find the most popular API call or API caller by ploughing through a logfile.

When last we spoke…

I had a Topfew implementation in the Go language running fine, then Dirkjan Ochtman implemented it in Rust, and his code ran several times faster than mine, which annoyed me. So I did a bunch more optimizations and claimed to have caught up, but I was wrong, for which I apologize to Dirkjan — I hadn’t pulled the most recent version of his code.

One of the big reasons Dirkjan’s version was faster was that he read the input file in parallel in segments, which is a good way to get data processed faster on modern multi-core processors. Assuming, of course, that your I/O path has good concurrency support, which it might or might not.

[Correction: Thomas Jung writes to tell me he implemented the parallel processing in rust_rs. He wrote an interesting blog piece about it, also comparing Xeon and ARM hardware.]

So I finally got around to implementing that and sure enough, runtimes are way down. But on reasonable benchmarks, the Rust version is still faster. How much? Well, that depends. In any case both are pleasingly fast. I’ll get into the benchmarking details later, and the interesting question of why the Rust runs faster, and whether the difference is practically meaningful. But first…

Is parallel I/O any use?

Processing the file in parallel gets the job done really a lot faster. But I wasn’t convinced that it was even useful. Because on the many occasions when I’ve slogged away trying to extract useful truth from big honkin’ log files, I almost always have to start with a pipeline full of grep and sed calls to zero in on the records I care about before I can start computing the high occurrence counts.

So, suppose I want to look in my Apache logfile to find out which files are being fetched most often by the popular robots, I’d use something like this:

egrep 'googlebot|bingbot|Twitterbot' access_log | \
    awk ' {print $7}' | sort | uniq -c | sort -rn | head

Or, now that I have Topfew:

egrep 'googlebot|bingbot|Twitterbot' access_log | tf -f 7

Which is faster than the sort chain but there’s no chance to parallelize processing standard input. Then the lightbulb went on…

If -f 1 stands in for awk ' { print $1}' and distributes that work out for parallel processing, why shouldn’t I have -g for grep and -v for grep -v and -s for sed?

Topfew by example

To find the IP address that most commonly hits your web site, given an Apache logfile named access_log:

tf -f 1 access_log

Do the same, but exclude high-traffic bots. The -v option has the effect of grep -v.

tf -f 1 -v googlebot -v bingbot

(Omitting access_log from here on.)

The opposite; what files are the bots fetching? As you have probably guessed, -g is like grep.

tf -f 7 -g 'googlebot|bingbot|Twitterbot'

Most popular IP addresses from May 2020.

tf -f 1 -g '\[../May/2020'

Let’s rank the hours of the day by how much request traffic they get.

tf -f 4 -s "\\[[^:]*:" "" -s ':.*$' '' -n 24

So Topfew distributes all that filtering and stream-editing out and runs it in parallel, since it’s all independent, and then pumps it over (mutexed, heavily buffered) to the (necessarily) single thread that does the top-few counting. All of the above run dramatically faster than their shell-pipeline equivalents. And they weren’t exactly rocket science to build; Go has a perfectly decent regexp library that even has a regexp.ReplaceAll call that does the sed stuff for you.

I found that getting the regular expressions right was tricky, so Topfew also has a --sample option that prints out what amounts to a debug stream showing which records it’s accepting and rejecting, and how the keys are being stream-edited.

Almost ready for prime time

This is now a useful tool, for me anyhow. It’s replaced the shell pipeline that I use to see what’s popular in the blog this week. The version on github right now is pretty well-tested and seems to work fine; if you spend much time doing what at AWS we used to call log-diving, you might want to grab it off GitHub.

In the near future I’m going to use GoReleaser so it’ll be easier to pick up from whatever your usual tool depot is. And until then, I reserve the right to change option names and so on.

On the other hand, Dirkjan may be motivated to expand his Rust version, which would probably be faster. But, as I’m about to argue, the speedup may not be meaningful in production.

Open questions

There are plenty.

  1. Why is Rust faster than Go?

  2. How do you measure performance, anyhow…

  3. … and how do you profile Go?

  4. Shouldn’t you use mmap?

  5. What does Gene Amdahl think about concurrency, and does Topfew agree?

  6. Didn’t you do all this work a dozen years ago?

Which I’ll take out of order.

How to measure performance?

I’m using a 3.2GB file containing 13.3 million lines of Apache logfile, half from 2007 and half from 2020. The 2020 content is interesting because it includes the logs from around my Amazon rage-quit post, which was fetched more than everything else put together for several weeks in a row; so the data is usefully non-uniform.

The thing that makes benchmarking difficult is that this kind of thing is obviously I/O-limited. And after you’ve run the benchmark a few times, the data’s migrated into memory via filesystem caching. My Mac has 32G of RAM so this happens pretty quick.

So what I did was just embrace this by doing a few setup runs before I started measuring anything, until the runtimes stabilized and presumably little to no disk I/O is involved. This means that my results will not replicate your experience when you point Topfew at your own huge logfile which it actually has to read off disk. But the technique does allow me to focus in on, and optimize, the actual compute.

How do you profile Go?

Go comes with a built-in profiler called “pprof”. You may have noticed that the previous sentence does not contain a link, because the current state of pprof documentation is miserable. The overwhelming googlejuice favorite is Profiling Go Programs from the Golang blog in 2011. It tells you lots of useful things, but the first thing you notice is that the pprof output you see in 2021 looks nothing like what that blog describes.

You have to instrument your code to write profile data, which is easy and seems to cause shockingly little runtime slowdown. Then you can get it to provide a call graph either as a PDF file or in-browser via its own built-in HTTP server. I actually prefer the PDF because the Web presentation has hair-trigger pan/zoom response to the point that I have trouble navigating to the part of the graph I want to look at.

While I’m having trouble figuring out what some of the numbers mean, I think the output is saying something that’s useful; you can be the judge a little further in.

Why is Rust faster?

Let’s start by looking at the simplest possible case, scanning the whole log to figure out which URL was retrieved the most. The required argument is the same on both sides: -f 7. Here is output from typical runs of the current Topfew and Dirkjan’s Rust code.

  Go: 11.01s user 2.18s system  668% cpu 1.973 total
Rust: 10.85s user 1.42s system 1143% cpu 1.073 total

The two things that stick out are that Rust is getting better concurrency and using less system time. This Mac has eight two-thread cores, so neither implementation is maxing it out. Let’s use pprof to see what’s happening inside the Go code. BTW if someone wants to look at my pprof output and explain how I’m woefully misusing it, ping me and I’ll send it over.

The profiling run’s numbers: 11.97s user 2.76s system 634% cpu 2.324 total; like I said, profiling Go seems to be pretty cheap. Anyhow, that’s 14.73 seconds of compute between user and system. The PDF of the code graph is too huge to put inline, but here it is if you want a look. I’ll excerpt screenshots. First, here’s one from near the top:

Top of the Go profile output

So, over half the time is in ReadBytes (Go’s equivalent of ReadLine); if you follow that call-chain down, at the bottom is syscall, which consumes 55.36%. I’m not sure if these numbers are elapsed time or compute time and I’m having trouble finding help in the docs.

Moving down to the middle of the call graph:

Near the middle of the Go profile output

It’s complicated, but I think the message is that Go is putting quite a bit of work into memory management and garbage collection. Which isn’t surprising, since this task is a garbage fountain, reading millions of records and keeping hardly any of that data around.

The amount of actual garbage-collection time isn’t that big, but I also wonder how single-threaded it is, because as we’ll see below, that matters a lot.

Finally, down near the bottom of the graph:

Near the bottom of the Go profile output

The meaning of this is not obvious to me, but the file-reading threads use the Lock() and Unlock() calls from Go’s sync.Mutex to mediate access to the occurrence-counting thread. So what are those 2.02s and 1.32s numbers down at the bottom of a “cpu” graph? Is the implementation spending three and a half seconds implementing mutex?

You may notice that I haven’t mentioned application code. That code, for pulling out the seventh field and tracking the top-occurring keys, seems to contribute less than 2% of the total time reported.

My guesses

Clearly, I need to do more work on making better use of pprof. But based on my initial research, I am left with the suspicions that Rust buffers I/O better (less system time), enjoys the benefits of forcing memory management onto the user, and (maybe) has a more efficient wait/signal primitive. No smoking pistols here.

I’m reminded of an internal argument at AWS involving a bunch of Principal Engineers about which language to use for something, and a Really Smart Person who I respect a lot said “Eh, if you can afford GC latency use Go, and if you can’t, use Rust.”

Shouldn’t you use mmap?

Don’t think so. I tried it on a few different systems and mmap was not noticeably faster than just reading the file. Given the dictum that “disk is the new tape”, I bet modern filesystems are really super-optimized at sweeping sequentially through files, which is what Topfew by definition has to do.

What does Gene Amdahl think about concurrency, and does Topfew agree?

Amdahl’s Law says that for every computing task, some parts can be parallelized and some can’t. So the amount of speedup you can get by cranking the concurrency is limited by that. Suppose that 50% of your job has to be single-threaded: Then even with infinite concurrency, you can never even double the overall speed.

For Topfew, my measurements suggest that the single-threaded part — finding top occurrence counts — is fairly small compared to the task of reading and filtering the data. Here’s a graph of a simple Topfew run with a bunch of different concurrency fan-outs.

Graph of Topfew performance vs core count

Which says: Concurrency helps, but only up to a point. The graph stops at eight because that’s where the runtime stopped decreasing.

Let’s really dig into Amdahl’s Law. We need to increase the compute load. We’ll run that query that focuses on the popular bots. First of all I did it the old-fashioned way:

egrep 'googlebot|bingbot|Twitterbot' test/data/big | bin/tf -f 7
    90.82s user 1.16s system 99% cpu 1:32.48 total

Interestingly, that regexp turns out to be pretty hard work. There was 1:32.48 elapsed, and the egrep user CPU time was 1:31. So the Topfew time vanished in the static. Note that we only used 99% of one CPU. Now let’s parallelize, step by step.

Reported CPU usage vs. number of cores

Look at that! As I increase the number of file segments to be scanned in parallel, the reported CPU usage goes up linearly until you get to about eight (reported: 774% CPU) then starts to fall off gently until it maxes out at about 13½ effective CPUs. Two questions: Why does it start to fall off, and what does the total elapsed time for the job look like?

Elapsed time as a function of number of cores

Paging Dr. Amdahl, please! This is crystal-clear. You can tie up most of the CPUs the box you’re running on has, but eventually your runtime is hard-limited by the part of the problem that’s single-threaded. The reason this example works so well is that the grep-for-bots throws away about 98.5% of the lines in the file, so the top-occurrences counter is doing almost no meaningful work, compared to the heavy lifting by the regexp appliers.

That also explains why the effective-CPU-usage never gets up much past 13; the threads can regexp through the file segments in parallel, but eventually there’ll be more and more waiting for the single-threaded part of the system to catch up.

And exactly what is the single-threaded part of the system? Well, my own payload code that counts occurrences. But Go brings along a pretty considerable runtime that helps most notably with garbage collection but also with I/O and other stuff. Inevitably, some proportion of it is going to have to be single-threaded. I wonder how much of the single-threaded part is application code and how much is Go runtime?

I fantasize a runtime dashboard that has pie charts for each of the 16 CPUs showing how much of their time is going into regexp bashing, how much into occurrence counting, how much into Go runtime, and how much into operating-system support. One can dream.

Update: More evidence

Since writing this, I’ve added a significant optimization. In the (very common) case where there’s a single field being used for top-few counting, I don’t copy any bytes, I just use a sub-slice of the “record” slice. Also, Simon Fell figured out a way to do one less string creation for regexp filtering. Both of these are in the parallelizable part of the program, and neither made a damn bit of difference on elapsed times. At this point, the single-threaded code, be it in Topfew or in the Go runtime, seems to be the critical path.
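To make the zero-copy idea concrete, here's the shape of a sub-slicing field extractor. This is a hypothetical sketch (`field` and its space-only delimiting are my inventions, not Topfew's actual code); the key point is that the returned slice aliases the record's backing array, so nothing is copied:

```go
package main

import "bytes"

// field returns the n'th (0-based) space-separated field of record as a
// sub-slice of record itself: no bytes are copied. Because the result
// aliases record's backing array, it's only valid while that buffer is;
// that's the trade-off that makes this work for the single-field case.
func field(record []byte, n int) []byte {
	start := 0
	for i := 0; i < n; i++ {
		next := bytes.IndexByte(record[start:], ' ')
		if next < 0 {
			return nil // fewer than n+1 fields
		}
		start += next + 1
	}
	if end := bytes.IndexByte(record[start:], ' '); end >= 0 {
		return record[start : start+end]
	}
	return record[start:] // last field runs to the end of the record
}
```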

How many cores should you use?

It turns out that in Go there’s this API called runtime.NumCPU() that returns how many processors Go thinks it’s running on; it returns 16 on my Mac. So by default, Topfew divides the file into that many segments. Which, if you look at the bottom graph above, is suboptimal. It doesn’t worsen the elapsed time, but it does burn a bunch of extra CPU to no useful effect. Topfew has a -w (or --width) option to let you specify how many file segments to process concurrently; maybe you can do better?

I think the best answer is going to depend, not just on how many CPUs you have, but on what kind of CPUs they are, and (maybe more important) what kind of storage you’re reading, how many paths it has into memory, how well its controller interleaves requests, and so on. Not to mention RAM caching strategy and other things I’m not smart enough to know about.
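The default split itself is simple arithmetic; something like this sketch (names mine; real Topfew must also nudge each boundary forward to the next newline so no record is cut in half between readers):

```go
package main

// segments carves a size-byte file into n byte ranges [start, end), one
// per concurrent reader. Topfew's default n would come from
// runtime.NumCPU(); the -w / --width option overrides it.
func segments(size int64, n int) [][2]int64 {
	segs := make([][2]int64, n)
	chunk := size / int64(n)
	var start int64
	for i := 0; i < n; i++ {
		end := start + chunk
		if i == n-1 {
			end = size // the last segment absorbs the remainder
		}
		segs[i] = [2]int64{start, end}
		start = end
	}
	return segs
}
```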

Didn’t you do all this work a dozen years ago?

Well, kind of. Back in the day when I was working for Sun and we were trying to sell the T-series SPARC computers, which weren’t that fast but had good memory controllers and loads of CPU threads, I did a whole bunch of research and blogging on concurrent data processing; see The Wide Finder Project. Just now I glanced back at those (wow, that was a lot of work!) and to some extent this article revisits that territory. Which is OK by me.


Well, there are a few obvious features you could add to Topfew, for example custom field separators. But at this point I’m more interested in concurrency and Amdahl’s law and so on. I’ve almost inevitably missed a few important things in this fragment and I’m fairly confident the community will correct me.

Looking forward to that.

xmlwf -k 24 Mar 2021, 7:00 pm

What happened was, I needed a small improvement to Expat, probably the most widely-used XML parsing engine on the planet, so I coded it up and sent off a PR and it’s now in release 2.3.0. There’s nothing terribly interesting about the problem or the solution, but it certainly made me think about coding and tooling and so on. (Warning: Of zero interest to anyone who isn’t a professional programmer.)

Back story

As I mentioned last month, I took a little programming job partly as a favor to a friend, of writing a parser to transmute a huge number of antique IBM GML files into XML. It wasn’t terribly hard but there was quite a bit of input variation so I couldn’t be confident unless I checked that every single output file was proper XML (“well-formed”, we XML geeks say).

Fortunately there’s an Expat-based command-line tool called xmlwf that can scan XML files for errors and produce useful human-readable complaints, and it operates at obscene speed. So what I wanted to do was run my parser over a few hundred GML files and then say, essentially, xmlwf * in the output directory.

Which didn’t work because, until very recently, xmlwf would just stop when it encountered the first non-well-formed file. So I added a -k option (“k” for “keep going”) so it could run over a thousand or so files and helpfully complain about the two that were broken.

Lessons from the PR

Most important, I hadn’t realized how great the programming environment is inside Amazon. It’s all git, but there’s no need for branches or PR’s. You make your changes, you commit, you use the tooling to launch a code review, you argue, you make more changes, you (probably) commit --amend (unless you think multiple commits are more instructive for some reason), and this repeats until everyone’s happy and you push into the CI/CD vortex.

Obviously other people might be working on the same stuff so you might have to do a git pull --rebase and there might be pain sorting out the results but that’s what they pay us for. (Right?)

Anyhow, you end up with a nice clean commit sequence in your codebase history and nobody ever has to think about branches or PR’s. (Obviously some larger tasks require branches but you’d be amazed how much you can live without them.)

Finding: Pull requests

Now that I’m out in the real world, it’s How Things Are Done. For good reasons. Doesn’t mean I have to like them. As evidence, I offer How to Rebase a Pull Request. Ewwww.

Finding: Coding tools

The last time I edited actual C code, nobody’d ever heard of Jetbrains and “VS Code” would have sounded like a mainframe thing. I found the back corner of my brain where those memories lived, shook it vigorously, and Emacs fell out. The thing I’m now using to type the text you’re now reading. Oh, yeah; that was then.

C code in Emacs in 2021

It’s 2021. No, really.

It worked fine. I mean, no autocomplete, but there was syntax coloring and indentation and whole cubic centimeters (probably) of brain cells woke up and remembered C. Dear reader, back in the day I wrote hundreds and hundreds of thousands of lines of the stuff, and I guess it doesn’t go away. In fact, the number of syntax errors was pretty well zero because the fingers just did the right thing.

Finding: The Mac as open-source platform

It’s not that great. Expat maintainer Sebastian Pipping quite properly drop-kicked my PR because it had coding-standards violations and a memory leak, revealed by the Travis CI setup. I lazily tried to avoid learning Travis and, with Sebastian’s help, figured out the shell incantations to run the CI. Except on the Mac they only sort of worked, and in particular Clang failed to spot the memory leak.

The best way to deal with this is probably to learn enough Docker (Docker Compose, probably) to make a fake Linux environment. I was well along the path to doing that when I realized I had a real Linux environment, namely, the server sending you the HTML you are now reading.

(Except for it’s a Debian box that couldn’t do the clang-format coding-standards test but that’s OK, my Mac could manage after I used homebrew to install coreutils and moreutils and gnu-sed and various other handsome ecosystem fragments.)

I mean, I got it to go. But if I do it again, I’ll definitely wrestle Docker to the ground first. Which is irritating; this stuff should Just Work on a Mac. Without having a Homebrew dance party.


Well, yeah. We shouldn’t diss it too much, basically every useful online service you interact with is running on it. But after my -k option was added, clang found a memory leak in xmlwf. Which I tracked down and yeah, it was real, but it had also been there before my changes. And it wouldn’t be a problem in normal circumstances, until it suddenly was, and then you’d be unhappy. Which is why, in the fullness of time, most C should be replaced by Go (if you can tolerate garbage-collection latency) and Rust (if you can’t). Won’t happen in my lifetime.


Thanks to Sebastian, who was polite in the face of my repeated out-of-practice cluelessness. And hey, if you need to syntax-check huge numbers of XML files, your life just got a little easier.

Three Million Meters on e-Wheels 16 Mar 2021, 7:00 pm

This is just another round of cheerleading for e-bikes, provoked by my odometer clicking over to three thousand km. Granted, not amazing for twenty months of commuting, but not nothing. For anyone in an even marginally urban situation in reasonable health, if you don’t have one of these, you’re really missing a trick. For earlier raving about this vehicle, see here.

E-Bike by False Creek

March flowers by False Creek.

E-Bike odometer reads 2999

More interesting than 3000, and prime.

Capital cost: Noticeable but much less than anything with a motor.
Fuel cost: Damn close to zero.
Parking cost: Free.
Health cost: Negative.
Carbon loading: Trivial.
Mind-clearing ability: High.
Cargo capacity: Remarkable.
Getting you the hell out of the house during Covid: Beyond price.

There must be some gripes?

Oh yeah, I had a flat. So I bought a new tube and slipped it in and absolutely could not get the big thick fucking tire back on the fucking rim. I had to take it to a bike shop, where I discovered that my wrestling with it had ruined the new inner tube — ten bucks shot to hell — and this skinny little bike-shop woman slipped it on in no time.

Oh, and I ran out of power once and just about gave myself a coronary pumping this klunker up a not-too-steep hill.

These things are rough on chains; I’ve replaced it once and it’s getting ratty again. This is unsurprising, since the bike is so heavy and a low gear plus the e-boost pulls awfully hard, especially if you insist on going fast, which I do.

There’s no place to stash a latte if you pick one up on the way to work.

You’re stretching

Obviously. It’s fast, it’s smooth, it’s fun, it’s green, and it’s on balance cheap. I can’t imagine who wouldn’t want one.

Long Links 1 Mar 2021, 8:00 pm

Welcome to the monthly tour of long-form excellence that I, due to being semi-retired, have the time to read. You probably don’t have that kind of time but one or two of these might brighten your day anyhow.

Katie Mack, an Astrophysics professor, is one of our best science writers; her book The End of Everything is definitely on my to-read list. The American Institute of Physics has a long interview with her which I found interesting as a sort of mini-autobiography of a life in science. Touches on issues of cosmology, communication, and diversity.

Let's Not Dumb Down the History of Computer Science is the transcript of a talk by Donald Knuth, probably the most famous living Computer Scientist. He is a wise man. For the vast majority of people who don’t care in the slightest about the History of Computer Science, there’s still interest here in the consideration of how we ought to communicate about technology, and is it ever OK to do so without diving into the meat of the matter, the details of the problems that practitioners study and ideally solve?

JWZ offers They Live and the secret history of the Mozilla logo. Who is JWZ, you ask? One of the people most responsible for turning the World Wide Web from a tool for science publishing into a giant global engine for culture and business. If you read this you will learn about some colorful and too-little-known corners of geek culture. In particular, anyone who was involved with technology back then will probably find this fascinating.

The geography of cities is three-dimensional, extending far above and below their surface. It is geological and architectural and legal and financial. Covenants, Easements & Wayleaves: The Hidden Urban Interfaces Which Shape London Part 1 studies the subject, diving (literally) deep, building its story around the London Underground.

Now, this is of special interest to anyone who reads books. I became aware of Patricia Lockwood a couple of years ago, mostly due to her engaging, hyper-intense, frequently-off-color Twitter account; previously, her primary vocation was poetry. Separately, I have signed up for news from the London Review of Books, which is really excellent. Last November the LRB suggested I look at something called Malfunctioning Sex Robot by Ms Lockwood, which turned out to be an essay on the collected novels of John Updike. I read a couple of those novels in my younger years — mostly about horny suburban New Englanders if memory serves — but don’t actually care enough, in normal circumstances, to look at Updike lit-crit. But oh my goodness, Lockwood’s piece riveted my attention end-to-end, full of sentences that I thought should be displayed on museum walls and Times Square billboards. A remarkable piece of writing.

Now we have, also from Lockwood in the LRB, I hate Nadia beyond reason, which is about The Lying Life of Adults, a collection from Elena Ferrante, best known for the Neapolitan Quartet, an astonishing extended novel about women and men (mostly women) navigating the second half of the 20th Century, spiraling out from the wrong part of Naples. These are almost unbearably intense and I found myself so deep in their emotional grip that I stopped reading about three-quarters of the way through out of sheer rage at one of the protagonists, who was about to do something I thought was stupid and damaging. Anyhow, turning Lockwood loose on this predictably results in fireworks and, I’d think, significantly increases the likelihood that you’re going to end up reading either more Lockwood or more Ferrante. A sample:

So yeah, to call the Neapolitan Quartet ‘a rich portrait of a friendship’ seems insane, or like something a pod person would say. Lila is a demon of inducement, the cattle prod that drives the mild herd forward, Lenù the definition of homeostasis. The epigraph is from Faust, which I guess according to this formula is a story about two dudes hanging out: only one of them is completely red, because he is the Devil. Like that legend, it begins in a location so specific it can only be referred to as a crossroads, and then moves into the macrocosmos. It is the picture of a person standing on a single point, and inside the long deep dive of a soul into the universe. Of course, it is also a rich portrait of a friendship.

I’m going to have to go back and finish the Quartet now.

So in 1974, there was this movie called Phantom of the Paradise. It follows the Phantom of the Opera canon pretty closely: Disfigured composer, hopeless love, villainous impresario, bloody revenge. Except for The Paradise is a rock nightclub and the whole thing is drenched in rock-n-roll culture. It’s ridiculous. I loved it. Writing in Pitchfork, Phantom of the Paradise Perfectly Captures the Sinister Side of the Music Industry by Nathan Smith puzzlingly adopts the strategy of taking the movie seriously. If you’re old enough to have seen it, you’ll probably like this. For the vast majority of you who aren’t, you might want to watch it, because it’s fun.

Here’s a link to a thing that’s long: The Titles of the PhD Dissertations Defended at the Dzerzhinsky Higher School of the KGB in 1980. Pretty sure I can’t add much to that title.

I’ve loved the music of British bass wizard Jah Wobble for many years but you never seem to read much about him. Bandcamp has a feature, though: Mapping Jah Wobble’s Interdimensional Dub. Worth reading, if only because of the many links to excellent music. Trivia: His real name is John Wardle; “Jah Wobble” comes from an attempt by Sid Vicious, in a typical drunken stupor, to pronounce it. Item: Subcode, a song on Radioaxiom, a dub outing by Wobble and fellow cosmic-bassist Bill Laswell, has the phattest bassline ever recorded by anybody. Finally, here he is live, with Sinéad O'Connor. You need a subwoofer.

It’s been obvious to any thinking person as long as I’ve been an adult that sexual minorities, starting with gay people, are just being who they are. Thus the slogan, from the earliest days of the LGBTQ struggle “Born this way”. Which would suggest a genetic basis. Except for, as The new genomics of sexuality moves us beyond ‘born this way’ discusses, there doesn’t seem to be a “gay gene” or in fact any straightforwardly discoverable genetic basis. Which shouldn’t change anything at the societal level, and is further evidence that what we are is more than what our DNA says; a finding that’s been pretty obvious since the sequencing of the human genome. And yeah, gender is a lot more fluid than even us progressives thought last millennium. This has wider implications for the consideration of experiences that run in regions and families like education and poverty; the piece introduces the term “postgenomics” which I suspect will get lots of traction.

In Amazon’s Great Labor Awakening, the NYT goes deep on the current landscape. Maybe we’re looking at an inflection point? It’s not obvious, but it’s worth close attention.

Hockey Has a Gigantic-Goalie Problem is by Ken Dryden and is a good, fun, read, but you probably need to have played or watched some hockey. I got a bit annoyed by Ken not mentioning the fact that when he himself was probably the world’s best goalie, the fact that he’s a really tall dude was part of the reason. Still, good stuff.

If you care about history, you should want to read books by people who were there to watch it. Which gets very difficult as the history you care about grows more ancient. Still, History Books » Primary Sources helpfully curates a bunch of contemporary narratives of key episodes of history. Shockingly, they left out Xenophon’s Anabasis, which I liked so much I blogged about it seventeen years ago.

What’s the opposite of history? Sci-fi, of course, and if you’re looking for some of that, the Guardian offers The best recent science fiction and fantasy – review roundup. I’ve read none of these! Must fix that.

Sorry to end on a down note, but there is Very Bad Stuff happening in India, mostly hiding behind the many other unfolding global disasters such as the climate emergency, Covid, China’s mass inhumanity, and the rise of the alt-right. No wait, this is a rise-of-the-alt-right story. The political faction currently ruling India is behaving frighteningly like the Nazis in the 1930s; it sometimes feels like they’re working paragraph by paragraph from the same playbook. Nobody can say we weren’t warned; the new news here is that Big Tech, seduced by the immense potential of the billion-strong Indian market, seems to be playing along with the ethnofascists: India Targets Climate Activists With the Help of Big Tech. And it’s not just climate activists either.

Until next month!

Meet þ 22 Feb 2021, 8:00 pm

Months into the cold wet Pacific-Northwest Dark Season and our cat, a charming 4-year-old calico, has been bored and fretful. The obvious solution: Get her a kitten! Easier said than done, let me tell ya. But it’s done. The little feline fluffball’s name is Thorn, spelt “þ”. More on that below.

This announcement has been delayed because obviously one must have pictures and little þ is a challenge to photograph. But, finally…

þ the kitten

We’ve wanted a kitten for a long time. When we acquired our current cat, a one-year-old rescue who’d already had kittens, they discouraged us from adopting two at a time: “She’s fierce, and mean to other cats” they said. “If you must get another, wait a couple of years then get a male kitten so she can dominate it.”

But in these plague times, kittens are hard to come by. Lauren haunted the SPCA sites of everywhere within four hours’ drive and came up empty, empty, empty. We were seriously considering dropping thousands for a purebred — we had a Bengal once and she worked out great, but we’d really rather rescue.

Anyhow, a week ago Sunday Lauren ran across this little guy on Kijiji, which is a Canadian Craigslist kind of thing. His owner was quite concerned about the quality of home and invited applicants to write about themselves. We sent a picture of the current cat saying “This will be his big sister” and that seemed to do the trick.

Anna was her name, she had a big apartment overlooking False Creek and a badly broken ankle in a huge cast, with more surgery scheduled. And the kitten needed his next vet visit for shots and so on. So I could see why she had to let him go. Thanks, Anna!

þ the kitten

That name

By tradition, our cats have had typographical names: Bodoni, Marlowe, Rune, and the current calico is Tilde, spelt “~”. We’ve enjoyed the one-character-ness, and were searching for another (“umlaut” was considered) and then came across þ, the letter Thorn, which was common in lots of old northern European languages notably including Old and Middle English, and survives in Icelandic. It’s “th” basically; usually (but not always) the voiceless flavor as in “thirst”, not voiced as in “other”. Arabic, by the way, has two completely separate letters for these two sounds.

Since this is a proper noun it’d be more orthographically correct to use capital Thorn, “Þ”, but we improperly prefer the lower-case þ.

How it’s going

þ’s eleven weeks old as we write, which means tiny, skinny, and silly; all ears and fluff and bounce. He’s so absurdly light that ~, who’s actually a pretty small cat, seems huge, ponderous to pick up.

The pictures here are deceiving because they omit to mention his pencil-thin legs, bulgy belly, and rat-like tail. Which is OK because he’ll grow out of those; his paws are already big so I’m encouraging ~ to establish dominance now before he’s twice her size.

Fortunately, they get along fine. The mission — addressing ~’s Seasonal Affective Disorder — has been accomplished. It took a couple of days and a careful, gradual introduction, but now they play lots every day. He’s got no fear and ambushes ~ with a vigorous spring-and-pounce (he has to pounce up to reach her) and doesn’t seem to mind when she slaps him around for it. Occasionally she’ll pounce back; she’s so much heavier you can hear the air whooshing out of the kitten when she lands. It doesn’t discourage him.

No kitten was ever smart, but þ’s head seems a little less empty than average. He’ll regularly surprise ~ by sneaking around behind something where she’s not looking. You want a smart cat, get a moggie, which þ definitely is.

Those photos

This is about the blackest animal I’ve ever seen, any species; not a single white hair, nose to tail. Our house lighting, outside the kitchen, tends to the soft, and the furnishings towards dark colors. And of course he never stands still for the camera.

But this evening there he was stretched out on the stereo, to be precise on the Benchmark USB DAC, which I leave on and is thus pleasantly warm.

þ the kitten relaxing on a DAC

Thank goodness for modern cameras that do well at ISO3200, for lenses with image stabilization, for the immensely data-rich Fujifilm RAW files, and for Lightroom’s ability to add light gracefully.

By the way, þ is not a digital-audio exclusivist, here he is shortly after discovering that the thing on the record player was going round and round, plotting how to get inside and kill it.

þ the kitten and a record player

Life is a little more interesting and more cheerful around the house. Every little bit helps, this winter.

Sea Island 21 Feb 2021, 8:00 pm

Not the most original name, granted. It’s wedged into the middle of Greater Vancouver’s western oceanfront and is mostly occupied by our airport and its apparatuses. But there are a couple of decent parks, and on a greyish February day they yielded fresh air, smiles, and a harvest of photographs. This particular season of this particular year, we’ll take what we can get.

McDonald Park

It’s wedged in between the airport and the north branch of the mighty Fraser River, whose existence is a big part of the reason Vancouver exists. It’s low-key and old-school.

Pay phone (!) in McDonald Park on Sea Island, Vancouver

Lauren picked it up and listened but there was no dial tone.

Walking along a riverbank smells and feels different than the oceanfront. It wasn’t much of a day but the gloom was relieved by the joy of the many off-leash dogs getting blissfully filthy in the mud and sand.

Reflecting puddle in McDonald Park on Sea Island, Vancouver

I thought the most interesting part was the semi-artificial marsh, deliberately planted and encouraged in an effort to compensate for one or another of the many losses in salmon habitat following on infrastructural improvement.

Marsh sign in McDonald Park on Sea Island, Vancouver Cattails in McDonald Park on Sea Island, Vancouver

These pictures reinforce an argument I’ve made here before and will make again: The desirability of going for a photowalk with a modern cameraphone — they’re all excellent — and a difficult, opinionated, prime lens, in this case my trusty Samyang 135mm f/2. Neither can take any of the pictures that the other can.


It’s a Regional Park (whatever that means) stuffed in behind Vancouver’s main water treatment plant, mostly distinguished by nice views out over the Strait of Georgia (that’s the water between Vancouver and Vancouver Island) (no, Vancouver isn’t on Vancouver Island, deal with it) and the South Iona Jetty, a stone string stretching 4km into the sea; you can walk or bike out and back, which I recommend but we didn’t do today.

People walking on South Iona Jetty, Vancouver

A popular spot on February 21st, 2021.

But for me the main attraction is the views out over the strait to the islands on the other side. Today the tide was at a level that maximized the extent of the tidal flats.

Tidal flats at Iona Park, Vancouver

Behind the flats the sea-grass reminds me irresistibly of the coat of Highland cattle.

Dry sea-grass in Iona Park, Vancouver

Some vegetation flourishes in the intertidal zone; I’m sure there’s a branch of botany that understands how plant metabolisms can survive salt water, and maybe there’s something in there we could all learn from.

Tidal flat vegetation at Iona Park, Vancouver.

Let’s put on the long lens and peer across the ocean at the islands.

Looking across the Strait of Georgia from Iona Park, Vancouver

When we drove home, since Sea Island is where the airport is, all of a sudden we were on the road home from the airport, which we’ve taken so, so often over the years but not for a long time, and it felt spooky. Can’t imagine when I’ll fly again.

In these dark days, get the hell outside and soak up some air and light, already. You’ll thank yourself. Take a camera.

Recent Code 14 Feb 2021, 8:00 pm

I’ve been programming and remembering how much fun it is. With nostalgia for Old Data and thoughts on my work at AWS.


What happened was, I was talking to a friend who’s in the middle of a big project; they said “Would you be interested in bashing out a quick parser in Java?”
“Parser for what?” I asked.
“GML.”
I just about sprayed coffee on my computer. “You have got to be kidding. There hasn’t been any GML since the days of mainframes.”
“Exactly. They’re migrating the documents off the mainframe.”
“What documents?”
“High-value deeply-structured stuff. They need to turn it into simple XML, we’ll enrich it later.”

I’m semiretired and suddenly realized I hadn’t done any actual code for many months, so I named a price and they took it. It’s old-school stuff; I mean really old-school; GML is actually a basket of macros for the IBM mainframe Script typesetting system, which I used to write my undergrad CS assignments back in the freaking Seventies.

IBM GML documentation

Old-school it may be, but I’m learning cool new Java things like postmodern switch because IntelliJ keeps sneering at my Java 8 idioms and suggesting the new shiny. And I’d forgotten, really, how nice carving executable abstractions into shape feels. Also parsers are particularly fun.

And here’s a snicker. I realized that the parser needed a config file. So… JSON? YAML? XML? Except that so far, my program had exactly zero dependencies, a single little jar file and away you go; we don’ need no steenkin’ CLASSPATH. But wait… I’d just written a GML parser. So the config file is in GML, yay!

But Should I Code?

Seriously, it’s reasonable to ask that question at this stage of my career. It’s a conversation that arose at both of my last two jobs, Amazon and Google. Should your most senior engineers, the ones with decades of experience and coding triumphs and tragedies under their belts, actually invest their time in grinding out semicolons and unit tests? Or do you get more leverage out of them with mentoring, designing systems and reviewing others’ designs, code reviews, and being the bridge between businesspeople and geeks?

There’s no consensus on the subject that I’m aware of. There are people I deeply respect technically who really believe that coding is a lousy use of their time. But then anyone who’s been in this biz for long has met Architecture Astronauts who can make a hell of a design chart in OmniGraffle but are regularly really wrong.

I’m personally in the senior-engineers-should-code faction and when I was asked to evaluate someone, would always pull up the internal equivalent of the GitHub history page. I wouldn’t expect to see a lot of action there, but I’d get serious misgivings if I saw none. On the other hand, I freely admitted prejudice on the grounds that I personally can’t not code.

Except for I hadn’t for a while. Now I realize how much I missed it.


Now that I’m in talking-about-code mode, I want to mention my most recent excursion when I was at AWS. Coding there is terrific, with very decent dependency-management and code-review tools. And, most important, there’s a good chance that your code will end up being used by hundreds of thousands of customers, or processing millions of requests per second, or both. Those things will turn any geek’s crank.

I didn’t code a lot there. One little package in Snowmobile. Some bits and pieces in Step Functions; I love that service.

But I was fiddling with code in EventBridge from the month I joined (December 2014) to the last days before my exit. In particular, the stuff linked to from Events and Event Patterns in EventBridge. Words can hardly describe the intensity and fun involved in building this thing, and the thrill as customers piled on board and the flow of events through the code became mind-boggling.

The software lets you write rules and present events and have it tell you which rules matched. Simple enough, logically. It has an unusual but good performance curve and an API people seem to like. I can’t go into how it works; there’s a patent filing you could track down but that’s written in patent-ese not English so it wouldn’t help.
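I can't show the real thing, but the basic contract is easy to sketch. Assume a rule names some fields and lists, for each, the values it will accept; an event matches if every field the rule names appears in the event with one of those values. This toy version (flat string fields only, no nesting or wildcards, with made-up field names) is nothing like the actual implementation, just an illustration of the semantics:

```java
import java.util.List;
import java.util.Map;

public class RuleSketch {
    // An event matches a rule when every field the rule mentions is
    // present in the event with one of the rule's allowed values.
    // Extra event fields the rule doesn't mention are ignored.
    static boolean matches(Map<String, List<String>> rule,
                           Map<String, String> event) {
        return rule.entrySet().stream().allMatch(r ->
                event.containsKey(r.getKey()) &&
                r.getValue().contains(event.get(r.getKey())));
    }

    public static void main(String[] args) {
        var rule = Map.of("source", List.of("orders"),
                          "status", List.of("shipped", "delivered"));
        var event = Map.of("source", "orders",
                           "status", "shipped",
                           "id", "12345");
        System.out.println(matches(rule, event)); // true
    }
}
```

The interesting engineering, of course, is doing this when the rule count is huge and the event rate is huger, which is where that unusual performance curve comes in.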

Other teams started picking it up and suddenly I had collaborators. There was this one feature request that I was convinced was flatly impossible until this guy I’d never heard of just made it work. He and I were chief co-authors from that point for the next several years. I miss him.

My last couple of years at AWS I was, in between all my other work, regularly chiselling away at this code. It wasn’t the best part of my job, but I liked it. At one point it became clear that AWS was serious about upping its open-source game. So I floated a proposal that we open-source my baby. That ball was still in play when I left but I’m not holding my breath. I still had lots of what-to-do-next ideas and working on it would be a great semiretirement hobby.


If you used to like to code but don’t do it any more, I suggest you see if you still do.

Jassy Talking Points 5 Feb 2021, 8:00 pm

Following on Tuesday’s big succession announcement at Amazon, I was apparently the only human who’d been in a room with Andy Jassy more than once in recent years and was willing to talk to media. By about the fifth conversation, my talking points no longer had points; they’d worn off, leaving a well-polished gleam. So I might as well share them directly. If you’ve read any of the other articles this may sound familiar.

I should be clear that I’m not exactly close to Andy Jassy —  I’ve only ever been in the room with him at decision meetings and annual planning reviews; maybe a dozen times in my 5½ years there. Second, I’m not going to tell any secrets.

I’ll use Q&A format. Every one of these is one I got from one or more journalists.

Why Andy?

In 2006, AWS didn’t exist. In 2021 its annual run rate is $50B. What else do you need to know?

The obvious choices for CEO were Andy Jassy and Jeff Wilke. Without inside information, I suspect that the decision was made a few months back and explains Wilke’s exit.

I think it’s the right call. Andy, in my opinion, is an outstanding executive. AWS was the best-managed place I worked in my 40-year career, including places where I was the CEO.

He’ll be less aggressive than Jeff, right?

I got this question a few times and it surprised me, because I don’t think so at all.

I remember one annual planning meeting where I was a senior member of a (large) AWS product group; we presented our six-pager. The usual sort of talk ensued, Andy’s team challenging us on this technical or that business issue. As I’d come to expect in document-driven discussions, the quality was excellent. Eventually Andy spoke up. He had a few things to say but this is what I remember: “Are you guys thinking big enough? Could you go faster? If we gave you double the resources, what could you do?”

That’s just anecdote. But here’s a number. It’s Amazon Web Services, right? How many services, then? Somewhere around 200 at this point and the rate of new-service announcements isn’t slowing down. There’s actually a faction among customers and analysts who argue that there are too many; that it’s hard for customers to understand and choose. Except for, under Andy’s leadership the strategy has been simple: Identify every IT problem an organization can have and offer a service that solves it. Are you really going to disagree in the face of that zero-to-$50B trajectory? Aggressive enough?

What’s Jeff’s legacy?

Jeff founded Amazon in 1994 and now it’s arguably the world’s most powerful company. Investors think it’s worth $1.7T and it employs over a million people. Whether you love or loathe capitalism, you have to be impressed at those numbers.

On the other hand, as recently as ten years ago the Big-Tech companies were admired and their leaders hero-worshipped. Today, a substantial proportion of the population is disaffected with Big Tech in general and Amazon in particular. Even people who buy a lot of stuff from Amazon tell me they feel bad about it. Many feel raw fear. Jeff Bezos is definitely part of that problem.

So his legacy has to include both the corporate success and the sectoral disaffection.

Isn’t it cool that Jeff is free to explore space?

Spare me. I’m as space-crazed as the next geeky nerd but in the near-to-medium future we need to be paying close attention to this planet’s problems if we want our children to have a place to live. In terms of economics and health and the environment, space is a distraction. I was super happy to see Jeff’s letter feature the Climate Pledge prominently.

By the way, anyone noticed Jeff’s ex-wife is kicking his butt in the (difficult) practice of giving money away constructively? I hope he catches up.

What’s Andy like?

OK, this is boring, but: customer-obsessed.

Data point: Before the first Andy meeting where I’d helped write the six-pager and was expected to make the case, I got advice: “Make sure anything you say is backed by customer data.” Fortunately we’d been running a popular beta on the new service under discussion, so I had loads of anecdotes from household-name customers. So when a question came up I could say things like “Well, CustomerA says the big upside is X but CustomerB says we need to beef up Y.” The advice was good: My recollection is that the decisions went the way we’d wanted them to.

Data point: It’s the middle of the night and some AWS service is having a nasty outage; maybe a big rainstorm took out a bridge with three telcos’ fibre on it. There’ll be a late-night communication chain on what we can say to customers and when we can say it. Andy will be on that chain and the service team’s representative (back then, sometimes me) better provide regular updates.

I could offer more if I could tell secrets. But customer focus is genuinely the first thing that comes to mind.

Now here’s something to watch out for. AWS claims a million-plus customers and, while I don’t know the numbers, it’d be reasonable to think that a significant chunk of the revenue comes from a smaller number of big customers. Thus, anyone who’s doing significant business has a few AWS people they’ve gotten to know pretty well and whose job is to understand their problems.

Amazon has over 150 million Prime members; not just customers, paying members. So I think Andy’s customer-obsession degree of difficulty is about to go way up.

Will Andy change the Amazon culture?

I doubt it. In my opinion, the vision of Jeff as singular day-to-day supergenius mastermind is just wrong. Nobody can do that at Amazon’s scale. The real achievement is building and sustaining highly effective culture and process. Around hiring and promotion, around product management, around reporting and decision-making. You can’t grow and innovate at that scale unless you can delegate strategy and tactics to lightly-supervised groups with high confidence that they’ll be right a lot.

From the point of view of Amazon’s leadership and its investors, its culture is working just fine. Why change anything? But…

Is Amazon going to become more humane?

I doubt it. There just isn’t a way to employ a million people and sell to a hundred million others and offer a human touch.

So I don’t expect Amazon to stop bullying partners or having high injury rates at warehouses or ferociously resisting unionization.

Unless, of course, they are forced to do these things by a combination of legislation, regulation, and litigation. Which leads to:

What are the big challenges facing Amazon?

Now that Andy’s been promoted, he’s got a new responsibility: Testifying to Congress. It’s not going to be fun. And for my money, that’s going to be the biggest change to the landscape that Amazon plays on.

Quoting from the Leadership Principles: “As we do new things, we accept that we may be misunderstood for long periods of time.” Is it just me or does that sound like a symptom of extreme arrogance? “We’re so much smarter than everyone else that normal people can’t even understand why what we’re doing is smart.”

Now, this might be plausible when you’re a scrappy Seattle startup doing things nobody had ever previously thought of. When you’re the world’s most visible and most powerful company, you neither can nor should want to be misunderstood at scale. Maybe edit that LP a bit?

I expect Amazon to experience severe friction on multiple legal fronts. First, anti-monopoly. Note that 2020’s upsurge of antitrust litigation was not only bipartisan but in some cases Republican-led.

Regular readers here know that I’m not neutral at all about this: I enthusiastically support aggressive anti-monopoly action against not just Amazon and not just the Big-tech titans and not just in the USA, but across the economy and across the globe. I wouldn’t start with Amazon, if I were running the show, but I’d get there pretty quick.

(I’m pretty sure promoting Andy is a smart move and one he’s earned. But I’m unhappy because it decreases the likelihood that Amazon will spin off AWS voluntarily, which I think would be unambiguously good.)

A second legal front is probably going to fall out of the continuing uproar among sellers making their living on the Amazon platform. Elizabeth Warren argued powerfully in March 2019 that certain large tech businesses need to be designated as “platform utilities” and strictly regulated, most obviously by forbidding companies from both operating a marketplace and selling on it.

Finally, of course, we can expect the labor landscape to change. America lags the rest of the rich world shamefully in the imbalance of power and wealth between Capital and Labor; shifting this balance has to be pretty high on any progressive agenda.

How this goes depends on the politics playing out in Washington and then the next couple of election cycles, but my perception is that the Overton Window has moved and that the Covid interregnum may mark the high point of the fifty years of regressive politics kicked off by the Reagan-Thatcher neoliberal consensus of the Seventies. Definitely watch this space.

Hey Andy, here’s some sincere advice: Get to know Congresswoman Jayapal soonest, and give a careful listen to what she says. She’s really smart and you might agree on more than you suspect.

Andy Jassy is a terrific executive and I respect him a whole lot. It’ll be fascinating to watch Amazon navigate this new landscape.

Long Links 1 Feb 2021, 8:00 pm

Welcome to the Long Links look-back at January 2021. Once again I assemble long-form pieces that I’m fortunate enough to have time for due to my lightly-employed condition. Probably few have time (or inclination) to plow through all this stuff, but one or two might reward your time.

I can’t imagine anyone reasonably literate not having enjoyed reading John le Carré, and I really enjoyed My Dinners with le Carré. It seems like he was a very decent and very impressive human being. I neglected to read a few of his later books but I’ll go back and do so. My favorite under-appreciated le Carré is The Little Drummer Girl; what’s yours?

For a variety of reasons I’ve been studying the larger issues around content moderation. Mike Masnick offers Masnick's Impossibility Theorem: Content Moderation At Scale Is Impossible To Do Well. It isn’t that long, but it’s important and his argument is powerful. He’s not arguing for giving up on moderation, either.

State of CSS Report 2020, by Raphaël Benitte and Sacha Greif, is delightful. I have a vexed relationship with CSS. I admire what it can do, and I’d like to do some of those things in the space that you’re now reading, but damn it’s hard; the days when you could View-Source and dope it out are way past us. Maybe one of these years I’ll go to school full-time for a few months, which I think you need to do these days to get on top of CSS.

Just possibly you enjoy sci-fi and haven’t read any Gene Wolfe. Oh my goodness; stop whatever it is you’re doing and go pick up The Devil in a Forest or The Fifth Head of Cerberus or The Book of the New Sun. [Those are Amazon Affiliate links, careful.] Gene Wolfe Turned Science Fiction Into High Art gives those of us who already know about Mr Wolfe a quick tour through his life, pretty prosaic to be honest, and his slow, slow path to the very top of his genre’s heap. I hadn’t known that the Fifth Head was a very early work — I’ve been known to argue that it may be the finest sci-fi novel ever written.

George Orwell famously wrote “The object of power is power” and he was right, but money is a pretty powerful object too; so powerful that, all around the world, people exercise power to enrich themselves in ways entirely contemptuous of legality and morality. Thus Countering Global Kleptocracy: A New US Strategy for Fighting Authoritarian Corruption is highly relevant. It offers specific recommendations for the incoming US administration. I think this is important because not only is crushing corruption good for the planet’s civic health, but it’s good politics too. Some (not all) of the targets are soft, but all are worthy of determined attack.

Look, at this point in history I’m not going to defend my habit of watching the NFL. I tried to give it up but fell off the wagon, partly because I so loved playing football in my youth. At certain points, certain humans can do things that are beautiful and shocking and just unimaginably excellent, and just at this moment Kansas City quarterback Patrick Mahomes is such a person. What Makes Patrick Mahomes So Great is an extended statistics-backed appreciation of why.

Another thing I’m not going to particularly defend is my audiophilia. One of the underappreciated benefits of being a devotee of good sound is reading the High-End HiFi magazines, notably Stereophile and The Absolute Sound. Yes, the front part of the magazine enthuses over overpriced shiny boxes, but you know what? It turns out audiophiles tend to have excellent taste in music, and I’ve discovered many of my favorites in the back pages of those magazines. Every year, Stereophile offers a recommendation roll-up: Records to Die For 2021 is the most recent. A must-read for music lovers.

You hear it on the Internet: “Pictures or it didn’t happen!”. How about “Pictures and it didn’t happen!” Because images are just a bunch of bits which can be and regularly are faked. On top of which, certain “enhancement” techniques routinely applied by professionals come just this side (maybe) of fakery. Anyhow, using technology to automate a reliable provenance chain for images has to be a good thing. Adobe and certain partners have been working on this for a while and results are starting to decloak: The Content Authenticity Initiative shows its first real-world samples of CAI-attributed images. Check it out if you care about pictures and also the truth.

Here is a treat. I’m With Her is Sara Watkins (violin, guitar and ukulele), Sarah Jarosz (banjo, mandolin, octave mandolin and guitar), and Aoife O'Donovan (keyboard and guitar), all of whom are fine, successful musicians. I’m With Her — Live at House of Blues (from May 2019) is eighty minutes of music not one second of which is dispensable. There were half a dozen occasions when I found myself tearing up or closing my eyes to listen harder. The songs, the singing, and the playing are beyond awesome. A few of the mandolin/fiddle breaks may overheat your speakers. I can’t wait for there to be concerts again.

'Our souls are dead': how I survived a Chinese 're-education' camp for Uighurs is by Gulbahar Haitiwaji. Never forget this is happening. Never excuse any official representative or unofficial lackey of China’s barbaric regime. There can’t be enough reminders so here’s another. Read it and be angry.

I’ve been blogging since 2003 and the single piece I’m proudest of is Just Too Efficient from May last year. Efficiency is the holy grail at all the institutions in the world where they train anyone to manage anything, and it’s just gone too far. Jeremy Schmall offers From Dayton, Ohio to Donald Trump: Why our obsession with efficiency is incompatible with democracy, which is mining the same ground as my piece, but takes a very different and more personal angle. When every store is a mall and every vendor is a global monopoly we’re living in a bad place; getting there involves a successive reduction in the number of choices we have available as citizens. It leads, as Schmall argues, to consequences which include Donald Trump.

Sorry, I can’t let a month go by without taking a whack at the slow-motion catastrophe that is cryptocurrency in 2021. I’m a little nervous about linking to The Bit Short: Inside Crypto’s Doomsday Machine because its author chooses to stay anonymous. But the reportage smells like truth to me.

The title Once & Future Bride of the Sea refers, not to what you think it might, but to Jaffa, an interesting city on Israel’s Mediterranean coastline. It’s on YouTube, a half-hour walking tour offered by Sami Abou Shehadeh, a Palestinian Israeli citizen and Jaffa city councillor, in Hebrew (with subtitles). Yes, of course it’s drenched in the Israeli/Palestinian trail of tears; how could it be otherwise? But Sami’s a charming host and it’s a scenic place and I think might expand many minds.

Of all the legal hammers that need to be applied to reform Big Tech, I think the highest priority should be given to beating up Google and Facebook to unfuck the advertising business and give 21st-century journalism a fighting chance. Not convinced? Read Behind a Secret Deal Between Google and Facebook and I suspect you’ll understand why I’m so dogged on this subject.

I bet you didn’t expect to find a lengthy release from the Trump White House in here. I refer to Statement from the Press Secretary Regarding Executive Grants of Clemency, which describes all of the last-minute pardons granted by That Asswipe as he shuffled off to Florida. I don’t know why I popped it open but I found it oddly compelling and read the whole thing. At some level, it casts a useful light on American dysfunction from a novel direction.

Researchers release massive Twitter dataset of voter fraud claims is not actually a long piece, but it’s about a massive database. I think the fact that the Republicans were able to mount and sustain an entirely-false legend that the 2020 election was stolen, and get literally tens of millions of Americans to believe it, illuminates a central problem of modern civic society: How do you promote truth and fight falsehood? Understanding how this happened is important. Which we don’t yet, but here’s the data you need to work on the problem.

I’m not sure why this was in The Financial Times. In California, a journey to the end of the road is a lyrical, beautifully-photographed visit to California’s Salton Sea, which most would regard as something of a hellscape. Now it’s a place where you can live for free and not starve. If you don’t mind the landscape. Compelling.

There’s this guy called Will Wilkinson who’s a fine writer and tries to be an American Centrist, a tough row to hoe these days. He’s politically well to my right, but I tend to read whatever he writes because it’s always smart and good. Anyhow, he used to be a staffer at the Niskanen Center, which tries to be institutionally centrist. Until, last month, he tweeted “If Biden really wanted unity he’d lynch Mike Pence.” Which is cruel and tasteless and funny and got him fired. It didn’t take him long to launch his own Substack (of course), with Undefined Cancel Game. Did I mention good smart writing? This is that. Very good and very smart.

Another of my odd habits, of which I’m not in the slightest bit ashamed, is a weakness for surf-guitar instrumentals. I’m not the only one worshipping this flame, thank goodness; check out Top 10 Modern Bands Keeping Surf Rock Alive And Well In 2019. This is happy, happy music and we can all use that.

Speaking of things that are good and happy, let’s close with The Women of Wikipedia Are Writing Themselves Into History. I’m in awe of these women and of what they’ve accomplished. They deserve everyone’s support.

Late Plague Winter 24 Jan 2021, 8:00 pm

The first time I posted the first crocuses on this blog was in 2003. And then a lot of times since then. It’s therapeutic; I may be a Canadian and happy about it, but damn I hate winter. Here at Canada’s bottom left corner we don’t get the brutal cold but also we don’t get much winter light, so sun on petals is a morale-booster. And do we ever need one this particular winter.

January crocus

The little guy above is awfully cute, sneaking into the sun. The flowers are early this year; they usually don’t show till sometime in February. Welcome!

What a shitty winter. At least that asshole isn’t in my face all over the Internet. I look forward to never writing and rarely reading about him again. The problem remains of how to run a democracy when one of its political wings is entirely oblivious to both the value of truth and the essentially damaging nature of falsehood; good luck with that.

Then there’s Covid. I guess we’ve started to win, but it’s a long haul. It looks like, if the vaccine supplies arrive on schedule, I should get my first dose in May or so. Like everyone else, I’m feeling stir-crazy and music-starved and grumpy at my fellow citizens and generally inclined to be less charitable than I should be.

Fortunately spring beckons, albeit distantly

January crocuses

Speaking of being music-starved, it looks like, with any luck, there might be concerts this fall. Given that almost every musician in the world is both broke and stir-crazy, I expect there to be a lot of concerts. Let’s make a resolution to watch the announcements and open our wallets and get up off our sofas and into real rooms with real music played by real people before the year’s over, a lot more than once.

This last crocus pic has a story. I grabbed the camera, screwed on the trusty old 35mm/F1.4, and got down on my knees. Only something was wrong: it wouldn’t let me change the aperture. I worried that my lovely old lens might be wearing out, then (after I’d taken a few shots anyhow) noticed that I’d accidentally tripped the Fujifilm’s “Auto Mode”, which is very opinionated. It does not care to hear your opinion about aperture or recording mode, and captures JPG-only; no RAW for you.

January crocuses, the camera’s version not mine.

That’s not a picture I’d take, but I’m not gonna say the camera got it all wrong.

Hang in, we’ll get through this; most of us anyhow.

When You Know 20 Jan 2021, 8:00 pm

I’m a person who knows a lot about how computers and software work, is generally curious, and reads fast. I’ve been wrong about lots of things over the years. But there have been a few times when a combination of technology-literacy and just paying attention to the world has made me 100% sure that I was seeing something coming that many others weren’t. Here are a few of those stories. The reason I’m telling them is that I’m in another of those moments, seeing something obvious that not enough other people have, and I want to offer credentials before I share it. Also, some of the stories are entertaining.

Unix in the Eighties

As an undergrad, I used Unix V6 on a PDP-11/34 back in 1979, but when I graduated the dinosaurs that stomped the earth had labels like “VMS” and “MVS” painted on the side, and mega millions of investment and marketing behind them. (If you don’t know what those labels stood for, that’s OK). But from time to time I got my hands on a Unix command line and kept thinking “This is just better” and then one time I wrote code that did networking and understood the power of fork and exec.

So I started going around telling all these IT management types that Unix was better than what they were using, and got blank looks, and when I got really insistent was eventually told to shut up. At that point I was young enough that I was convinced that, well, maybe I was just crazy; after all, these were guys who’d been doing IT for decades. The rest is history.

Java in the Nineties

I was a C and FORTRAN guy, then in 1996 I was helping design XML and told myself that it’d be cool if XML shipped with a working parser. In 1997 Java was The New Hotness so I decided to learn it and use it. I actually used Microsoft’s Visual J++ which, for the time, was pretty great.

Eventually I published Lark, the world’s first XML parser, then a couple of years later gave up on it because Microsoft and Sun both had their own parsers and who was I to compete with titans? I regret that because Lark was faster and had a nice API and if I’d maintained it, it’d probably be popular to this day.

By the time I’d done Lark, I’d seen the advantages of a programming language that came with a good standard library, ran on a VM, had garbage collection, and had a reasonably clean, minimal design.

So I started telling everyone I knew that they should do their next project in Java. Everyone I knew blew me off and said that Java was too slow compared to C++, had a primitive GUI compared to Visual Basic, didn’t have government buy-in compared to Ada, didn’t have a mainframe story compared to PL/1, or didn’t let you go fast and loose compared to Perl.

By this time my ego had expanded and I didn’t shut up and I think I may have actually changed a few people’s minds.

The Web in the Nineties

This was the one that was most irritating. By the late Nineties, the Web had expanded out of the geek-enthusiast space and Open Text, a company I co-founded, had done a nice IPO based on Web search and a Web document-management UI. I remember like yesterday a presentation at one of the early Web meetups by an engineering lead for a (then) big computer company. She said “This is so great. Our interfaces used to have to be full of sliders and dials and widgets or people would say we were amateurs. But now with the Web, there’s so much less you can do, but the important things are easier, and that’s what people want!” She was right.

Between 1996 and 1999 I was an indie consultant, trading off my ill-gotten fame as a Web Search pioneer and XML co-inventor. Everyone who hired me got told that they should damn well invest in Web delivery and stop investing in anything else. I heard “But native GUI is a much richer environment” and “The network will never be fast enough” and “Yeah, that stuff is just toys for kids, we’re serious Enterprise Analysts here.”

The change came, as these changes always do, maddeningly slow then frighteningly fast. My recollection is that the advent of Web-based package tracking was very influential. Even the most non-technical business person could glance at it and realize “All I have to do is paste in a tracking number, I didn’t have to install any software, and there’s my answer.”

WS-* in the Naughties

Once the Web had become everyone’s favorite GUI, people started to notice that it was pretty easy to set up a network-facing API using HTTP. (And at that time XML, which made things harder, but it was still pretty easy.) Way easier than the incumbent technologies like CORBA and DCOM and so on. Meanwhile, Roy Fielding was working on his doctoral dissertation which established the key concepts of REST.

For some reason that I’ve never understood, IBM and Microsoft chose this time to launch a land-grab. In a really annoying and unprincipled way, they rallied behind the banner of “XML Web Services” and jammed a huge number of mammoth, stupidly-complex “WS-*” specifications through compliant standards organizations. Even the most foundational of these, for example WSDL, was deeply broken.

A few people (including me) thought the Web didn’t need WS-layering, and that what we would come to call REST was astonishingly simpler and already known to work. We recoiled in horror and became avid anti-WS-* campaigners. Of all the things in my life that I’ve found myself against, WS-* was the softest target, because it basically just didn’t work very well. I remember with glee publishing blog pieces like WS-Pagecount and (especially) WS-Stardate 2005.10. Because the best way to attack a soft target is to make people laugh at it.

WS-* actually survives, last time I checked, as Microsoft WCF. But nobody cares.

Android in 2010

When the iPhone launched in 2007, I was working at Sun, i.e. Java World Headquarters. Like everyone else, I was captivated by the notion of a pocketable general-purpose computer with a built-in GPS and phone and camera and so on. Unlike most, I was appalled by the App Store axiom that I could write code for the thing, but I couldn’t publish it unless Apple said I could. Also, I was (and remain) not crazy about Objective C.

So when Android arrived on the scene, it got my attention. Among other things, you programmed it with what felt like pretty ordinary mainstream Java, which I and a lot of people already knew. And if I didn’t want to use the Google store, I could post my app on my website and anyone could use it.

So in 2008, I wrote the Android Diary series, describing my experiences in getting an Android phone and writing my first app, which was so much fun. I discovered that the development environment, while immature, was basically clean and well-designed, and accessible instantly to anyone who knew Java.

People laughed at me, saying the iPhones were faster (true), had better UI design (true), and were uncatchably-far ahead, measured by unit sales (not true). Eventually Oracle bought Sun and I left and I got a nice job in the Android group at Google. When I joined, there were roughly ten thousand Android devices being sold per day. When I left, it was over a million.

What I see now: Run screaming from Bitcoin

It is completely unambiguously obvious to me that Bitcoin, a brilliant achievement technically, is functioning as a Ponzi scheme, siphoning money from the pockets of rubes and into those of exchange insiders and China-based miners. I’m less alone in this position than I was in some of those others; I think a high proportion of tech insiders know perfectly well that this is a looming financial disaster.

I am not going to reiterate all the arguments as to why this is the case. If you want to find out, follow Amy Castor.

OK, let me add one additional argument for why Bitcoin is not and can never be “real” money. You know what real money is? Money you can use to pay your taxes. The USA, in 2018, had about 140 million taxpayers. Suppose 10% of them wanted to use Bitcoin to pay their taxes. Let’s say the global Bitcoin network can process ten transactions per second (it can’t, it’s slower than that). By my arithmetic, at 10/second it would take the whole network, running flat out, not doing anything else, over a month to process those payments and refunds. This is just Federal Income Tax.
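The back-of-the-envelope arithmetic, using the assumed numbers above (140 million taxpayers, 10% of them on Bitcoin, one payment plus one refund each, an optimistic 10 transactions/second), works out like this:

```java
// Back-of-the-envelope: how long would the whole Bitcoin network
// need, running flat out, to handle 10% of US federal tax payments?
public class TaxArithmetic {
    public static void main(String[] args) {
        long taxpayers = 140_000_000L;
        long transactions = taxpayers / 10 * 2; // 10% of taxpayers, payment + refund
        double seconds = transactions / 10.0;   // optimistic 10 tx/second
        double days = seconds / 86_400;         // 86,400 seconds per day
        System.out.printf("%.0f days%n", days); // about 32 days
    }
}
```

And that’s with a throughput number the real network can’t actually sustain; at a realistic 5/second, double it.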

Don’t get into Bitcoin. If you’re in, get your money out while you still can.

Trust me on this.

Long Links 1 Jan 2021, 8:00 pm

Happy new year! Welcome to the first Long Links of 2021; this is a monthly curation of long-form pieces that I, being semiretired, have time to read. Most people reading this probably have less time, but perhaps one or two of these will add value even for a busy person.

The last month of the year is an invitation to best-of pieces. Music is probably my chief recreation, so I’m a sucker for this kind of thing. In The New Yorker, Amanda Petrusich’s The Best Music of 2020 showed me a couple of musical paths I hadn’t been aware of. Is it weird that every single best-music-of-the-year piece featured 79-year-old Bob Dylan’s Rough and Rowdy Ways? Then over at Discogs there’s The Most Popular Live Albums of 2020; close to my heart, since most of my very favorite recordings over the years have been live. Of particular note: the “Saucerful of Secrets” show that Pink Floyd drummer Nick Mason took on the road in 2019, which I enjoyed.

Erstwhile NFL quarterback Colin Kaepernick is now a publisher. He labels his flavor of activism “Abolitionism” and it’s strong stuff. This is a big collection of big pieces and even I haven’t made time for all of it. Abolition for the People is a good place to start.

The EU’s Digital Markets Act: There Is A Lot To Like, but Room for Improvement by Cory Doctorow and Christoph Schmon, is what it says on the label. Full of useful detail on what they’re up to over there. I read through the list of proposed reforms, reacting with “well, of course” to most, and then as the list got longer and longer and longer I realized how screwed-up the Internet economy is right now and how urgent the case is for radical reform.

Speaking of screwed-up, So *that's* how Breitbart is still making money, on the BRANDED substack, offers a tour of the grimy underbelly of how Internet ads are sold and how unscrupulous operators can arrange to sell ads while making it very hard for buyers to ascertain where their money is going. More reasons, were any needed, to think that the system needs blowing up and rebuilding from scratch.

In 1982, I wrote a hundred thousand lines of COBOL code. That’s not nearly as impressive as it sounds. First of all, COBOL is verbose. Second, this was the I/O module of a big airport-automation system with huge volumes of cut-n-paste code to provide a file abstraction over lots of different sources. In the rearview, I don’t even hate COBOL; there are things it’s good at. The Code That Controls Your Money, by Clive Thompson, is a highly readable tour through the history and culture of the trusty old language. The fact is that COBOL is still the incumbent technology in much of the finance sector, and Thompson has lots of smart things to say about the business effects, some of which surprised me.

Moving from old programming languages to a new one (Swift), here is a long entertaining Twitter thread, which begins “Alright folks, gather round and let me tell you the story of (almost) the biggest engineering disaster I’ve ever had the misfortune of being involved in. It’s a tale of politics, architecture and the sunk cost fallacy [I’m drinking an Aberlour Cask Strength Single Malt Scotch].” It’s a highly instructive tale of how Uber got themselves into a big app-development crisis and then back out. Since I generally loathe the whole practice of dressing up labor-arbitrage operations as technology plays, and specifically loathe Uber, I think society might have been better served if they’d failed. But you have to have sympathy with the dev team.

The ergodicity problem in economics by Ole Peters, in Nature Physics, forsooth, is an important piece of work. It argues that current economics math is mostly broken because it makes entirely unjustified (and unjustifiable) assumptions of equilibrium. I did not take the time to stop and convince myself that I understood each equation, nor do I think I fully understand Peters’ alternative approach, but I found his criticism of the status quo compelling.

At the Columbia Journalism Review, Why Democrats lose on social media while Republicans lie and win big is subtitled By dominating Facebook, the world’s largest media platform, the GOP demonizes the Green New Deal. The Green New Deal, which should actually be a pretty easy program to sell, politically, now polls horribly when its name is mentioned. This piece dives into why that is and ends with a plea for progressives to focus on viral storytelling techniques. As a blogger, how can I disagree?

Let’s take an astrophysical excursion. Regular readers have probably noticed that I’m fascinated by the Dark-Matter controversy. Among the most visible of the skeptics, and definitely among the most eloquent, is Stacy McGaugh, a prof at Case Western. Big Trouble in a Deep Void, on McGaugh’s blog but by three guest authors, takes an eye-opening look at the large-scale structure of the universe — there’s good evidence that our galaxy inhabits a billion-light-year-across volume with much lower matter density than the universe’s average; thus the “Deep Void”. The standard Astrophysical model, ΛCDM, says that can’t happen. The discussion quickly gravitates (snicker) to the Dark-Matter-vs-MOND controversy. I think most lovers of science have to enjoy situations where the best available theories totally don’t explain the best available observations because that means discoveries are there to be made. I enjoyed the hell out of this one.

Noah Smith’s Techno-optimism for the 2020s has been pretty widely linked-to and isn’t that long, so you may already have read it. I hadn’t thought about the larger-scale subject, but the essay makes some strong points. In particular, and I quote: “But now, for the first time since the 60s, technology is going to make energy cheaper.”

Ladies, gentlemen, and others, Section 230 is very important. As I write this, repealing it has become a Republican priority because it makes much of Big Tech possible, and conservatives hate technology because it occasionally reflects progressive values. I’m not going to explain what Section 230 is or what it does, because Sue Halpern does so very well in How Joe Biden Could Help Internet Companies Moderate Harmful Content. The headline is lousy; it should be something like “The pros and cons of Section 230 and some plausible things to do to improve the situation.” Few Internet-related subjects are more important.

Few subjects, generally, are more important than that of truth and lies and how our widespread failure to discern between them is driving most of our important public pathologies. Jonathan Rauch, back in 2018 (but I didn’t notice it then) refers to this as “the problem of social epistemology” and in The Constitution of Knowledge, has really a lot of smart things to say on the subject. To start with: It’s not that, among educated people, we have huge genuine disagreements as to what the truth is; it’s that 21st-century conservatives have discerned that if they entirely abandon any regard for truth, they can score valuable political points by weaponizing falsehood at Internet scale. This piece is big and smart and eloquent. I quote: “There is nothing new about disinformation. Unlike ordinary lies and propaganda, which try to make you believe something, disinformation tries to make you disbelieve everything.”

Speaking of big lies, among the biggest are those that are driving the current Bitcoin bubble. In that menagerie of whoppers, among the biggest are those surrounding Tether. Patrick McKenzie offers Tether: The Story So Far which tells the awful truth (really, it is) in entertaining detail. If you haven’t already dumped your Bitcoin, you will after reading this.

Not all the big lies are about money. For example, QAnon. Reed Berkowitz’s A Game Designer’s Analysis Of QAnon tries, not to explain QAnon because who could, but to examine some of the dynamics of how and why it survives and infests so many minds. Useful.

The antidote to lies should be facts. But that doesn’t seem to be working well. Given that, Elizabeth Kolbert’s Why Facts Don’t Change Our Minds is obviously highly material to our current predicament. I’m not going to try to summarize because it respects the complexity of the subject and doesn’t hurry up in an effort to produce a sound bite. It’s full of quantitative social science; one researcher is quoted as saying “ways of thinking that now seem self-destructive must at some point have been adaptive.” Yep.

Look, I acknowledge that expressions of concern about Facebook and its ilk are not exactly new. But Adrienne LaFrance’s Facebook is a Doomsday Machine provides an excellent overview of the problem. I quote: “The social web is doing exactly what it was built for.” One really refreshing notion offered here is that Facebook, and social media generally, are just too freaking big. I can think of things that could be done about that.

Eric Alterman has been an intelligent, acerbic voice of the Left (and a really good rock-music critic) for many many years. During the last twenty-five of those years he’s been the media critic for The Nation. Look Beyond the Media Frenzy and Focus on the Fundamentals is his farewell column and it says things we need to be listening to. He’s not neutral or balanced at all: “If we look beneath the surface of our elections, we see a culture of plutocracy that has enabled the creation of an autocracy based on a foundation of purposeful dishonesty”.

I’ve long been interested in the economics of Internet publishing, and so should anyone who’s interested in the quality of our intellectual discourse as a civilization. Talking Points Memo is a twenty-year-old progressive-political blog that has morphed into a viable company and stayed alive, which is more than you can say about most such startups. Part of their 20th-anniversary celebration, The Business of TPM looks at how and why they survived while so many others didn’t.

Now for something much lighter-hearted: A fairy tale! No really, with a Cinderella-meets-Game-of-Thrones flavor: Stepsister. The way I found this is by impulse-grabbing a recent issue of Fantasy and Science Fiction from the library, then reading and recommending a nice story by Leah Cypess, which led me to her website, and thence to this story. Enjoy!

Speaking of stories, someone is writing a real-time alternate history of the future on Twitter, called Real-Time WW3 from 2033. It’s a little awkward: you have to start by scrolling all the way to the bottom, then work your way back up. Some parts of the story fail my suspension-of-disbelief test, but a whole lot of it is clever, and I find the style vivid.

Late-2020 EV Charging in Canada 30 Dec 2020, 8:00 pm

I haven’t seen my 90-year-old mother since January. I guess by mid-year both she and I are likely to be vaccinated so maybe I could go visit. I’d rather drive than fly. What with Covid I’ve been cooped up so long I could scream, so there are few things I’d rather do than get out on the highway. As a displacement activity I’ve been working out how I could get the electric Jaguar 1734 kilometres from Vancouver to Saskatchewan to see her. Thus this quick survey of the state of the infrastructure in Western Canada, and also trip-planning tools. Some of this info might be useful elsewhere than in Western Canada.


For this to make sense, you have to realize that while Canada looks like a big tall country on the map, it’s actually a short wide country, mostly cuddled up against the USA, nine thousand kilometres wide and with most people living 500km or less from the border. So for purposes of charging cars, if you have good coverage of Route 1, the Trans-Canada Highway, you’ve solved a big piece of the problem. My drive from Vancouver to Regina, Saskatchewan is along that road all the way.

Back when I got the Jag in January 2019, it’d have been generous to call the coverage spotty. The government of BC, the westernmost province, had scattered fast chargers here and there around the highways, but they were only 50kW (just barely “fast”) and the reliability was poor. Once you got out of BC and onto the Prairies, the story got worse fast.

Now there are two outfits trying to build out good cross-country coverage: Petro-Canada, the gas-station brand of Suncor, a biggish oil company, and Electrify Canada, an outgrowth of Electrify America, the subsidiary Volkswagen launched in 2016 in the wake of its lying-about-emissions scandal to wave the green flag. The Electrify Canada chargers are located at Canadian Tire, a ubiquitous Canadian hardware big-box that specializes in automotive stuff.

Both networks have mobile apps to facilitate charging, and both charge a flat 27¢/minute. That’s a lot more expensive than the electricity I soak up from my Level-2 charger at home, but still plenty cheaper than gas.
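Per-minute billing converts to an effective per-kWh price once you know your car’s sustained charging rate. A rough sketch; the 85 kW figure is what my I-Pace settles at, and the home electricity price is an assumption for illustration:

```python
# Convert the networks' flat 27¢/minute into an effective price per
# kWh, given a sustained charging rate, and compare with home charging.
RATE_PER_MIN = 0.27      # CAD, both Petro-Canada and Electrify Canada
SUSTAINED_KW = 85        # roughly what the I-Pace settles at
HOME_PER_KWH = 0.10      # CAD, assumed residential rate

fast_per_kwh = RATE_PER_MIN * 60 / SUSTAINED_KW
print(f"fast charging ≈ {fast_per_kwh:.2f} CAD/kWh vs {HOME_PER_KWH:.2f} CAD/kWh at home")
```

A car that charges faster pays less per kWh under this scheme, which is a mild incentive to buy the fast-charging cars.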

It’s at least somewhat true that at this point both of these operations are aspirational, more of an attempt to make a point than make a buck. But EV sales are ramping up and will ramp faster, so I think it’ll prove to have been a smart move.

Both operate modern chargers advertised as offering 200-350kW of juice. Some of the most recent cars, like the Porsche Taycan, are said to be able to make good use of that power and charge amazingly fast. My Jaguar I-Pace can only soak up 100kW, and not for long; it pretty quickly starts backing off and settles somewhere in the 85kW range.

In an online I-Pace discussion group one guy said that at Petro-Canada, he went from 20% to 80% in 45 minutes, which I’d say is adequate, while another one said it took 52 minutes to get from 50% to 94%. These reports are consistent, because as far as I know all electric car charging slows down dramatically at about the 80% point. So it’s considered polite and efficient to charge up to 80% and then move along.
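Those reports are roughly what you’d predict. A sanity-check sketch, assuming a nominal 90 kWh I-Pace pack (my figure, for illustration); the 45 kW average for the second report is my guess at what the post-80% taper works out to:

```python
# Time to move between two states of charge at a given average power,
# for a nominal 90 kWh pack (assumption).
PACK_KWH = 90

def charge_minutes(from_pct, to_pct, avg_kw):
    kwh_needed = PACK_KWH * (to_pct - from_pct) / 100
    return kwh_needed * 60 / avg_kw

# 20% -> 80% at the ~85 kW the I-Pace sustains: in the ballpark of
# the 45-minute report.
print(f"20-80%: {charge_minutes(20, 80, 85):.0f} min")
# 50% -> 94% crosses the 80% taper point; a ~45 kW average power
# lands close to the 52-minute report.
print(f"50-94%: {charge_minutes(50, 94, 45):.0f} min")
```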


The networks have their own maps: here’s Petro-Canada’s and here’s Electrify Canada’s. At the moment Petro-Canada is well ahead. Watch out, though: their map includes chargers that are planned but not running yet.

If you want to plan a trip, I’ve run across two useful tools. First is PlugShare, which I’ve been pretty impressed with. You make an account and tell it what kind of car you have and apply filters to the kind of chargers you want to see — for the purposes of a trans-Canada trip, that means “Level 3” DC fast chargers only. Then it has tools to plan a trip — you have to pick chargers by hand and add them. It’ll export the results to Google Maps and do a bunch of other useful stuff. Here’s how it looks.

EV trip plan by PlugShare

Watch out: this map and the one below are huge, so that you can read the small print; you might have to pan around to see them, particularly on a small laptop screen.

One nice thing about PlugShare is that it’s got a community thing going: you can click on any charger on the map and see reports from people who’ve charged up recently, on how to find it and how it’s working. There’s also an average rating; that’s the little green number badge on each stop.

The other option is something called A Better Routeplanner (let’s say “ABRp”), which is a whole lot more automated and has significant pluses and minuses. By “automated” I mean you just put in your start and destination and it picks all the driving legs and charging stops. It picked almost exactly the same set that I picked by hand in PlugShare. First of all, here’s its map.

EV trip plan A Better Routeplanner

The readout is slicker, it estimates how much charge you’ll need and how long it’ll take at each stop. But there are problems. First of all, it recommends using the Petro-Canada charger in Salmon Arm, but when you click on that, the info-box says it isn’t actually operational yet (PlugShare agrees) — but there’s an older-fashioned 50kW charger there (which PlugShare knows about). Electrify Canada says theirs is coming soon too, so maybe it’ll be fast by the time we can travel again.

I also heard from someone who’s used it that the charging-time estimates are pretty inaccurate, a lot depends on the specifics of charge level, temperature, and apparently the current aspect of Jupiter. For what it’s worth, on this trip it estimated 21 hours of driving and 5 hours of charging.

Now here’s the weird thing. ABRp will give you a URL for your route plan: here’s mine. If you click on that, it’ll take some time; apparently it recomputes the whole thing. What’s weird is that I get different maps in Chrome and Safari. In Chrome, it doesn’t seem to know about the Petro-Can charger in Medicine Hat — thus the orange-line route segment, where it’s telling me I have to drive slowly to conserve power.

Both PlugShare and ABRp have mobile apps that look OK at first glance.


The charging infrastructure in Western Canada is pretty good, or at least will be once they get Salmon Arm filled in. The planning tools are just fine. I can’t wait to get on the road this summer. And to see my Mom.

2020 in a Difficult Lens 27 Dec 2020, 8:00 pm

I’ve said this before in passing, but I’m becoming passionate about it. Increasingly, I believe that if you go out for a walk with a camera, you should consider attaching a difficult, opinionated lens and just leaving it on. Herewith a gallery of ten 2020 photos taken with such a lens, interspersed with preaching on the subject.

A windy winter day on English Bay

A windy March 15th on English Bay, the part of the Pacific closest to Vancouver and what you find yourself in when your boat goes under the bridge and out of protection. It is frequently the subject of grumpy remarks from boaters concerning its tendency to nasty crosscutting waves.

The opinionated lens in all these pictures is the Samyang 135mm f/2 (effectively about 200mm on a Fuji X-cam), without doubt my most rewarding camera purchase in recent years. Follow that link for lots more on the lens.

That picture above shows how an unreasonably long lens can layer mid-range and distant objects in a composition that ordinary lenses can’t but your eyes think they can — they can’t actually, but they switch focus fast enough to fool you.

Purple behind poppies Weathered fence behind purple

Purple behind poppies, and purple in front of a weathered fence. May 10th and 19th respectively.

And of course the big bright f/2.0 does that sharp-focus/soft-field thing that makes your subject stand out and your background background-y. I can’t even remember what the purple behind the orange was, but if you’d been able to see the individual blossoms they would have got in the way of appreciating the poppies. And the weathered grey cedar fence with a diamond lattice behind the pink flower is nothing special to look at when it’s in focus.

When I’m out walking with the big Fuji/Samyang combo, that’s not the only camera I have with me, because the phone’s always there. And given the right subject, it can take pictures that are just as beautiful as the “real” camera.

In fact, the distinguishing feature of the big fat prime is that every single one of the photos it takes is something that couldn’t have been captured with any phone.

Distant mountainside

Of course, another reason to wield big glass is to capture things that are far away. Those trees are at least ten kilometres across Howe Sound.

Party boat!

Party boat! Maybe 500m away.


The heron just far enough away that it doesn’t mind my presence.

The last three pictures were captured on the first and second of August. They’re all pretty heavily processed in Lightroom, especially the mountainside. Lightroom’s “dehaze” control is super helpful on almost any photo of something that’s far away, because it’s specifically designed to counteract the effect of a whole lot of air between you and your target.

purple berries

Late-season berries, November 21st.

Once again with the shallow depth of field. And now, once again with the layering, which in this case gracefully collapses a city’s texture and topography into a little rectangle.

Downtown from across Jonathan Rogers Park Dog in Jonathan Rogers Park

From 8th Avenue in Mount Pleasant by Jonathan Rogers Park, downtown across False Creek and a dog playing fetch, both on December 26th.

The big bright glass lets you capture something that’s moving fast on a dull day while still usefully blurring the background.

Now, there are downsides. This is big and heavy and ridiculously out of proportion with the Fujifilm X-T30 it’s screwed onto, unsurprising and sort of OK since I bought the camera to carry along while climbing up and down the Great Wall of China.

And of course it’s manual focus, which makes everything more difficult.

Do Not Refuse

December 27th in Musqueam Park. You can figure out what the sign originally said.

Indeed, I wouldn’t normally take this thing along on an adventure hike. But it seems I never regret setting out with it when I do. Like the sign now says, DO NOT REFUSE.

Hot Winter Tabbouleh 23 Dec 2020, 8:00 pm

This is a recipe I dreamed up that has pleased the family twice now. It’s pretty easy to make and has lots of room for creative variation. The name is probably controversial. Let me lead off with a picture.

Hot Winter Tabbouleh

Wikipedia says Tabbouleh is Levantine but I grew up in Lebanon and I’ve never had really first-rate Tabbouleh anywhere else, except for the version my Mom makes. Tabbouleh is a salad centered around parsley, bulgur, and tomatoes.

What happened was, I had bacon that I was going to make breakfast with but didn’t, and it was my turn to make dinner. I like dicing bacon into little pieces and sautéing them until they’re golden, then they work great in pasta sauces and other settings. I was looking at the bacon blankly and got a sudden mental image of a dish with the bacon amid diced crunchy greens and flashes of red, like tabbouleh.


Ingredients

Sorry, not terribly quantitative.

  1. Thick-cut strongly-smoked bacon. Eight to twelve slices feeds four.

  2. Kale. Thick and crunchy is better. One big bunch is enough, but generally more is better.

  3. Bulgur. A cup or so.

  4. Sun-dried tomatoes. Two or three big dessert-spoonfuls.

  5. Other items, just for fun (see below).


Preparation

This doesn’t move along that fast, so there’s time to make whatever else you have in mind.

  1. Put the bulgur in to soak. About twice as much water as bulgur.

  2. Dice up the bacon, I try for pieces about the size of cornflakes.

  3. Toss the bacon into a frying pan — I swear by our old black well-seasoned cast-iron pan — at medium heat so it’s spitting but not loudly, and ignore it for a while, stirring every so often. It takes me fifteen minutes or more to get that golden color.

  4. Dice up your kale. In the picture above, I got distracted halfway through and it’s nowhere near finely enough diced. It still tasted fine, but it looks better when diced finer.

  5. When the bacon is looking ready, drain off most but not all of the excess fat.

  6. By the time you’ve done all this, the bulgur should have soaked up the water and be pretty soft. Drain off any remaining water and add the bulgur to the pan. Also add the sun-dried tomatoes.

  7. Some fairly vigorous stirring is required to get it all mixed up evenly.

  8. The soaked bulgur is now pleasantly al dente so you only need to cook this stuff for a couple of minutes until it’s nice and hot.

  9. Add your kale and once again it’ll take enthusiasm to get it all nicely mixed up.

I like to leave it in the frying pan and put that (on a trivet or whatever of course) on the table, let people serve themselves.


The first time I made this I left out the sun-dried tomatoes and it was still tasty; the kale, bacon, and bulgur play very well together. But the tomatoes were nice enough to become canonical. I also tossed in some kalamata olives, cut up a bit, and that was good too.

For future iterations I’m thinking of ideas including lemon juice and diced sweet potato.

The name

Lebanese and Levantines generally should be forgiven if they are recoiling in horror because this is really not anything like Tabbouleh, which by definition is cool and crisp. But I can’t think of anything better.

Long Links 1 Dec 2020, 8:00 pm

Welcome to the Long Links offering for November 2020, in which I take advantage of my lightly-employed status to recommend a list of long-form works that I had time to consume, acknowledging that while you probably don’t, one or two of them might reward the time it would take you to absorb. This month’s highlights: Election rear-views, Siberia, blueswomen, the Orlando NBA bubble, and a lovely lecture about software and music.

As you might have heard, there’s this large Anglophone country south of Canada where they had an election last month. The next few long-form recommendations consider what it might have meant.

Let’s start with Rebecca Solnit’s On Not Meeting Nazis Halfway, a pretty hardass take: “If half of us believe the earth is flat, we do not make peace by settling on it being halfway between round and flat.”

More: What Trump Showed Us About America offers short pieces from 35 “thinkers” (what’s one of those?) and the subject is obviously important. Some are lightweight, but plenty aren’t.

More: How Trump Changed America is by Clare Malone over at Five Thirty Eight, and is unironically heartfelt. She deploys the words “hollow” and “hollowness” in considering the Trump years and I think this points at several important truths. There’s a lot of good writing and clear thinking in here. I quote: “So what is America after Trump? A nation figuring out how — and whether — to engage and whom to love: the stranger or the self? I know the cynic’s prediction of which we’ll choose, but pure cynicism is boring.” Yeah.

More, still at Five Thirty Eight: Are Blowout Presidential Elections A Thing Of The Past?, by Geoffrey Skelley, isn’t actually a long-form piece but raises a long-form problem: The 2020 US Presidential election was the ninth consecutive one in which the popular-vote margin was less than 10%. Which is numerically weird. It wouldn’t be crazy to think that there’d be random variation in the margin, but if that were true, you’d expect one candidate to win double their opponent’s vote two-thirds of the time (do the math, it isn’t hard). What are the forces that systematically and repeatedly make these elections so close? I don’t know and neither does the author. But I’d like to.
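The parenthetical math can be checked by simulation, under the strong and purely illustrative assumption that the winner’s share of the two-party vote is uniformly distributed between 50% and 100%:

```python
# If the winner's vote share were uniform on [0.5, 1.0], how often
# would the winner get at least double the loser's vote (share >= 2/3)?
import random

random.seed(1)
TRIALS = 100_000
doubled = sum(1 for _ in range(TRIALS)
              if random.uniform(0.5, 1.0) >= 2 / 3)
print(f"winner doubles the loser in {doubled / TRIALS:.0%} of trials")
```

Analytically, P(share ≥ 2/3) = (1 − 2/3) / (1 − 1/2) = 2/3, which is where the two-thirds figure comes from; the point is how far the observed run of sub-10% margins sits from that baseline.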

More: I’m a big fan of Peter Beinart, especially his writings about Israel and its neighbors and problems, full of clear vision and obviously on a subject that requires extreme courage. In How Trump Lost he makes the point (as have others) that if Trump had actually delivered the program he ran on in 2016 he’d probably have won in 2020. Progressives like me ignore the attraction of alt-right populism at our nations’ peril, and we can’t always count on its having standard-bearers who are clueless buffoons.

More: Joe Biden Must Be a President for America’s Workers offers a straightforward argument that the new administration needs to start with inequality and the increasingly dire position of blue-collar America, and offers practical suggestions on how to do that. I can hear voices muttering “class reductionist!” but I don’t care, I agree with pretty well every word.

No more!

Enough about that election. Let’s move on…

In the New York Times, Along Russia’s “Road of Bones”, Relics of Suffering and Despair tries to wrap words and pictures around the unimaginably vast and harsh expanses of Eastern Siberia. I learned one time, on an endless pain-filled flight from Tokyo to Paris, that there remains a part of the planet where you can look out of a plane for an hour at a time and see no marks left by any human, and ever since then I’ve been fascinated by Siberia. But I don’t think you need such a fascination to enjoy this piece.

The Most Magical Place on Earth (in GQ, forsooth) is by Taylor Rooks, a Black woman sportswriter whom I’d never previously encountered. This is about life in the Orlando Disney NBA anti-Covid bubble and I really enjoyed it. When you drop the NBA players and staff and coaches and media into three hotels for a few months and seal them off from the rest of the world, they all get to know each other in new and interesting ways. Add drama following on George Floyd’s murder, and then there’s the general Covid-angst backdrop, and it all adds up to a very engaging read. I’m going to have to track down Ms Rooks and read more.

Inside YouTube’s plan to win the music-streaming wars: The title says it all. I am currently a YouTube Music customer, and enjoying it as the service learns to understand my admittedly unusual tastes. But this made me sad, because the music world (actually, the world in general) does not need another extrusion of the vast Google amoeba oozing in and sucking the life and profit out.

The Truth Is Paywalled But The Lies Are Free — subtitled “The Political Economy of Bullshit” is, like the last Long Link, exactly what the title says, and correspondingly sad, at least initially. But it’s a refreshing read, takes a serious detailed look at the shape of the problem and offers lots of practical think-big ideas about useful paths forward. Also, the closing line is “The truth needs to be free and universal”, a sentiment that should warm many hearts.

The future of work is written is by Juan Pablo Buriticá at Increment — I hadn’t previously heard of either the author or the site. It addresses the future of remote work, and since so has everyone else, you might be tempted to skip it. But check out the title: Buriticá dives deep on the strengths of collaboratively-written asynchronous communication in the remote-work construct. He considers the IETF in general and the “RFC” notion in particular. I’m obviously sympathetic to that, and my most recent employer thrived on written communication and document-based decision making, so I think there’s a lot to study here.

The Rise and Fall of Getting Things Done looks at the issue of what “Productivity” means for knowledge workers, and how to improve it. This field of study was invented by Peter Drucker who gets some attention here, but the piece looks most closely at Merlin Mann, the inventor of “Inbox Zero” (I personally prefer the low-stress inbox approach) and the term “productivity pr0n”. There’s reason to suspect that trying to do anything useful in the face of an overwhelming flow of input, much of it claiming to be high-priority, is a fool’s errand, and part of the solution has to involve disconnection. Useful!

There’s this person named Paul Ford whom I’ve known about for years; he seems to earn his living doing technology, but in my mind he’s a writer, and when I see something new from him, I’m inclined to stop whatever I’m doing and read it right then. Tech After Trump is what it says on the label and I find little to disagree with. In Web Conversation From the Other Side Paul imagines a dialogue between the technology of 2000 and that of 2020. Probably mostly of interest to technologists, but of extreme interest for some of us.

If you’re convinced (like me) that monopolization is a central problem afflicting many sectors of our economies and wondering (like me) what to do about it, the problem of defining the term “market share” becomes very important. Benedict Evans’ Market definitions and tech monopolies addresses this question. Unfortunately he offers more questions than answers, but they are good, challenging questions.

Now let’s close with a few musical offerings. I really enjoyed Women created the blues. Now they are taking it back, and it led to the discovery of some terrific music, notably by Samantha Fish. I have big soft spots for woman blues singers and loud electric guitar and if the woman’s also bashing the guitar well what’s not to like?

Not all the good guitarists are women. But my mental list of Guitar Players That Matter had never really included George Harrison even though I kinda liked that band he played in and he wrote one of my favorite songs. But Notes You Never Hear: The Metaphysical Loneliness of George Harrison argues that I’m wrong, that his contribution to all those records we’ve all heard so much is a Really Big Deal. I’m not sure I’m convinced, if I were launching a fantasyland rock&roll band and could only hire one Beatles guitarist it probably wouldn’t be George. But I still enjoyed this a lot.

Finally, I offer a 51-minute lecture by Alice Eldridge, a keynote at the “AI Music Creativity 2020” conference. It’s beautiful albeit annoyingly academic in that it keeps referring to other scholars by name and assuming we know who they are. Did I say beautiful? And extremely intellectually stimulating. She makes heavy use of the gerund “musicking” which I kind of like. There is geek mind-candy including a few flashes of source code and exotic music hardware but that’s not why you should watch this, you should sit down with an adult beverage when you’ve nothing else to distract you and listen carefully to what Dr. Eldridge has to say because every word carries carefully-considered weight and many of them are also beautiful.
