The details are fascinating, but the central argument — that the birth of modernity can be traced to a meta-crisis spawned by the 0.1s problem — is worth understanding and appreciating whether or not you’re a time nerd like me.
There is no convenient leitmotif, comparable to the 0.1s problem, for our contemporary version of the rhyming conditions, but something very similar to the “tenth of a second crisis” is going on today. I suspect our Great Weirding too involves some sort of limiting factor on human cognition that we haven’t yet properly wrapped our minds around. It isn’t reaction time, but something analogous.
Whilst Feature Parity often sounds like a reasonable proposition, we have learnt the hard way that people greatly underestimate the effort required, and thus misjudge the choice between it and the alternatives. For example, even just defining the 'as is' scope can be a huge effort, especially for legacy systems that have become core to the business.
Most legacy systems have 'bloated' over time, with many features unused by users (50% according to a 2014 Standish Group report) as new features have been added without the old ones being removed. Workarounds for past bugs and limitations have become 'must have' requirements for current business processes, with the way users work defined as much by the limitations of the legacy system as by anything else. Rebuilding these features is not only wasteful, it also represents a missed opportunity to build what is actually needed today. These systems were often defined 10 or 20 years ago within the constraints of previous generations of technology; it very rarely makes sense to replicate them 'as is'.